If I type 0.3 into the Chrome console, it prints 0.3 back at me.

Is this basically saying: "you typed in a Number literal with the contents 0.3, and I replay that representation back to you in the console as a convenience, even though under the hood and in reality it cannot be exactly represented, and the best approximation I can come up with is 0.30000000000000004"?
Actually, the internal representation is 0.299999999999999988897769753748434595763683319091796875.
It could have printed 0.29999999999999999, which is the internal value rounded to 17 significant digits. Rounding to 17 digits is the conservative way to preserve any internal value, since 17 significant digits are always enough to round-trip a double. But in this case 0.3 works just as well (after all, that is what you started with), and since it's shorter, that is what is printed.
So it didn't take your string input and echo it back -- it just worked out that way (as it would for any input of 15 significant digits or fewer).
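You can see all three values from the discussion above in Python, which also stores numbers as IEEE 754 binary64 doubles and prints them using the shortest string that round-trips (a small sketch; the exact digit strings are what CPython's `decimal` module and `format` produce):

```python
from decimal import Decimal

x = 0.3

# The exact binary64 value stored for the literal 0.3
print(Decimal(x))
# 0.299999999999999988897769753748434595763683319091796875

# The internal value rounded to 17 significant digits --
# always enough to reconstruct the exact double
print(format(x, '.17g'))
# 0.29999999999999999

# repr picks the shortest decimal string that parses back
# to the same double, which here is simply "0.3"
print(repr(x))
# 0.3
```

Both "0.3" and "0.29999999999999999" parse back to the identical double, which is why the console is free to print the shorter one.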