When I try to `Date.parse()` an integer or string `0`, it returns `946681200000`, which translates to the date:

    Sat Jan 01 2000 00:00:00 GMT+0100 (CET)
I would assume that the parser interprets the single zero as the year 2000, but the specs say nothing about a single-character year: both RFC 2822 and ISO 8601 require a four-digit year in the string.
I would like to better understand how the string `'0'` is parsed into a Date: why is it accepted as a valid date at all (should it not be `NaN` or some such?), and why is the year 2000 chosen instead of, for example, 1900?
Update
After some trial & error, I discovered that the single number is in fact interpreted differently in different numeric ranges.
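A small probe illustrates this. The cutoffs below are heuristic and engine-specific (this is what current V8 does; other engines, and other V8 versions, may map the same inputs differently), so no particular output is guaranteed:

```javascript
// Probe how bare numbers are mapped onto dates. Small numbers, two-digit
// numbers, and three-or-more-digit numbers land in different ranges.
for (const s of ['0', '1', '12', '13', '31', '32', '49', '50', '99', '100', '1000']) {
  const ms = Date.parse(s);
  console.log(s.padStart(4), '->',
              Number.isNaN(ms) ? 'NaN' : new Date(ms).getFullYear());
}
```

Note that `'2000'` is the one case that is actually specified: a four-digit string is a valid ISO 8601 year-only date, so it parses to 2000-01-01 UTC in every conforming engine.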
Regarding "the specs say nothing about single-character year definition":
The spec says:
If the String does not conform to that format the function may fall back to any implementation-specific heuristics or implementation-specific date formats.
For V8 specifically, see this bug report on unpredictable results when called with a single number. You can also read the source directly (dateparser.cc, dateparser.h, dateparser-inl.h).
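The two parsing paths the spec describes can be seen side by side (a sketch; the result of the non-conforming input is engine-specific):

```javascript
// An ISO 8601 string takes the specified parsing path. Date-time forms
// with 'Z' are interpreted as UTC, so this value is fixed everywhere.
console.log(Date.parse('2000-01-01T00:00:00Z')); // 946684800000

// A non-conforming string takes the implementation-specific path:
// V8 happens to accept '0', but another engine may legitimately return NaN.
console.log(Date.parse('0'));

// Some strings fail even the fallback heuristics:
console.log(Date.parse('not a date')); // NaN
```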