
JavaScript Date.getFullYear() returns 1943 instead of 2013, why?


I am trying to create a JavaScript Date object from a time in milliseconds computed from GMT+0 (i.e. UTC).

I use the following code for a time located in 2013:

var t = new Date(Date.UTC(0, 0, 0, 0, 0, 0, 0));
t.setMilliseconds(1383447600000);

but when I call the following:

alert(t.getFullYear());
alert(t.getUTCFullYear());

I am getting 1943... and not 2013!

Why? And how can I solve this? Thanks!

The JsFiddle is: http://jsfiddle.net/EVf72/


Solution

  • Short Answer: Use setTime instead of setMilliseconds.

    Long Answer:

    The problem is that your starting date is incorrect. The value 1383447600000 is a number of milliseconds since the epoch (January 1, 1970, 00:00:00 UTC), but your starting date is not the epoch! Date.UTC interprets a year of 0 as 1900 and a day-of-month of 0 as the last day of the previous month, so Date.UTC(0, 0, 0, 0, 0, 0, 0) is actually December 31, 1899:

    > var t = new Date(Date.UTC(0, 0, 0, 0, 0, 0, 0));
    > console.log(t.getFullYear());
    1899
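
    To make the bad starting point obvious, you can print the whole date rather than just the year, using the standard toUTCString method:

    > var t = new Date(Date.UTC(0, 0, 0, 0, 0, 0, 0));
    > console.log(t.toUTCString());
    Sun, 31 Dec 1899 00:00:00 GMT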
    

    When you then call setMilliseconds with a value over 999, the excess is converted into the appropriate number of days, hours, minutes, and seconds and rolled into the date.

    1383447600000 milliseconds correspond to roughly 43.8 years. So you're essentially telling JavaScript to add almost 44 years to the end of 1899, which gives you 1943.
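
    As a quick sanity check on that figure (using an average year length of 365.25 days):

    > 1383447600000 / (1000 * 60 * 60 * 24 * 365.25)
    43.8388...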

    From the documentation for setMilliseconds:

    If you specify a number outside the expected range, the date information in the Date object is updated accordingly. For example, if you specify 1005, the number of seconds is incremented by 1, and 5 is used for the milliseconds.
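
    You can watch this rollover happen with the documentation's example value of 1005:

    > var d = new Date(2013, 0, 1, 0, 0, 0, 0); // local midnight on January 1, 2013
    > d.setMilliseconds(1005);
    > console.log(d.getSeconds(), d.getMilliseconds());
    1 5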

    If you had instead given Date.UTC the correct starting point, so that it matches the epoch (note that the day-of-month argument is 1-based, unlike the year offset and the month), you would have received the correct answer:

    > var t = new Date(Date.UTC(1970, 0, 1, 0, 0, 0, 0)); //Year 1970, month 0 (January), day 1
    > t.setMilliseconds(1383447600000);
    > console.log(t.getFullYear());
    
    2013
    

    But instead of doing all of that, you can simply use setTime:

    > var t = new Date();
    > t.setTime(1383447600000);
    > console.log(t.getFullYear());
    
    2013
    

    So to recap, the following are functionally equivalent:

    > var t = new Date(Date.UTC(1970, 0, 1, 0, 0, 0, 0)); //Year 1970, month 0 (January), day 1
    > t.setMilliseconds(1383447600000);
    > console.log(t.getFullYear());
    
    2013
    

    and

    > var t = new Date();
    > t.setTime(1383447600000);
    > console.log(t.getFullYear());
    
    2013
    

    But if you are dealing with milliseconds since the epoch, you either need to use setTime, or make sure that you actually start at the epoch (via Date.UTC(1970, 0, 1)) if you are going to be using setMilliseconds.
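
    Simpler still, a millisecond timestamp can be passed directly to the Date constructor, which is equivalent to calling setTime on a fresh Date:

    > var t = new Date(1383447600000);
    > console.log(t.getFullYear());
    2013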