Tags: javascript, python, python-3.x, pytz, python-dateutil

Why don't Python datetime and JS Date match?


I have this code, which returns the UTC offset for a given date:

>>> import datetime
>>> import pytz
>>> cet = pytz.timezone("Europe/Moscow")
>>> cet.localize(datetime.datetime(2000, 6, 1))
datetime.datetime(2000, 6, 1, 0, 0, tzinfo=<DstTzInfo 'Europe/Moscow' MSD+4:00:00 DST>)
>>> int(cet.localize(datetime.datetime(2000, 6, 1)).utcoffset().seconds/60)
240

OK, now do the same in JS using this code ( http://jsfiddle.net/nvn1fef0/ ):

new Date(2000, 5, 1).getTimezoneOffset(); // -180

Am I doing something wrong? And how can I get a plus/minus sign before the offset (as in the JS result)?


Solution

  • On my system both Python and JavaScript produce the same result (modulo sign):

    >>> from datetime import datetime, timedelta
    >>> import pytz
    >>> tz = pytz.timezone('Europe/Moscow')
    >>> dt = tz.localize(datetime(2000, 6, 1), is_dst=None)
    >>> print(dt)
    2000-06-01 00:00:00+04:00
    >>> dt.utcoffset() // timedelta(minutes=1)
    240
    
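Note the `// timedelta(minutes=1)` idiom above: the question's `.seconds / 60` silently breaks for zones west of UTC, because `timedelta` normalizes negative values so that `.seconds` is always non-negative. A minimal standard-library sketch (the UTC−5 offset is just an illustrative value, not from the question):

```python
from datetime import timedelta

offset = timedelta(hours=-5)  # e.g. a zone at UTC-5

# timedelta stores this as days=-1, seconds=68400, so .seconds misleads:
print(int(offset.seconds / 60))        # 1140 -- wrong
# Floor division by a one-minute timedelta keeps the sign:
print(offset // timedelta(minutes=1))  # -300 -- correct
```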

    And new Date(2000, 5, 1).getTimezoneOffset() returns -240 here (opposite sign, same value). Note that JavaScript months are zero-based, so 5 is June.

    Python uses the definition local time = UTC time + UTC offset, while JavaScript's getTimezoneOffset() uses the opposite one: UTC offset = UTC time - local time. Both results are therefore correct, each with the correct sign for its own definition.
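So to get a JS-style signed offset in Python, just negate. A minimal standard-library sketch, with a fixed UTC+4 zone standing in for pytz's MSD (Moscow summer time in 2000):

```python
from datetime import datetime, timedelta, timezone

# Fixed +04:00 zone standing in for pytz's 'Europe/Moscow' in June 2000 (MSD)
msd = timezone(timedelta(hours=4), "MSD")
dt = datetime(2000, 6, 1, tzinfo=msd)

py_offset = dt.utcoffset() // timedelta(minutes=1)  # local - UTC
js_style = -py_offset                               # UTC - local, as getTimezoneOffset()
print(py_offset, js_style)                          # 240 -240
```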

    For a portable JavaScript solution, you could use the Moment Timezone library, which provides access to the same tz database as the pytz Python module:

    > var moscow = moment.tz("2000-06-01", "Europe/Moscow");
    undefined
    > moscow.format()
    "2000-06-01T00:00:00+04:00"