I have two date strings, each in a different time zone. These strings are in what I believe is referred to as "simplified" ISO 8601 format. Two example dates are listed below.
2017-08-14T18:41:52.793Z
2017-08-14T23:41:52.000Z
The first date is in CDT while the second date is in UTC. I believe the last four digits of each of these strings indicate the time zone.
What's weird is that when I create a new Date() for each of these, I'm getting incorrect dates reported via console.log(). For example:
const local_date = new Date("2017-08-14T18:41:52.793Z");
const remote_date = new Date("2017-08-14T23:41:52.000Z");
console.log("local_date = " + local_date);
console.log("remote_date = " + remote_date);
Outputs:
local_date = Mon Aug 14 2017 13:41:52 GMT-0500 (Central Daylight Time)
remote_date = Mon Aug 14 2017 18:41:52 GMT-0500 (Central Daylight Time)
It appears as though the first date is getting 5 hours subtracted even though the source date was provided in CDT; it's like it's assuming that both dates are provided in UTC.
https://jsfiddle.net/nkax7cjx/1/
What am I doing wrong here?
The last four characters are not a time zone on their own: they are three millisecond digits followed by a time zone designator, where Z indicates UTC, and +hh:mm
or -hh:mm
indicates an offset from UTC.
So 793Z means 793 milliseconds, in UTC.
Both of your examples are therefore in UTC, which is why you're seeing the output you're seeing.
const local_date = new Date("2017-08-14T18:41:52.793-05:00");
would parse the timestamp as CDT (UTC−05:00).