I am parsing dates (received from a server) in Java using this pattern: "yyyy-MM-dd'T'HH:mm:ss.SSS".
Incoming strings may be of these types:
2015-01-01T00:00:00.561
2015-01-01T00:00:00.5
My question is about the milliseconds fraction. I can't tell whether the .5 in the second string means 5 ms or 500 ms. When I parse it with my pattern I get 500 ms, which seems OK, but I need to double-check whether there is some common contract to trim trailing zeros on the server side. I wouldn't ask if the server returned 2015-01-01T00:00:00.500, but with .5 I'm not sure whether the server means 5 ms or 500 ms.
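The question doesn't show the parsing code itself, but a quick way to check this behavior (a sketch assuming the java.time API, where LocalDateTime.parse handles both string shapes with its default ISO format, no custom pattern needed) is:

```java
import java.time.LocalDateTime;

public class ServerDateCheck {
    public static void main(String[] args) {
        // Both incoming shapes parse with the default ISO_LOCAL_DATE_TIME format,
        // which accepts a variable-length fraction-of-second.
        LocalDateTime full = LocalDateTime.parse("2015-01-01T00:00:00.561");
        LocalDateTime trimmed = LocalDateTime.parse("2015-01-01T00:00:00.5");

        System.out.println(full.getNano());    // 561000000 ns = 561 ms
        System.out.println(trimmed.getNano()); // 500000000 ns = 500 ms
    }
}
```

Here .5 comes out as 500,000,000 nanoseconds, i.e. 500 ms, which matches what the pattern-based parse reported.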
UPDATE: I just had a talk with the server team; they confirmed that .5 means .500.
That pattern, YYYY-MM-DDTHH:MM:SS.SSS±HH:MM, is a string format defined by the ISO 8601 standard.
Your particular usage omits the offset from UTC (the plus/minus at the end). Without the offset, the value is a "local" date-time, meaning it could apply to any locality rather than pinning down one specific moment on the timeline, as in "Christmas starts at 2015-12-25T00:00:00".
Both the Joda-Time library and java.time package use ISO 8601 formats as their defaults in generating/parsing strings.
Yes, the digits after the dot are indeed simply a decimal fraction; the dot is a decimal point. A value such as 0.5 is the same as 0.500: both mean half a second, 500 milliseconds.
So, 2015-01-01T12:34:56.7 is the same as 2015-01-01T12:34:56.700.
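That equivalence is easy to verify with java.time: parsing both strings yields equal LocalDateTime values.

```java
import java.time.LocalDateTime;

public class FractionEquivalence {
    public static void main(String[] args) {
        // The trailing zeros in the fraction carry no information:
        // .7 and .700 both mean seven tenths of a second.
        LocalDateTime a = LocalDateTime.parse("2015-01-01T12:34:56.7");
        LocalDateTime b = LocalDateTime.parse("2015-01-01T12:34:56.700");

        System.out.println(a.equals(b)); // true
        System.out.println(a.getNano()); // 700000000
    }
}
```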