All,
I am working with a binary specification whose TimeStamp fields are defined as "Milliseconds since January 1, 2000 UTC time". I am doing the following calculation:
public static final TimeZone UTC = TimeZone.getTimeZone("UTC");
public static final Calendar Y2K_EPOCH = Calendar.getInstance(UTC);
static {
    // clear() zeroes every field (including milliseconds), so only the fields set below are non-zero
    Y2K_EPOCH.clear();
    // Month is 0-based; day is 1-based. Set the calendar to the first instant of January 1, 2000
    Y2K_EPOCH.set(2000, 0, 1, 0, 0, 0);
}
public static final long MS_BETWEEN_ORIGINAL_EPOCH_AND_Y2K_EPOCH = Y2K_EPOCH.getTimeInMillis();

public static long getMillisecondsSinceY2K(Date date) {
    long time = date.getTime();
    if (time < MS_BETWEEN_ORIGINAL_EPOCH_AND_Y2K_EPOCH) {
        throw new IllegalArgumentException("Date must occur after January 1, 2000");
    }
    return time - MS_BETWEEN_ORIGINAL_EPOCH_AND_Y2K_EPOCH;
}
My question is: is this the correct way to convert between standard Java Date objects and this datatype? Is there a better way of doing it? I know about Joda-Time, but I'd rather not bring in that external dependency if I can help it.
Time is a tricky mess, especially UTC time. Assuming you just want a reasonably accurate offset from an arbitrary epoch, a simple subtraction like the one you're doing should be fine. If you are worried about leap-second precision, I would strongly suggest using Joda-Time or some other reliable external library.
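If you can build on Java 8 or later, the java.time API gives you the same subtraction with no external dependency and without the mutable Calendar boilerplate. This is only a sketch of that approach (the class name Y2KMillis is just for illustration); like Date and Calendar, java.time's Instant ignores leap seconds:

import java.time.Instant;
import java.util.Date;

public final class Y2KMillis {
    // 2000-01-01T00:00:00Z expressed as milliseconds since the Unix epoch
    private static final long Y2K_EPOCH_MILLIS =
            Instant.parse("2000-01-01T00:00:00Z").toEpochMilli();

    public static long getMillisecondsSinceY2K(Date date) {
        long time = date.getTime();
        if (time < Y2K_EPOCH_MILLIS) {
            throw new IllegalArgumentException("Date must occur after January 1, 2000");
        }
        return time - Y2K_EPOCH_MILLIS;
    }
}

The arithmetic is identical to your version; the epoch constant is just computed once from an immutable Instant instead of a Calendar set up in a static block.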