I have the following values:
1465509600000
1402437600000
I tried the following:
Attempt 1:
public long? CommitmentStartDate { get; set; }
public long? CommitmentEndDate { get; set; }

public DateTime? CommitmentStartDateDate =>
    new DateTime(CommitmentStartDate != null ? (long)CommitmentStartDate : Convert.ToInt64(DateTime.MinValue));
public DateTime? CommitmentEndDateDate =>
    new DateTime(CommitmentEndDate != null ? (long)CommitmentEndDate : Convert.ToInt64(DateTime.MinValue));
This gives me the date in the wrong format; I get this:
0001-01-02 16:42:30
0001-01-02 14:57:23
Attempt 2:
static readonly DateTime _unixEpoch =
    new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);

public static DateTime DateFromTimestamp(long timestamp)
{
    return _unixEpoch.AddSeconds(timestamp);
}

public long? CommitmentStartDate { get; set; }
public long? CommitmentEndDate { get; set; }

public DateTime? CommitmentStartDateDate =>
    DateFromTimestamp(CommitmentStartDate != null ? (long)CommitmentStartDate : Convert.ToInt64(DateTime.MinValue));
public DateTime? CommitmentEndDateDate =>
    DateFromTimestamp(CommitmentEndDate != null ? (long)CommitmentEndDate : Convert.ToInt64(DateTime.MinValue));
This gave me an ArgumentOutOfRangeException.
How do I do this?
EDIT:
Expected values:
2014-06-11
2016-06-10
EDIT 2:
Timestamps and the dates they come from:
1402437600000 ----> 2014-06-11
1465509600000 ----> 2016-06-10
Your two samples are two years apart. Take the difference between them (1465509600000 − 1402437600000 = 63072000000) and divide successively by 2 (years), 365 (days), 24 (hours) and 3600 (seconds): you are left with exactly 1000, so the values are milliseconds, not ticks or seconds.
A quick check shows they are indeed counted from 1970-01-01 (the Unix epoch), so:
//return _unixEpoch.AddSeconds(timestamp);
return _unixEpoch.AddMilliseconds(timestamp);
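That also explains both failures. The DateTime(long) constructor interprets its argument as 100-nanosecond ticks since 0001-01-01, and 1402437600000 of those ticks is only about 140244 seconds (roughly 1.6 days), which is exactly why attempt 1 produced 0001-01-02 14:57:23. Attempt 2 threw because AddSeconds(1402437600000) is on the order of 44,000 years, far beyond DateTime.MaxValue (year 9999), hence the ArgumentOutOfRangeException.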
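Putting it together, here is a minimal sketch of the corrected class. It propagates null instead of falling back to DateTime.MinValue, which is an assumption about your intended behaviour (Convert.ToInt64(DateTime.MinValue) would throw InvalidCastException anyway):

static readonly DateTime _unixEpoch =
    new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);

// The timestamps are milliseconds since the Unix epoch.
public static DateTime DateFromTimestamp(long timestamp)
{
    return _unixEpoch.AddMilliseconds(timestamp);
}

public long? CommitmentStartDate { get; set; }
public long? CommitmentEndDate { get; set; }

// Propagate null rather than converting DateTime.MinValue.
public DateTime? CommitmentStartDateDate =>
    CommitmentStartDate.HasValue ? DateFromTimestamp(CommitmentStartDate.Value) : (DateTime?)null;
public DateTime? CommitmentEndDateDate =>
    CommitmentEndDate.HasValue ? DateFromTimestamp(CommitmentEndDate.Value) : (DateTime?)null;

One caveat: DateFromTimestamp(1402437600000) is 2014-06-10 22:00 UTC, i.e. midnight in a UTC+2 zone, so formatting the UTC value as yyyy-MM-dd gives 2014-06-10 rather than your expected 2014-06-11; convert to the relevant time zone first (on .NET 4.6+ you can also use DateTimeOffset.FromUnixTimeMilliseconds).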