I'm using the public Postgres database of the NEAR protocol: https://github.com/near/near-indexer-for-explorer#shared-public-access
It has a field called included_in_block_timestamp
whose "data type" is "numeric" and whose "length/precision" is 20.
This code works (note that in Postgres to_char, minutes are MI, not mm, which would be interpreted as the month; HH24 gives a 24-hour clock):
to_char(TO_TIMESTAMP("public"."receipts"."included_in_block_timestamp" / 1000000000), 'YYYY-MM-DD HH24:MI') as moment,
and so does this (imports added for completeness; I'm assuming Decimal comes from decimal.js, since the column is numeric):

import dayjs from "dayjs";
import Decimal from "decimal.js";

function convertTimestampDecimalToDayjsMoment(timestampDecimal: Decimal) {
  const timestampNum = Number(timestampDecimal) / 1_000_000_000; // Why is this necessary?
  console.log({ timestampNum });
  const moment = dayjs.unix(timestampNum); // https://day.js.org/docs/en/parse/unix-timestamp
  return moment;
}
For example, sometimes included_in_block_timestamp
= 1644261932960444221.
I've never seen a timestamp where I needed to divide by 1 billion. Figuring this out just now was a matter of trial and error.
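Sure enough, 1644261932960444221 / 1,000,000,000 ≈ 1644261932.96 Unix seconds, and dayjs.unix(1644261932.96) gives 2022-02-07T19:25:32Z, a plausible block time; treating the raw value as seconds, milliseconds, or even microseconds produces nonsensical dates.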
What's going on here? Is this common practice? Does this level of precision even make sense?
The nanosecond unit of measure is determined at the protocol level; it appears in the docs here: https://docs.near.org/develop/contracts/environment/#environment-variables
and here: https://nomicon.io/RuntimeSpec/Components/BindingsSpec/ContextAPI
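For instance, a contract can read the block timestamp directly in nanoseconds; here's a minimal sketch, assuming the near-sdk-js API where near.blockTimestamp() returns a bigint of nanoseconds since the Unix epoch:

import { NearBindgen, near, view } from "near-sdk-js";

@NearBindgen({})
class TimestampDemo {
  // Returns the current block timestamp in nanoseconds since the Unix epoch,
  // stringified because bigint doesn't serialize to JSON.
  @view({})
  block_time_ns(): string {
    return near.blockTimestamp().toString();
  }
}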
So yes, take this into account before doing date-time conversions.
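One more caveat: a 19-digit nanosecond count exceeds Number.MAX_SAFE_INTEGER (2^53 − 1 ≈ 9.0 × 10^15), so Number(timestampDecimal) silently rounds away the lowest digits. The error is far below a millisecond, so it rarely matters for display, but dividing in BigInt space first avoids it entirely. A minimal sketch, with nanosToDayjs as a hypothetical helper name:

import dayjs from "dayjs";

// Do the ns -> ms division in BigInt space so the 19-digit value
// never has to fit losslessly into a Number.
function nanosToDayjs(nanos: bigint) {
  const millis = Number(nanos / 1_000_000n); // milliseconds fit safely below 2^53
  return dayjs(millis);
}

console.log(nanosToDayjs(1644261932960444221n).toISOString()); // 2022-02-07T19:25:32.960Z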