I have data from an external DB source (I think SQL Server) loaded into a BigQuery table. The source's datetime columns were saved as INTEGER columns in BigQuery.
Some real values, for example:

| CREATETIME | CHANGETIME |
|---|---|
| 1380308645543333333 | 1416903877310000000 |
| 1416907372976666666 | 1458294273450000000 |
| 1363698020000000000 | 1406901312793333333 |
| 1717156474010000000 | 1717156490623333333 |
I need to convert this data to a TIMESTAMP column in BigQuery, but BigQuery does not seem to support converting unix epoch time with nanoseconds to a timestamp. The highest precision I could find in the documentation (https://cloud.google.com/bigquery/docs/reference/standard-sql/timestamp_functions) is `TIMESTAMP_MICROS`; there appears to be no `TIMESTAMP_NANOS`.

And if I can't convert unix epoch time with nanoseconds to a timestamp directly, what is the best way to convert it to a timestamp with second precision, given that I don't need such high precision anyway?
As pointed out by @dan04, you can divide by 1000000000 and convert from seconds as follows:

```sql
SELECT
  -- nanoseconds -> seconds, then to TIMESTAMP (CAST rounds to the nearest second)
  TIMESTAMP_SECONDS(CAST(CREATETIME / 1000000000 AS INT64)) AS CREATETIME_TIMESTAMP,
  TIMESTAMP_SECONDS(CAST(CHANGETIME / 1000000000 AS INT64)) AS CHANGETIME_TIMESTAMP
FROM `dataset_id.table_name`;
```
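For anyone who wants to sanity-check the arithmetic outside BigQuery, here is a small Python sketch that mirrors the divide-by-1e9 conversion on the sample values above (it truncates fractional seconds rather than rounding, like integer `DIV` would; it is just a local check, not BigQuery itself):

```python
from datetime import datetime, timezone

NANOS_PER_SECOND = 1_000_000_000

def ns_epoch_to_datetime(ns: int) -> datetime:
    """Convert a unix epoch value in nanoseconds to a UTC datetime
    with second precision (fractional seconds are truncated)."""
    seconds = ns // NANOS_PER_SECOND  # integer division, like DIV() in BigQuery
    return datetime.fromtimestamp(seconds, tz=timezone.utc)

# First sample CREATETIME value from the table above
print(ns_epoch_to_datetime(1380308645543333333))  # → 2013-09-27 19:04:05+00:00
```

If sub-second precision ever matters, `TIMESTAMP_MICROS(DIV(CREATETIME, 1000))` should keep microseconds while staying in integer math (`DIV` avoids the float round-trip that `CREATETIME / 1000000000` implies).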