Avro schema:
{
  "name": "Entity",
  "type": "record",
  "namespace": "com.foobar.entity",
  "fields": [
    {
      "name": "attribute",
      "type": "string"
    },
    {
      "name": "value",
      "type": "int"
    },
    {
      "name": "timestamp",
      "type": { "type": "long", "logicalType": "timestamp-micros" }
    }
  ]
}
The source timestamp is a Unix timestamp with millisecond precision. When I load such records into BigQuery, the data preview shows values like 1970-01-19 01:18:19.415 UTC. However, the value I stored is 1559899418, which is Friday, 7 June 2019 09:23:38. Any ideas why?
Reference: https://cloud.google.com/bigquery/docs/loading-data-cloud-storage-avro#logical_types
Your timestamp is off by a factor of 1000. Indeed, 1559899418 corresponds to Friday, 7 June 2019 09:23:38, but that is second precision (a plain Unix timestamp), not millisecond precision. And 1559899 (one thousandth of 1559899418) does indeed correspond to 1970-01-19 01:18:19, which is exactly what you see in the BigQuery preview.
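Since the schema declares `timestamp-micros`, the field must hold microseconds since the Unix epoch, so a second-precision timestamp needs to be multiplied by 1,000,000 before the record is written. A minimal Python sketch of that conversion (the variable names are illustrative, not from your pipeline):

```python
from datetime import datetime, timezone

# Second-precision Unix timestamp from the question.
unix_seconds = 1559899418

# timestamp-micros expects microseconds since the epoch.
timestamp_micros = unix_seconds * 1_000_000

# Sanity check: the converted value round-trips to the expected wall-clock time.
dt = datetime.fromtimestamp(timestamp_micros / 1_000_000, tz=timezone.utc)
print(dt)  # 2019-06-07 09:23:38+00:00
```

If your source really were millisecond-precision, the factor would be 1,000 instead; either way, pick the multiplier that matches the declared logical type.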