I have a Stream Analytics job that converts a timestamp into a number of milliseconds since the Epoch. To do this, I use a JavaScript function that returns a `bigint`, using the following code:
new Date(date).getTime()
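For reference, the whole UDF is essentially a one-line wrapper around that call. A minimal sketch (the function and parameter names here are placeholders; the output type is configured as `bigint` in the job):

```javascript
// Stream Analytics JavaScript UDF, output type configured as bigint.
// Converts an ISO 8601 timestamp string into milliseconds since the Unix Epoch.
function main(timestamp) {
    return new Date(timestamp).getTime();
}
```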
When I test this job in the Azure portal, I get the correct result, e.g. `2018-08-29T13:01:54.0000000Z` becomes `1535547714000`.
But when I run the job and it starts storing its output in an Azure table, `1535547714000` becomes `-2050577968`.
I noticed that if I cast the `bigint` `1535547714000` into an `int`, I get `-2050577968`. So I checked the type of the column and, strangely, it's `Int64`.
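To illustrate the truncation, here is a quick check in plain JavaScript (the bitwise OR simply forces the value through a signed 32-bit conversion; it only reproduces the arithmetic, not the actual Azure code path):

```javascript
// 1535547714000 mod 2^32 = 2244389328, which is above 2^31,
// so as a signed 32-bit integer it wraps around to -2050577968.
const ms = 1535547714000;
console.log(ms | 0); // -2050577968
```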
TL;DR: The job outputs a `bigint`, the column type is `Int64`, but somehow, somewhere in between, the value seems to be cast into an `int`. How do I fix that?
Rodolphe, if you want to transfer data into Azure Table Storage, you have to follow its rules.
Based on the docs, Azure Table Storage only supports the `Int32` and `Int64` types; there is no `bigint` type. So there is no mysterious mechanism converting your data type: when the `bigint` data goes into the table, it is simply mapped to the corresponding `Int64` type. However, it is supposed to remain `1535547714000`; it should not be truncated to an `int`. You could submit feedback to Azure to report this issue.
As a workaround, you could try to receive the data in an Azure trigger function and convert it to `Int64` yourself before writing it to Azure Table Storage. Please refer to this doc.
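A minimal sketch of that idea, assuming a JavaScript Azure Function that receives the job's output (for example from an Event Hub) and writes the entity with the legacy `azure-storage` Node SDK, which lets you tag a property explicitly as `Edm.Int64`. The table name, property names and trigger wiring below are placeholders:

```javascript
// Hypothetical Azure Function: store the millisecond value as an explicit Int64.
// Assumes the "azure-storage" npm package, an existing table named "OutputTable",
// and a storage connection string in the AzureWebJobsStorage app setting.
const azure = require('azure-storage');
const entGen = azure.TableUtilities.entityGenerator;

module.exports = async function (context, message) {
    const tableService = azure.createTableService(process.env.AzureWebJobsStorage);

    const entity = {
        PartitionKey: entGen.String('telemetry'),                 // placeholder partition key
        RowKey: entGen.String(String(Date.now())),                // placeholder row key
        // Pass the value as a string and tag it as Edm.Int64 so Table Storage
        // stores it as a 64-bit integer instead of truncating or rounding it.
        Milliseconds: entGen.Int64(String(message.milliseconds))  // placeholder property name
    };

    await new Promise((resolve, reject) =>
        tableService.insertOrReplaceEntity('OutputTable', entity, err =>
            err ? reject(err) : resolve()));
};
```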
Hope it helps you.