I have streaming data as shown below.
"TriggeredDateTime": {"dateTime":{"date":{"year":2019,"month":5,"day":16},"time":{"hour":4,"minute":56,"second":19,"nano":480389000}},"offset":{"totalSeconds":0}}
I have written a SAQL query to separate the date and time parts, as below:
WITH SIGNAL AS (
SELECT
input.TriggeredDateTime.dateTime.date.year AS yr,
input.TriggeredDateTime.dateTime.date.month AS mnth,
input.TriggeredDateTime.dateTime.date.day AS date,
input.TriggeredDateTime.dateTime.time.hour AS hr,
input.TriggeredDateTime.dateTime.time.minute AS mnt,
input.TriggeredDateTime.dateTime.time.second AS sec,
input.TriggeredDateTime.dateTime.time.nano AS nano,
input.TriggeredDateTime.dateTime.offset.totalSeconds AS offset
FROM [input]
)
SELECT
CONCAT(yr,'-',mnth,'-',date,' ',hr,':',mnt,':',sec,':',nano) AS TriggeredDateTime
INTO
[output]
FROM SIGNAL
I also need to add the offset seconds to this time. How do I add the offset so that the output seconds include it?
You could try using CAST and DATEADD:
DATEADD(
    second,
    SIGNAL.offset,
    CAST(<your date string> AS datetime)
) AS TriggeredDateTime
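In case it helps to see the arithmetic outside of Stream Analytics, `DATEADD(second, offset, ...)` amounts to adding a seconds delta to the parsed timestamp. A minimal Python sketch of the same idea (the date string and the nonzero offset of 3600 seconds are hypothetical, chosen only to make the shift visible):

```python
from datetime import datetime, timedelta

# Analogue of CAST(<your date string> AS datetime): parse the concatenated string
ts = datetime.strptime("2019-5-16 4:56:19", "%Y-%m-%d %H:%M:%S")

# Analogue of DATEADD(second, offset, ...): add the offset as a seconds delta
offset_seconds = 3600  # hypothetical value; the sample payload has totalSeconds = 0
shifted = ts + timedelta(seconds=offset_seconds)
print(shifted)  # 2019-05-16 05:56:19
```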
Updated answer:

The solution is still based on CAST and DATEADD, but you need to modify your SQL slightly. In my tests I found that CAST and DATEADD can't handle the nano unit, please see this link:

So my idea is to convert nano into microseconds, then use the SQL below:
WITH i AS (
SELECT
input.TriggeredDateTime.dateTime.date.year AS yr,
input.TriggeredDateTime.dateTime.date.month AS mnth,
input.TriggeredDateTime.dateTime.date.day AS date,
input.TriggeredDateTime.dateTime.time.hour AS hr,
input.TriggeredDateTime.dateTime.time.minute AS mnt,
input.TriggeredDateTime.dateTime.time.second AS sec,
input.TriggeredDateTime.dateTime.time.nano/1000 AS nano,
input.TriggeredDateTime.dateTime.offset.totalSeconds AS offset
FROM input
)
SELECT
    DATEADD(second, offset,
        DATEADD(microsecond, nano,
            CAST(CONCAT(i.yr,'-',i.mnth,'-',i.date,' ',i.hr,':',i.mnt,':',i.sec) AS datetime)
        )
    ) AS TriggeredDateTime
FROM i
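To sanity-check the query's result, here is the same computation in Python with the values from the sample event: nano is divided by 1000 to get microseconds, and both the microsecond and the offset deltas are added to the base timestamp.

```python
from datetime import datetime, timedelta

# Field values taken from the sample payload in the question
yr, mnth, day = 2019, 5, 16
hr, mnt, sec = 4, 56, 19
nano = 480389000
offset = 0  # totalSeconds

base = datetime(yr, mnth, day, hr, mnt, sec)
# nano/1000 -> microseconds, mirroring the conversion in the query above
result = base + timedelta(microseconds=nano // 1000, seconds=offset)
print(result)  # 2019-05-16 04:56:19.480389
```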
My output with your sample data: