I have some logs that are read by Filebeat, filtered in Logstash, and sent to MongoDB. Filebeat generates the @timestamp field on the logs; a log document might look like this:
{
  "_id" : ObjectId("622884e8ed814f1590000076"),
  "@timestamp" : "\"2022-03-09T10:43:46.000\"",
  "stream" : "stderr",
  "message" : "[2022-03-09T10:43:46.528612+00:00] testing.INFO: Message error [] []"
}
However, the @timestamp field is written as a String, not a Date. I need the timestamp stored as an ISODate in MongoDB, so it would look like this:
"@timestamp": ISODate("2022-03-10T01:43:46.000Z")
instead of
"@timestamp" : "\"2022-03-09T10:43:46.000\""
Any suggestion on how to change the datatype from string to date?
UPDATE: I tried using match in the date filter:
date {
  match => [ "@timestamp", "MMM dd yyyy HH:mm:ss", "MMM d yyyy HH:mm:ss", "ISO8601" ]
}
In MongoDB the datatype still did not change. I also tried creating another field with the current date (logstash_processed_at) using this:
ruby {
  code => "event.set('logstash_processed_at', Time.now());"
}
and applied a date match to it as well, but in MongoDB the datatype is still a string.
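For reference, this is roughly how the two attempts sit together inside a full filter block (the surrounding filter {} wrapper is implied above):

filter {
  # Parse @timestamp; ISO8601 covers the format shown in the log above
  date {
    match => [ "@timestamp", "MMM dd yyyy HH:mm:ss", "MMM d yyyy HH:mm:ss", "ISO8601" ]
  }
  # Record when Logstash processed the event
  ruby {
    code => "event.set('logstash_processed_at', Time.now());"
  }
}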
Instead of changing the filter, I solved it by adding a new setting to the mongodb output in the Logstash config: isodate => true.
output {
  mongodb {
    ...
    isodate => true
  }
}
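To put it in context, the output section ends up looking roughly like this (the uri, database, and collection values are placeholders, not my real settings):

output {
  mongodb {
    uri => "mongodb://localhost:27017"   # placeholder connection string
    database => "logs"                   # placeholder database name
    collection => "filebeat"             # placeholder collection name
    isodate => true                      # store the timestamp as a BSON date, not a string
  }
}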
This setting changes the @timestamp field from a string to ISODate() format when it is sent to MongoDB. If you are looking for how to change the timestamp into ISODate() within MongoDB itself, @R2D2 has the answer above.
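If you also need to fix documents that were already written with a string @timestamp, something like the following should work in mongosh. This is only my rough sketch (not @R2D2's answer); the collection name logs is a placeholder, and it assumes MongoDB 4.2+ for update pipelines:

// Strip the extra quotes stored in the string, then convert it to a BSON date
db.logs.updateMany(
  { "@timestamp": { $type: "string" } },
  [ { $set: {
      "@timestamp": {
        $toDate: { $trim: { input: "$@timestamp", chars: '"' } }
      }
  } } ]
)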