
logstash - Unable to parse timestamp


I have the following JSON log (newline-separated):

{
  "logName": "projects/gg-sanbox/logs/appengine.googleapis.com%2Fnginx.request",
  "timestamp": "2018-04-02 22:26:02.869 UTC",
  "receiveTimestamp": "2018-04-02 22:28:06.742394 UTC"
}

and the following Logstash config:

    input {
      file {
        type => "json"
        path => "/logs/mylogs.log"
        codec => json
        start_position => "beginning"
        sincedb_path => "/dev/null"
      }
    }

    filter {
      json {
        source => "message"
      }
      date {
        match => [ "timestamp", "yyyy-MM-dd HH:mm:ss.SSS Z" ]
      }
    }

    output {
      stdout {
        #codec => rubydebug
      }
      elasticsearch {
        codec => "json_lines"
        hosts => ["127.0.0.1:9200"]
        # document_id => "%{logstash_checksum}"
        index => "appengine_nginx-requests"
      }
    }

I am getting the following in the Logstash output:

"@timestamp"=>2018-04-07T15:26:31.857Z, "tags"=>["_dateparsefailure"],

Notice that it's falling back to the current date and time instead of the timestamp in the log line, which is when the event actually occurred and what I want to see on the Kibana timeline.

Not sure what the problem is here.


Solution

  • Take a look at the date filter plugin documentation: the format token Z does not match the literal string UTC, because Z expects a numeric timezone offset (for example +0000), not a zone name. To match the literal UTC suffix, quote it in the pattern: yyyy-MM-dd HH:mm:ss.SSS 'UTC' (see the corrected filter sketched below).

    To answer your other question about the precision of the seconds: the date filter simply does not support any precision finer than milliseconds. From the documentation linked above:

    Maximum precision is milliseconds (SSS). Beyond that, zeroes are appended.
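
For reference, here is a minimal sketch of the corrected filter block. The quoted 'UTC' literal is the only change to the original timestamp pattern; the second date filter for receiveTimestamp is an illustrative assumption (target stores the parsed value back into that field, and digits beyond milliseconds are dropped per the note above):

    filter {
      json {
        source => "message"
      }
      date {
        # Quote UTC so it is matched as a literal string instead of
        # being read as the Z (numeric offset) token.
        match => [ "timestamp", "yyyy-MM-dd HH:mm:ss.SSS 'UTC'" ]
      }
      date {
        # Assumed handling of the microsecond field: precision past
        # SSS is discarded, so the value is truncated to milliseconds.
        match => [ "receiveTimestamp", "yyyy-MM-dd HH:mm:ss.SSSSSS 'UTC'" ]
        target => "receiveTimestamp"
      }
    }

With this pattern the _dateparsefailure tag should disappear and @timestamp should be set from the log line rather than from the ingestion time.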