Tags: elasticsearch, logstash, logstash-grok

datetime parse in ELK


I am trying to parse a log using the ELK stack. The following is a sample log line:

2015-12-11 12:05:24+0530 [process] INFO: process 0.24.5 started 

I am using the following grok filter:

grok {
    match => { "message" => "(?m)%{TIMESTAMP_ISO8601:processdate}\s+\[%{WORD:name}\]\s+%{LOGLEVEL:loglevel}" }
}
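
As a rough sanity check, the pattern's behavior on the sample line can be approximated with a plain regex. Note these are simplified stand-ins for grok's built-in `%{TIMESTAMP_ISO8601}`, `%{WORD}`, and `%{LOGLEVEL}` definitions, not the exact patterns grok uses:

```python
import re

# Simplified stand-ins for the grok patterns used above (just enough
# to match this particular log format, not the full grok definitions).
pattern = re.compile(
    r"(?P<processdate>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}[+-]\d{4})\s+"
    r"\[(?P<name>\w+)\]\s+"
    r"(?P<loglevel>[A-Z]+)"
)

line = "2015-12-11 12:05:24+0530 [process] INFO: process 0.24.5 started"
m = pattern.match(line)
print(m.group("processdate"))  # 2015-12-11 12:05:24+0530
print(m.group("name"))         # process
print(m.group("loglevel"))     # INFO
```

The key point is that the captured `processdate` string ends in a numeric timezone offset (`+0530`), which is what the Elasticsearch date format must account for.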

and my Elasticsearch mapping is:

{
    "properties": {
        "processdate":{
            "type":   "date",
            "format" : "yyyy-MM-dd HH:mm:ss+SSSS"
        },
        "name":{"type" : "string"},
        "loglevel":{"type" : "string"},
    }
}

But while loading into Elasticsearch, I am getting the error below:

 "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [processdate]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"2015-12-11 12:05:39+0530\" is malformed at \" 12:05:39+0530\""}}}}, :level=>:warn}

How do I modify this to a proper date format? I have already added a date format to the Elasticsearch mapping.

Update: here is the output of localhost:9200/log:

{"log":{"aliases":{},"mappings":{"filelog":{"properties":{"processdate":{"type":"date","format":"yyyy-MM-dd' 'HH:mm:ssZ"},"loglevel":{"type":"string"},"name":{"type":"string"}}}},"settings":{"index":{"creation_date":"1458218007417","number_of_shards":"5","number_of_replicas":"1","uuid":"_7ffuioZS7eGBbFCDMk7cw","version":{"created":"2020099"}}},"warmers":{}}}

Solution

  • The error you're getting means that your date format is wrong. Fix it by using Z (the timezone offset) at the end instead of +SSSS (which reads as a literal + followed by fractional seconds):

    {
        "properties": {
            "processdate":{
                "type":   "date",
                "format" : "yyyy-MM-dd HH:mm:ssZ"
            },
            "name":{"type" : "string"},
            "loglevel":{"type" : "string"}
        }
    }
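
    The Joda-style Z in the mapping matches a numeric offset such as +0530. As a quick illustration of the difference between the two formats (using Python's strptime, where %z plays roughly the same role as Joda's Z; this is only a sketch of the parsing logic, not how Elasticsearch parses dates internally):

    ```python
    from datetime import datetime

    ts = "2015-12-11 12:05:24+0530"

    # Joda "yyyy-MM-dd HH:mm:ssZ" corresponds roughly to strptime
    # "%Y-%m-%d %H:%M:%S%z": the trailing Z / %z consumes the "+0530" offset.
    dt = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S%z")
    print(dt.isoformat())  # 2015-12-11T12:05:24+05:30

    # The original "yyyy-MM-dd HH:mm:ss+SSSS" instead expects a literal '+'
    # followed by fractional seconds, so the offset "+0530" cannot be parsed
    # that way -- which is exactly what the mapper_parsing_exception reports.
    ```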
    

    Also, according to our earlier exchange, your elasticsearch output plugin is missing the document_type setting. Configure it like this instead so that your custom filelog mapping type is used (otherwise the default logs type is used and your custom mapping never kicks in):

    output {
        elasticsearch {
            hosts => ["172.16.2.204:9200"] 
            index => "log" 
            document_type => "filelog" 
        } 
    }