Tags: date, timestamp, logstash, logstash-grok

Logstash - Custom Timestamp Error


I am trying to feed a timestamp field into Logstash and I am getting a _dateparsefailure message.

My message:

2014-08-01;11:00:22.123

Pipeline file:

input {
  stdin { }
  # beats {
  #   port => "5043"
  # }
}
# optional.
filter {
  date {
    locale => "en"
    match => ["message", "YYYY-MM-dd;HH:mm:ss.SSS"]
    target => "@timestamp"
    add_field => { "debug" => "timestampMatched" }
  }
}
output {
  elasticsearch {
    hosts => [ "127.0.0.1:9200" ]
  }
  stdout { codec => rubydebug }
}

Can someone tell me what I am missing?

Update 1

I referred to the link "How to remove trailing newline from message field" and now it works.
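For anyone else hitting this, the gist of that fix (as I understand it) is to strip the trailing newline that stdin appends to the message before the date filter runs, for example with a mutate filter along these lines:

filter {
  mutate {
    # remove leading/trailing whitespace, including the newline added by stdin
    strip => ["message"]
  }
}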

But in my log message I have multiple values other than the timestamp:

<B 2014-08-01;11:00:22.123 Field1=Value1 Field2=Value2

When I give this as input, it does not work. How do I read just a part of the log line and use it as the timestamp?
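For reference, one way to pull just the timestamp portion out of a line like this is a grok filter with a custom capture; timestamp1 is simply the field name I chose, and the date filter then parses that field instead of the whole message. This is only a sketch of that idea (see Update 2 below for what I actually ended up doing):

filter {
  grok {
    # capture the date;time token into its own field
    match => { "message" => "(?<timestamp1>%{YEAR}-%{MONTHNUM}-%{MONTHDAY};%{TIME})" }
  }
  date {
    locale => "en"
    match => ["timestamp1", "YYYY-MM-dd;HH:mm:ss.SSS"]
    target => "@timestamp"
  }
}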

Update 2

It works now.

I changed the config file as below:

filter {
  kv { }
  mutate {
    strip => "message"
  }
  date {
    locale => "en"
    match => ["timestamp1", "YYYY-MM-dd;HH:mm:ss.SSS"]
    target => "@timestamp"
    add_field => { "debug" => "timestampMatched" }
  }
}

Solution

  • I am posting the answer below, along with the steps I used to solve the issue, so that I can help people like me.

    Step 1 - I read the message in the form of key and value pairs.

    Step 2 - I trimmed off the extra space that was causing the parse exception.

    Step 3 - I read the timestamp value and the other values into their respective fields.

    input {
      beats {
        port => "5043"
      }
    }
    # optional.
    filter {
      kv { }
      date {
        match => [ "timestamp", "yyyy-MM-dd HH:mm:ss,SSS" ]
        remove_field => [ "timestamp" ]
      }
    }
    output {
      elasticsearch {
        hosts => [ "127.0.0.1:9200" ]
      }
    }
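
    While testing, I found it easier to run the same filter with stdin/stdout (rubydebug) so I could paste sample lines and see the parsed event immediately; this is just a debugging sketch of the config above (I also kept the strip from Update 2, since stdin appends a trailing newline), and you can switch back to beats/elasticsearch once the fields look right:

    input {
      stdin { }
    }
    filter {
      kv { }
      # stdin adds a trailing newline to message, so strip it before parsing
      mutate { strip => "message" }
      date {
        match => [ "timestamp", "yyyy-MM-dd HH:mm:ss,SSS" ]
        remove_field => [ "timestamp" ]
      }
    }
    output {
      stdout { codec => rubydebug }
    }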