Tags: elasticsearch, logstash, logstash-grok

ElasticSearch - not setting the date type


I am trying the ELK stack, and so far so good :)

I have run into a strange situation regarding parsing the date field and sending it to ElasticSearch. I manage to parse the field, and it really does get created in ElasticSearch, but it always ends up as a string. I have tried many different combinations, and many different things that people suggested, but I still fail.

This is my setup:

The log lines that come from Filebeat:

[2017-04-26 09:40:33] security.DEBUG: Stored the security token in the session. {"key":"securitysecured_area"} []

[2017-04-26 09:50:42] request.INFO: Matched route "home_logged_in". {"route_parameters":{"controller":"AppBundle\Controller\HomeLoggedInController::showAction","locale":"de","route":"homelogged_in"},"request_uri":"https://qa.someserver.de/de/home"} []

The logstash parsing section:

if [@metadata][type] == "feprod" or [@metadata][type] == "feqa" {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:logdate}" }
  }
  date {
    #timezone => "Europe/Berlin"
    match => [ "logdate", "yyyy-MM-dd HH:mm:ss" ]
  }
}

According to the documentation, my @timestamp field should be overwritten with the logdate value, but that is not happening.

In ElasticSearch I can see that the logdate field is being created and has the value 2017-04-26 09:40:33, but its type is string.

I always create the index from scratch: I delete it first and let Logstash populate it.

I need either @timestamp overwritten with the actual log date (not the date when the event was indexed), or the logdate field created with the date type. Either one is fine.
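As a side note on that second option: the date filter has a `target` option (it defaults to `@timestamp`), so a variant like the following sketch would write the parsed timestamp back into logdate as a proper date field instead of overwriting @timestamp:

```
date {
    match  => [ "logdate", "yyyy-MM-dd HH:mm:ss" ]
    # write the parsed date into logdate itself instead of @timestamp
    target => "logdate"
}
```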


Solution

  • Unless you are explicitly adding [@metadata][type] somewhere that you aren't showing, that is your problem. It is not set by default; [type] is what gets set from the `type =>` parameter on your input.

    You can validate this with a minimal complete example:

    input {
        stdin {
            type => 'feprod'
        }
    }
    filter {
        if [@metadata][type] == "feprod" or [@metadata][type] == "feqa"{
            grok {
                match => { "message" => "%{TIMESTAMP_ISO8601:logdate}" }
            }
            date {
                match => [ "logdate", "yyyy-MM-dd HH:mm:ss"]
            }
        }
    }
    
    output {
        stdout { codec => "rubydebug" }
    }
    

    And running it:

    echo '[2017-04-26 09:40:33] security.DEBUG: Stored the security token in the session. {"key":"securitysecured_area"} []' | bin/logstash -f test.conf
    

    And getting the output:

    {
        "@timestamp" => 2017-05-02T15:15:05.875Z,
          "@version" => "1",
              "host" => "xxxxxxxxx",
           "message" => "[2017-04-26 09:40:33] security.DEBUG: Stored the security token in the session. {\"key\":\"securitysecured_area\"} []",
              "type" => "feprod",
              "tags" => []
    }
    

    If you use just `if [type] == ...` instead, it will work fine:

    {
        "@timestamp" => 2017-04-26T14:40:33.000Z,
           "logdate" => "2017-04-26 09:40:33",
          "@version" => "1",
              "host" => "xxxxxxxxx",
           "message" => "[2017-04-26 09:40:33] security.DEBUG: Stored the security token in the session. {\"key\":\"securitysecured_area\"} []",
              "type" => "feprod",
              "tags" => []
    }
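Putting it together, the fix for the original configuration is simply to test [type] instead of [@metadata][type] (a sketch, keeping the question's grok and date filters unchanged):

```
filter {
    if [type] == "feprod" or [type] == "feqa" {
        grok {
            match => { "message" => "%{TIMESTAMP_ISO8601:logdate}" }
        }
        date {
            # overwrites @timestamp with the parsed log date by default
            match => [ "logdate", "yyyy-MM-dd HH:mm:ss" ]
        }
    }
}
```

Alternatively, you could keep the original conditional and populate the metadata field yourself, e.g. with `add_field => { "[@metadata][type]" => "feprod" }` on the input, but testing [type] directly is simpler.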