Tags: elasticsearch, logstash, logstash-grok, logstash-configuration

Index based on the value of a converted unix timestamp in logstash


I need to index records based on the value of a UNIX timestamp (seconds since epoch) field, but the index name needs to be in the form YYYY-MM-dd. The value of the timestamp field itself must remain a UNIX timestamp.

So the question is two parts.

  1. How can I convert the timestamp field to YYYY-MM-dd without destroying the UNIX timestamp?
  2. How can I apply the YYYY-MM-dd value to the index?

Here is what I have so far.

input {
    tcp {
            port => 5000
    }
}

filter {
    csv {
            separator => "  " # <- this white space is actually a tab
            skip_empty_columns => true
            # other fields omitted for brevity
            columns => ["timestamp"]
    }

    #grok {
    #       match => {"indexdate" => "%{NUMBER:timestamp}"}
    #}

    date {
            match => ["timestamp", "YYYY-MM-DD"]
            target => "@timestamp"
    }

    mutate {
            # omitted for brevity
            remove_field => []
    }
}

output {
    elasticsearch {
            hosts => "elasticsearch:9200"
            index => "indexname-%{+YYYY-MM-dd}"
    }
}

I have made several different grok attempts, but no luck so far.


Solution

  • Simply change your date filter as shown below and it should work. The pattern you use in the date filter must match the format of the field you're parsing; since you have epoch seconds, the pattern should be UNIX:

    date {
            match => ["timestamp", "UNIX"]
            target => "@timestamp"
    }
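
    Two details make this answer both parts of your question: the date filter only reads the source field (your timestamp field is left untouched), and the %{+YYYY-MM-dd} sprintf reference in the elasticsearch output is always rendered from @timestamp. As an illustration, assuming an incoming row whose timestamp column contains 1491228000, the resulting event would look something like this (rubydebug-style, other fields omitted):

         "timestamp" => 1491228000,               # original epoch value, untouched
        "@timestamp" => 2017-04-03T14:00:00.000Z  # parsed from epoch seconds

    and the event would be routed to the index indexname-2017-04-03.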
    

    Note that you might need to add a convert setting in your csv filter to make sure the timestamp field is parsed as an integer:

    csv {
       ...
       convert => { "timestamp" => "integer" }
    }
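
    Putting it together, here is a minimal sketch of the corrected filter section, keeping your tcp input and elasticsearch output exactly as they are (indexname-%{+YYYY-MM-dd} already derives the daily index name from @timestamp):

    filter {
        csv {
                separator => "  " # <- this white space is actually a tab
                skip_empty_columns => true
                # other columns omitted for brevity, as in the question
                columns => ["timestamp"]
                # ensure the field is numeric before the date filter parses it
                convert => { "timestamp" => "integer" }
        }

        # parse epoch seconds into @timestamp; the timestamp field itself is unchanged
        date {
                match => ["timestamp", "UNIX"]
                target => "@timestamp"
        }
    }

    No grok filter is needed for this; the csv and date filters handle the whole conversion. You can verify that daily indices are being created with curl 'http://elasticsearch:9200/_cat/indices/indexname-*?v'.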