Tags: elasticsearch, logstash, logstash-grok, filebeat

Add extra value to field before sending to Elasticsearch


I'm using Logstash, Filebeat and grok to send data from logs to my Elasticsearch instance. This is the grok configuration in the pipeline:

filter {
    grok {
        match => {
            "message" => "%{SYSLOGTIMESTAMP:messageDate} %{GREEDYDATA:messagge}"
        }
    }
}

This works fine; the issue is that messageDate is in the format Jan 15 11:18:25, which has no year entry.
Now, I actually know the year these files were created in, and I was wondering if it is possible to add that value to the field during processing, that is, somehow turn Jan 15 11:18:25 into 2016 Jan 15 11:18:25 before sending to Elasticsearch (obviously without editing the files, which I could do, and even with ease, but that would be a temporary fix for what I have to do and not a definitive solution).

I have tried googling whether this is possible, but with no luck.


Solution

  • By reading the grok documentation thoroughly, I found what Google couldn't find for me, and which I apparently missed the first time I read that page:

    https://www.elastic.co/guide/en/logstash/current/plugins-filters-grok.html#plugins-filters-grok-add_field

    Using the add_field and remove_field options I managed to add the year to my date; I then used the date plugin to set it as the event's timestamp. My filter configuration now looks like this:

    filter {
        grok {
            match => {
                "message" => "%{SYSLOGTIMESTAMP:tMessageDate} %{GREEDYDATA:messagge}"
            }
            # add_field and remove_field are grok options, applied after a successful match
            add_field => { "messageDate" => "2016 %{tMessageDate}" }
            remove_field => ["tMessageDate"]
        }
        date {
            # parse the rebuilt date string into the event @timestamp
            match => [ "messageDate", "YYYY MMM dd HH:mm:ss" ]
        }
    }
    

    And it worked fine.
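
    For reference, the same result could likely be achieved without the temporary tMessageDate field, by capturing messageDate directly and prepending the year with a mutate filter's replace option. This is only a minimal alternative sketch, using the field names from the question and the year 2016 that is assumed known for these files:

    filter {
        grok {
            match => {
                "message" => "%{SYSLOGTIMESTAMP:messageDate} %{GREEDYDATA:messagge}"
            }
        }
        mutate {
            # prepend the known year, e.g. "Jan 15 11:18:25" -> "2016 Jan 15 11:18:25"
            replace => { "messageDate" => "2016 %{messageDate}" }
        }
        date {
            # parse the completed string into the event @timestamp
            match => [ "messageDate", "YYYY MMM dd HH:mm:ss" ]
        }
    }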