logstash · kibana · logstash-grok · filebeat

Using grok to create fields from a sample log


FMT="1358  15:41:07W19/03/21 (A) Interlocking Link 116 Restored" STY="A" AMSEQ="LINKFAIL" AMSST="RTN" ALTID="1358" TS="20210319154107" CP="LOC A" CP="LOC X" MP="104.95" MP="104.95" EQ="MDIPRIMARYOFF" POS="TC-NORTH"

The log format is as above. I would like to capture the following fields using grok:

Time - 15:41:07
Date - 19/03/21
Message - Interlocking Link 116 Restored
Location - Loc X

Can anyone help with creating a grok pattern that I can use in my logstash filter to parse these logs?


Solution

  • I would not start with grok. This is key/value data, so a kv filter will get you started; after that you can grok the parts of the FMT field out.

        kv { include_keys => [ "FMT", "CP" ] target => "[@metadata]" }
        mutate { add_field => { "Location" => "%{[@metadata][CP][1]}" } }
        grok { match => { "[@metadata][FMT]" => "%{NUMBER}\s+%{TIME:Time}W%{DATE_EU:Date} \(%{WORD}\) %{GREEDYDATA:Message}" } }
    

    will result in

       "Message" => "Interlocking Link 116 Restored",
          "Date" => "19/03/21",
          "Time" => "15:41:07",
      "Location" => "LOC X",
    
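    Putting the pieces together, the filter section of the pipeline might look like the sketch below. The remove_field option is an optional addition here, dropping the raw [message] field once it has been parsed successfully:

    ```
    filter {
      kv {
        include_keys => [ "FMT", "CP" ]
        target => "[@metadata]"
        remove_field => [ "message" ]
      }
      mutate { add_field => { "Location" => "%{[@metadata][CP][1]}" } }
      grok {
        match => { "[@metadata][FMT]" => "%{NUMBER}\s+%{TIME:Time}W%{DATE_EU:Date} \(%{WORD}\) %{GREEDYDATA:Message}" }
      }
    }
    ```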

    Although relying on the order of multiple CP fields feels fragile.

    The include_keys option on the kv filter tells it to ignore all other keys. Using target to put the fields under [@metadata] makes them available to other filters without being sent to the output. The remove_field option on the kv filter is only processed if the filter succeeds in parsing the message, so if your kv data is invalid you will still have a [message] field on the event that you can look at.
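    If you want to sanity-check the parsing logic outside Logstash, here is a small Python sketch of what the kv + grok combination does. The function and variable names are my own, and the regexes are plain-regex stand-ins for the NUMBER, TIME, DATE_EU and GREEDYDATA grok patterns:

    ```python
    import re

    # Sample event from the question
    LOG = ('FMT="1358  15:41:07W19/03/21 (A) Interlocking Link 116 Restored" '
           'STY="A" AMSEQ="LINKFAIL" AMSST="RTN" ALTID="1358" TS="20210319154107" '
           'CP="LOC A" CP="LOC X" MP="104.95" MP="104.95" EQ="MDIPRIMARYOFF" '
           'POS="TC-NORTH"')

    def parse_kv(line):
        """Collect key="value" pairs; duplicate keys become lists,
        mimicking how the kv filter handles repeated keys like CP."""
        fields = {}
        for key, value in re.findall(r'(\w+)="([^"]*)"', line):
            if key in fields:
                if not isinstance(fields[key], list):
                    fields[key] = [fields[key]]
                fields[key].append(value)
            else:
                fields[key] = value
        return fields

    fields = parse_kv(LOG)
    location = fields["CP"][1]  # second CP value, i.e. [@metadata][CP][1]

    # Stand-in for: %{NUMBER}\s+%{TIME:Time}W%{DATE_EU:Date} \(%{WORD}\) %{GREEDYDATA:Message}
    m = re.match(r'\d+\s+(\d{2}:\d{2}:\d{2})W(\d{2}/\d{2}/\d{2}) \(\w+\) (.*)',
                 fields["FMT"])
    time, date, message = m.groups()
    ```

    Running this yields Time "15:41:07", Date "19/03/21", Message "Interlocking Link 116 Restored" and Location "LOC X", matching the output above.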