elasticsearch, logstash, logstash-grok

Logstash grok issue while filtering data


I have data that's basically for data deletion via the rm command; it looks as follows:

ttmv516,19/05/21,03:59,00-mins,dvcm,dvcm 166820 4.1 0.0 4212 736 ? DN 03:59 0:01 rm -rf /dv/project/agile/mce_dev_folic/test/install.asan/install,/dv/svgwwt/commander/workspace4/dvfcronrun_IL-SFV-RHEL6.5-K4_kinite_agile_invoke_dvfcronrun_at_given_site_50322

I'm using the Logstash grok pattern below on this data. It was working fine, but recently I've been seeing two weird issues: 1) _grokparsefailure tags, and 2) the Hostname field not appearing correctly, i.e. its initial characters are missing, so ttmv516 appears as mv516.

%{HOSTNAME:Hostname},%{DATE:Date},%{HOUR:dt_h}:%{MINUTE:dt_m},%{NUMBER:duration}-%{WORD:hm},%{USER:User},%{USER:User_1} %{NUMBER:Pid} %{NUMBER:float} %{NUMBER:float} %{NUMBER:Num_1} %{NUMBER:Num_2} %{DATA} (?:%{HOUR:dt_h1}:|)(?:%{MINUTE:dt_m1}|) (?:%{HOUR:dt_h2}:|)(?:%{MINUTE:dt_m2}|)%{GREEDYDATA:CMD},%{GREEDYDATA:PWD_PATH}

However, when I test the same pattern with the Grok Debugger in Kibana, the data appears correctly.


My Logstash configuration file is as follows.

cat /etc/logstash/conf.d/rmlog.conf
input {
  file {
    path => [ "/data/rm_logs/*.txt" ]
    start_position => beginning
    sincedb_path => "/data/registry-1"
    max_open_files => 64000
    type => "rmlog"
  }
}

filter {
  if [type] == "rmlog" {
    grok {
     match => { "message" => "%{HOSTNAME:Hostname},%{DATE:Date},%{HOUR:dt_h}:%{MINUTE:dt_m},%{NUMBER:duration}-%{WORD:hm},%{USER:User},%{USER:User_1} %{NUMBER:Pid} %{NUMBER:float} %{NUMBER:float} %{NUMBER:Num_1} %{NUMBER:Num_2} %{DATA} (?:%{HOUR:dt_h1}:|)(?:%{MINUTE:dt_m1}|) (?:%{HOUR:dt_h2}:|)(?:%{MINUTE:dt_m2}|)%{GREEDYDATA:CMD},%{GREEDYDATA:PWD_PATH}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      remove_field => [ "@version", "host", "message", "_type", "_index", "_score" ]
    }
  }
}
output {
  if [type] == "rmlog" {
    elasticsearch {
      hosts => ["myhost.xyz.com:9200"]
      manage_template => false
      index => "pt-rmlog-%{+YYYY.MM.dd}"
    }
  }
}

Any help or suggestion would be highly appreciated.

EDIT:

Messages on which it's failing, as per my observation:

ttmv540,19/05/21,03:59,00-hrs,USER,USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND,/local/ntr/ttmv540.373
ttmv541,19/05/21,03:43,-mins,USER,USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND,/local/ntr/ttmv541.373

However, I have tried to edit the grok with the condition below, but it still drops a few fields:

input {
  file {
    path => [ "/data/rm_logs/*.txt" ]
    start_position => beginning
    max_open_files => 64000
    sincedb_path => "/data/registry-1"
    type => "rmlog"
  }
}
filter {
  if [type] == "rmlog" {
    grok {
     match => { "message" => "%{HOSTNAME:hostname},%{DATE:date},%{HOUR:time_h}:%{MINUTE:time_m},%{NUMBER:duration}-%{WORD:hm},%{USER:user},%{USER:group} %{NUMBER:pid} %{NUMBER:float} %{NUMBER:float} %{NUMBER:num_1} %{NUMBER:num_2} %{DATA} (?:%{HOUR:time_h1}:|)(?:%{MINUTE:time_m1}|) (?:%{HOUR:time_h2}:|)(?:%{MINUTE:time_m2}|)%{GREEDYDATA:cmd},%{GREEDYDATA:pwd}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      remove_field => [ "@version", "host", "message", "_type", "_index", "_score" ]
   }
  }
  if "_grokparsefailure" in [tags] {
    grok {
      match => { "message" => "%{HOSTNAME:hostname},%{DATE:date},%{HOUR:time_h}:%{MINUTE:time_m},-%{WORD:duration},%{USER:user},%{USER:group}%{GREEDYDATA:cmd}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      remove_field => [ "@version", "host", "message", "_type", "_index", "_score" ]
    }
  }
}
output {
  if [type] == "rmlog" {
    elasticsearch {
      hosts => ["myhost.xyz.com:9200"]
      manage_template => false
      index => "pt-rmlog-%{+YYYY.MM.dd}"
    }
  }
}

Note: It looks like the _grokparsefailure fallback works on one of the messages below but still fails on the other.

1) This works:

ttmv541,19/05/21,03:43,-mins,USER,USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND,/local/ntr/ttmv541.373

2) This second line still fails, because it has the number 00 attached to -hrs, and I can't get the grok below to meet both conditions:

ttmv540,19/05/21,03:59,00-hrs,USER,USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND,/local/ntr/ttmv540.373

%{HOSTNAME:hostname},%{DATE:date},%{HOUR:time_h}:%{MINUTE:time_m},-%{WORD:duration},%{USER:user},%{USER:group}%{GREEDYDATA:cmd}

Solution

  • I'd split the processing into two parts, one to deal with the hostname and timestamp issues, and one to handle the rest of the row. I find this makes maintenance easier.

    So you're left with these two inputs:

    ttmv541,19/05/21,03:43,-mins
    ttmv540,19/05/21,03:59,00-hrs
    

    Your two patterns will match the first pieces well, so the issue is how you want to parse out the stuff after the time. In your original pattern, you were using duration for the numeric part and hm for the units. In your second pattern, you seem to put the units into duration, which probably isn't right.

    Without more information, it looks like the duration is optional, but you'll always have the units. That can be reflected in your pattern, e.g.:

    (%{NUMBER:duration})?-%{WORD:hm}
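
    With that tweak the prefix of both sample lines parses. As a rough sketch, reusing the field names from your second config:

    %{HOSTNAME:hostname},%{DATE:date},%{HOUR:time_h}:%{MINUTE:time_m},(%{NUMBER:duration})?-%{WORD:hm}

    Against ttmv540,19/05/21,03:59,00-hrs this captures duration => "00" and hm => "hrs"; against ttmv541,19/05/21,03:43,-mins the optional group simply doesn't match and you still get hm => "mins".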
    

    Also note that if you ever end up needing multiple patterns, you don't have to depend on the _grokparsefailure tag to use them; match => "message" can take an array of patterns. See the docs for an example.
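
    For instance, here is a rough sketch of your second filter collapsed into a single grok, with both of your patterns supplied as an array and the optional-duration tweak applied to the fallback pattern (untested against your full data set, so treat it as a starting point rather than a drop-in replacement):

    grok {
      match => {
        "message" => [
          "%{HOSTNAME:hostname},%{DATE:date},%{HOUR:time_h}:%{MINUTE:time_m},%{NUMBER:duration}-%{WORD:hm},%{USER:user},%{USER:group} %{NUMBER:pid} %{NUMBER:float} %{NUMBER:float} %{NUMBER:num_1} %{NUMBER:num_2} %{DATA} (?:%{HOUR:time_h1}:|)(?:%{MINUTE:time_m1}|) (?:%{HOUR:time_h2}:|)(?:%{MINUTE:time_m2}|)%{GREEDYDATA:cmd},%{GREEDYDATA:pwd}",
          "%{HOSTNAME:hostname},%{DATE:date},%{HOUR:time_h}:%{MINUTE:time_m},(%{NUMBER:duration})?-%{WORD:hm},%{USER:user},%{USER:group}%{GREEDYDATA:cmd}"
        ]
      }
    }

    Grok tries the patterns in order and stops at the first one that matches (break_on_match defaults to true), so keep the more detailed pattern first.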