linux, elasticsearch, logstash, kibana, filebeat

Logstash config with filebeat issue when using both beats and file input


I am trying to configure Filebeat with Logstash. At the moment Filebeat is shipping to Logstash successfully, but I am running into issues when creating multiple conf files in Logstash. Currently I have one beats input, which looks like this:

input {
  beats {
    port => 5044
  }
}
filter {
}
output {
  if [@metadata][pipeline] {
    elasticsearch {
      hosts => ["localhost:9200"]
      manage_template => false
      index => "systemsyslogs"
      pipeline => "%{[@metadata][pipeline]}"
    }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]
      manage_template => false
      index => "systemsyslogs"
    }
  }
}
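
For reference, on the Filebeat side the output simply points at that beats port; a minimal sketch of the relevant filebeat.yml section (localhost is an assumption, Filebeat may well run on another host):

# filebeat.yml (sketch): ship harvested lines to the Logstash beats input above
output.logstash:
  hosts: ["localhost:5044"]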

And a second Logstash config that uses the file input, which looks like this:

input {
  file {
    path => "/var/log/foldername/number.log"
    start_position => "beginning"
  }
}
filter {
  grok {
    match => { "message" => "%{WORD:username} %{INT:number} %{TIMESTAMP_ISO8601:timestamp}" }
  }
}
output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "numberlogtest"
  }
}
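
For reference, a line of the following shape (made-up values) is what that grok pattern expects, yielding username, number, and timestamp fields:

john 1234 2023-02-15T12:42:41.077Z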

The grok filter itself is working: I managed to create two index patterns in Kibana and can view the data correctly. The problem is that when I run Logstash with both configs applied, it fetches the data from number.log multiple times and the logstash-plain logs fill up with warnings, which uses a lot of computing resources and pushes the CPU over 80% (this is an Oracle instance). If I remove the file config from Logstash, the system runs properly. I can run Logstash with either of these config files individually, just not with both at once.

I already added an exclusion in the filebeat config:

 exclude_files:
  - /var/log/foldername/*.log
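
As far as I understand, exclude_files takes a list of regular expressions rather than shell globs, so the pattern probably needs to look more like this (same example folder as above):

 exclude_files:
  - '/var/log/foldername/.*\.log$'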

Logstash plain logs when running both config files:

[2023-02-15T12:42:41,077][WARN ][logstash.outputs.elasticsearch][main][39aca10fa204f31879ff2b20d5b917784a083f91c2eda205baefa6e05c748820] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"numberlogtest", :routing=>nil}, {"service"=>{"type"=>"system"}
"caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:607"}}}}}

Solution

  • Fixed by creating a single Logstash config with both inputs. When Logstash loads several config files into the same pipeline, it concatenates them, so every event from every input runs through all of the filters and outputs; that is how beats events (with their service object) ended up in the numberlogtest index and triggered the mapping warning above, and why the file events were read and indexed more than once. With a single file, conditionals route each input to its own filter and output:

    input {
      beats {
        port => 5044
      }
      file {
        path => "**path**"
        start_position => "beginning"
      }
    }
    
    filter {
      if [path] == "**path**" {
        grok {
          match => { "message" => "%{WORD:username} %{INT:number} %{TIMESTAMP_ISO8601:timestamp}" }
        }
      }
    }
    
    output {
      if [@metadata][pipeline] {
        elasticsearch {
          hosts => ["localhost:9200"]
          manage_template => false
          index => "index1"
          pipeline => "%{[@metadata][pipeline]}"
        }
      } else {
        if [path] == "**path**" {
          elasticsearch {
            hosts => ["localhost:9200"]
            manage_template => false
            index => "index2"
          }
        } else {
          elasticsearch {
            hosts => ["localhost:9200"]
            manage_template => false
            index => "index1"
          }
        }
      }
    }
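
  • An alternative that would also stop the configs from being merged (a sketch, not what I used here; the pipeline ids and file paths are assumptions) is to declare each config file as its own pipeline in pipelines.yml, so each input only reaches its own filters and outputs and no conditionals are needed:

    # /etc/logstash/pipelines.yml (sketch) - one pipeline per config file
    - pipeline.id: beats
      path.config: "/etc/logstash/conf.d/beats.conf"
    - pipeline.id: numberlog
      path.config: "/etc/logstash/conf.d/numberlog.conf"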