Tags: elasticsearch, logstash, elastic-stack, filebeat

How to prevent Filebeat from re-sending old logs to Logstash?


I am using Filebeat to collect logs from a remote server and ship them to Logstash, and that part works fine. But whenever new lines are appended to the source log file, Filebeat re-reads the file from the beginning and sends everything to Logstash, and Logstash then indexes all of it into Elasticsearch alongside the older entries. Since those older logs are already in Elasticsearch, every log line ends up duplicated.

So my question is: how do I ship only the newly appended log lines to Logstash? When new lines are added to the log file, only those new lines should go through Logstash into Elasticsearch.

Here is my filebeat.yml

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /home/user/Documents/ELK/locals/*.log

And here is the Logstash input, logstash-input.conf:

input {
  beats {
    port => 5044
  }
}

Solution

  • I assume you're making the same mistake I did when testing Filebeat a few months back: manually updating the log/text file with the vi editor. When you edit a file with vi, saving creates a new file on disk with new metadata. Filebeat identifies the state of a file by its metadata (on Linux, the inode and device number), not by its text, so it treats the edited file as brand new and reloads the complete log file.

    If this is the case, append to the file from the shell instead, like this: echo "something" >> /path/to/file.txt (the sketch below shows why this behaves differently).
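    You can check whether this is what is happening by watching the file's inode, which is part of the metadata Filebeat tracks (along with the device number and its read offset, persisted in Filebeat's registry). A minimal sketch, assuming a Linux shell with GNU coreutils and a hypothetical /path/to/file.txt:

    # inode and size Filebeat currently associates with the file
    stat -c 'inode=%i size=%s' /path/to/file.txt

    # appending from the shell keeps the same inode, so Filebeat
    # picks up only the newly added bytes
    echo "something" >> /path/to/file.txt
    stat -c 'inode=%i size=%s' /path/to/file.txt

    # saving from vi typically writes a new file and renames it over
    # the old one; the inode changes, Filebeat no longer recognizes
    # the file, and it re-reads the whole thing from offset 0
    vi /path/to/file.txt
    stat -c 'inode=%i size=%s' /path/to/file.txt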

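Separately, if duplicates that have already been shipped are the main pain point, you can make re-delivery harmless on the Elasticsearch side by deriving each document's ID from the event content, so a re-sent line overwrites itself instead of creating a second document. This is not part of the fix above, just a defensive pattern; a sketch assuming the standard logstash-filter-fingerprint plugin, with placeholder hosts and index values:

filter {
  fingerprint {
    source => "message"
    target => "[@metadata][fingerprint]"
    method => "MURMUR3"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "locals-%{+YYYY.MM.dd}"
    # identical lines hash to the same ID, so a re-shipped event
    # updates the existing document instead of duplicating it
    document_id => "%{[@metadata][fingerprint]}"
  }
}

Note the trade-off: genuinely identical log lines also collapse into one document, so include a timestamp or another distinguishing field in the fingerprint source if that matters for your data.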