Tags: logstash, elastic-stack, filebeat, elk

ELK - How to collect logs when the system outputs one file per log?


Let's say that I have a system that, instead of appending log lines to a given log file, outputs an individual file for each 'log event'. The files follow a common naming pattern but contain a timestamp and another variable parameter.

./log/202104231115012_SYSTEM_FOO.out
./log/202104231116452_SYSTEM_BAR.out
./log/202104231117568_SYSTEM_BAZ.out
./log/202104231120711_SYSTEM_FOO.out

Is there any standard feature or configuration in Filebeat or Logstash to collect these logs, or do I first need another background process (e.g. a bash script) that continuously appends all these files into one global .log file?


Solution

  • You could use a Logstash file input in read mode. Note that in read mode the default behavior is to delete each file after reading it, but you can change that so it only logs that the file was read.

    Filebeat can also read files matched by a wildcard pattern. It is lighter weight than Logstash, so if you are not doing other processing in the pipeline you might prefer it.
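A minimal sketch of the Logstash approach: the file input's `mode => "read"` treats each matched file as complete content rather than tailing it, and `file_completed_action => "log"` overrides the default delete. The path below is hypothetical; adjust it to your layout.

```
input {
  file {
    # Hypothetical path matching the question's naming pattern
    path => "/var/log/myapp/*_SYSTEM_*.out"
    # Read each file to EOF instead of tailing it
    mode => "read"
    # Default in read mode is "delete"; "log" keeps the files and
    # appends each completed filename to file_completed_log_path
    file_completed_action => "log"
    file_completed_log_path => "/var/log/myapp/completed.log"
  }
}
```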
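And a sketch of the equivalent Filebeat configuration, assuming a `log` input (older Filebeat versions; newer ones use `filestream`) and an Elasticsearch output on localhost. Since each file is written once and never appended to, `close_eof` lets the harvester release the file as soon as it reaches the end.

```
filebeat.inputs:
  - type: log
    paths:
      # Hypothetical path matching the question's naming pattern
      - /var/log/myapp/*_SYSTEM_*.out
    # Files are never appended to, so close each harvester at EOF
    close_eof: true

output.elasticsearch:
  hosts: ["localhost:9200"]
```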