Tags: input, logstash, acquia

How can I tell Logstash to read a remote file over SSH?


Using the ELK stack, I have to parse some log files, but they are remote. My solution:

  • rsync over ssh to fetch the remote files locally

My concern is that my Elasticsearch index is growing exponentially (more than 130 MB) whereas the log files are only 25 MB. Is it possible that each rsync cron run (every 5 minutes) leads Logstash to read the whole file again, ignoring the sincedb bookkeeping?
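That growth pattern would be consistent with how sincedb identifies files: Logstash's file input tracks read offsets per inode, and a default rsync replaces the destination file (write to a temp file, then rename), which produces a new inode. The sketch below simulates that bookkeeping in Python (the `read_new_bytes` helper and file names are hypothetical, for illustration only):

```python
import os
import tempfile

def read_new_bytes(path, sincedb):
    """Read only bytes past the recorded offset, keyed by inode (sincedb-style)."""
    key = os.stat(path).st_ino
    offset = sincedb.get(key, 0)
    with open(path, "rb") as f:
        f.seek(offset)
        data = f.read()
    sincedb[key] = offset + len(data)
    return data

d = tempfile.mkdtemp()
path = os.path.join(d, "app.log")
sincedb = {}

# initial sync: 6 bytes read, offset recorded against this inode
with open(path, "wb") as f:
    f.write(b"line1\n")
first = read_new_bytes(path, sincedb)

# simulate rsync's default replace: write a temp file, rename over the old one.
# The rename gives the path a NEW inode, so the recorded offset no longer applies.
tmp = path + ".tmp"
with open(tmp, "wb") as f:
    f.write(b"line1\nline2\n")   # same content plus one new line
os.rename(tmp, path)
second = read_new_bytes(path, sincedb)

print(len(first))   # 6  — only the new bytes
print(len(second))  # 12 — the WHOLE file again, under the new inode
```

If this is indeed the cause, rsync's `--inplace` option, which updates the destination file in place and so keeps its inode, might avoid the repeated full reads; I haven't verified that against Logstash here.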

Thanks for your help :)

For context, I'm using Acquia as the host for a Drupal site, so I have no control over how I can access the log files.

Guillaume Renard


Solution

  • Since I wanted to check Acquia's logs, I tried another way: Acquia's logstream (https://github.com/acquia/logstream) combined with supervisord (http://supervisord.org/introduction.html), and it saved my day.

    ...
    [program:logstream_artecinema_drupal-watchdog]
    ; stream only drupal-watchdog entries from the prod environment
    command=logstream tail prod:artetv prod --no-color --types=drupal-watchdog
    autorestart=true
    redirect_stderr=true
    ; write the stream to a local file that Logstash can tail
    stdout_logfile=/var/log/logstream/artecinema_drupal-watchdog.log
    stdout_logfile_maxbytes=20MB
    stdout_logfile_backups=7
    environment=HOME="/root"
    ...
    

    And my Logstash reads that log file with a file input:

    file {
        path => "/var/log/logstream/artecinema_drupal-watchdog.log"
        start_position => "beginning"
        type => "drupal-watchdog"
        add_field => { "platform" => "cinema" }
    }