Tags: ruby-on-rails-3, logstash, kibana, logstash-forwarder

Logstasher+Kibana : message double quoted and hard to parse


I use this stack:

  • On each front server
    • rails
    • logstasher gem (formats rails log in json)
    • logstash-forwarder (just forwards logs to logstash on central server)
  • On log server:
    • logstash (to centralize and index logs)
    • kibana to display

Kibana works well with the JSON format, but the "message" data arrives as a plain string, not as parsed JSON (see the snippet below). Is there a way to fix this? For example, accessing the status field is a bit tricky.

Here's a message sample

{
  _index: logstash-2014.09.18
  _type: rails
  _id: RHJgU2L_SoOKS79pBzU_mA
  _version: 1
  _score: null
  _source: {
  message: "{"@source":"unknown","@tags":["request"],"@fields":{"method":"GET","path":"/foo/bar","format":"html","controller":"items","action":"show","status":200,"duration":377.52,"view":355.67,"db":7.47,"ip":"123.456.789.123","route":"items#show","request_id":"021ad750600ab99758062de60102da8f"},"@timestamp":"2014-09-18T09:07:31.822782+00:00"}"
  @version: 1
  @timestamp: 2014-09-18T09:08:21.990Z
  type: rails
  file: /home/user/path/logstash_production.log
  host: webserver.example.com
  offset: 23200721
  format: json_event
  }
  sort: [
    rails
  ]
}
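To illustrate the problem: the `message` field holds the Rails event as a JSON *string*, so it has to be parsed a second time before fields like `status` become reachable. A minimal sketch using a trimmed version of the sample above:

```ruby
require 'json'

# Trimmed version of the indexed document: the Rails event is a JSON
# string embedded inside "message", not a nested object.
raw = '{"message":"{\"@tags\":[\"request\"],\"@fields\":{\"status\":200,\"duration\":377.52}}","type":"rails"}'

doc = JSON.parse(raw)                     # first parse: the Logstash event
rails_event = JSON.parse(doc["message"])  # second parse: the embedded Rails log line

puts rails_event["@fields"]["status"]     # => 200
```

This double parse is exactly what Kibana does not do for you, which is why the inner fields are not searchable as individual fields.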

Thanks for your help ;).

EDIT 1: Added the logstash configuration files:

/etc/logstash/conf.d/01-lumberjack-input.conf

input {
  lumberjack {
    port => 5000
    type => "logs"
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
    codec => "json"
  }
}

/etc/logstash/conf.d/10-syslog.conf

filter {
 if [type] == "syslog" {
   grok {
     match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
     add_field => [ "received_at", "%{@timestamp}" ]
     add_field => [ "received_from", "%{host}" ]
   }
   syslog_pri { }
   date {
     match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
   }
 }
}
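If the json codec on the input is not decoding the embedded string, another option is a dedicated json filter for the rails events, which parses the message field into top-level fields. A sketch (the file name 11-rails.conf is just an example, not part of the original setup):

```
# /etc/logstash/conf.d/11-rails.conf
filter {
  if [type] == "rails" {
    json {
      source => "message"
    }
  }
}
```

With this in place, fields such as @fields.status should become directly queryable in Kibana instead of being buried inside the message string.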

/etc/logstash/conf.d/30-lumberjack-output.conf

output {
  elasticsearch { host => localhost }
#  stdout { codec => rubydebug }
}

If useful, here is the logstash-forwarder configuration (/etc/logstash-forwarder on the web servers):

{
  "network": {
    "servers": [ "123.465.789.123:5000" ],
    "timeout": 45,
    "ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt"
  },
  "files": [
    {
      "paths": [
        "/var/log/messages",
        "/var/log/secure"
       ],
      "fields": { "type": "syslog" }
    },
    {
      "paths": [
        "/home/xnxx/gportal/shared/log/logstash_production.log"
      ],
      "fields": { "type": "rails", "format": "json_event" }
    }
  ]
}

My config files are mainly inspired by this tutorial: https://www.digitalocean.com/community/tutorials/how-to-use-logstash-and-kibana-to-centralize-and-visualize-logs-on-ubuntu-14-04


Solution

  • The fix in the end was to stop and then start logstash; with a plain restart, the configuration did not seem to be reloaded.

    So instead of:

    sudo service logstash restart

    I did:

    sudo service logstash stop

    wait for ~1 minute, then

    sudo service logstash start

    I don't really understand the reason (the init script does the same thing, but without waiting a minute), but it worked for me.