Tags: elasticsearch, logstash, logstash-forwarder, logstash-logback-encoder

Logstash logback encoder, logstash forwarder and logstash


Following the advice in https://blog.codecentric.de/en/2014/10/log-management-spring-boot-applications-logstash-elastichsearch-kibana/, I set up the logstash-logback-encoder and logstash-forwarder to push everything to my logstash daemon and finally index everything in Elasticsearch.

Here is my configuration:

logstash.xml

<included>
    <include resource="org/springframework/boot/logging/logback/base.xml"/>

    <property name="FILE_LOGSTASH" value="${LOG_FILE:-${LOG_PATH:-${LOG_TEMP:-${java.io.tmpdir:-/tmp}}/}spring.log}.json"/>
    <appender name="LOGSTASH"
              class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>${FILE_LOGSTASH}</file>
        <rollingPolicy class="ch.qos.logback.core.rolling.FixedWindowRollingPolicy">
            <fileNamePattern>${FILE_LOGSTASH}.%i</fileNamePattern>
        </rollingPolicy>
        <triggeringPolicy
            class="ch.qos.logback.core.rolling.SizeBasedTriggeringPolicy">
            <MaxFileSize>10MB</MaxFileSize>
        </triggeringPolicy>
        <encoder class="net.logstash.logback.encoder.LogstashEncoder">
            <includeCallerInfo>true</includeCallerInfo>                
        </encoder>
    </appender>

    <root level="INFO">
        <appender-ref ref="LOGSTASH"/>
    </root>
</included>
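
For reference, with this configuration the LogstashEncoder writes each log event as a single JSON object per line, roughly of the following shape (field names follow the encoder's defaults, plus caller details since includeCallerInfo is enabled; the values here are illustrative):

{
    "@timestamp": "2015-03-01T10:15:30.000+01:00",
    "@version": 1,
    "message": "Started Application in 5.67 seconds",
    "logger_name": "org.springframework.boot.SpringApplication",
    "thread_name": "main",
    "level": "INFO",
    "level_value": 20000
}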

logstash-forwarder.conf

{
    "network": {
        "servers": [
            "logstash:5043"
        ],
        "ssl certificate": "/etc/pki/tls/certs/logstash-forwarder/logstash-forwarder.crt",
        "ssl key": "/etc/pki/tls/private/logstash-forwarder/logstash-forwarder.key",
        "ssl ca": "/etc/pki/tls/certs/logstash-forwarder/logstash-forwarder.crt",
        "timeout": 15
    },
    "files": [
        {
            "paths": [
                "${ENV_SERVICE_LOG}/*.log.json"
            ],
            "fields": {
                "type": "${ENV_SERVICE_NAME}"
            }
        }
    ]
}

logstash.conf

input {
    lumberjack {
        port => 5043

        ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder/logstash-forwarder.crt"
        ssl_key => "/etc/pki/tls/private/logstash-forwarder/logstash-forwarder.key"
    }
}

output {
    elasticsearch { host => "localhost" }
}
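
With no filters in the pipeline, each event stored in Elasticsearch carries the whole JSON line from the file as an opaque string in its message field, alongside the metadata added by logstash-forwarder. A sketch of roughly what such a stored event looks like (the service and host names are hypothetical; all values illustrative):

{
    "@timestamp": "2015-03-01T09:15:30.000Z",
    "@version": "1",
    "type": "my-service",
    "file": "/var/log/my-service/spring.log.json",
    "host": "app-host",
    "message": "{\"@timestamp\":\"2015-03-01T10:15:30.000+01:00\",\"level\":\"INFO\",\"message\":\"Started Application in 5.67 seconds\"}"
}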

Everything works fine; the logs are getting saved in Elasticsearch.

At this point I would like to specify additional fields to be indexed by Elasticsearch, such as the log level. Searching the @message content for the presence of Error or Warn is not very useful.

How can I do this? Which configuration should I alter to make the level appear as an indexed field in Elasticsearch?


Solution

  • What you're looking for is a logstash filter, which would be used on your indexer as a peer to the input and output stanzas.

    There are a ton of filters (see the doc), but you would use grok{} to apply a regexp to your message field and extract the log level.

    You didn't include a sample message, but, given a string like "foo 123 bar", this pattern would extract the "123" into an integer field called loglevel:

    grok {
        match => ["message", "foo %{NUMBER:loglevel:int} bar"]
    }
    

    There's a decent amount of information on writing grok patterns on the web. Try this one.
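
    In terms of the logstash.conf above, the filter block sits between the existing input and output blocks. One more note: since the lines shipped here are already JSON (they come from the LogstashEncoder), a json{} filter is an alternative worth considering; it parses the whole event and exposes the encoder's level field directly, with no regexp needed. A minimal sketch, assuming the raw JSON line arrives in the message field:

    filter {
        # parse the JSON produced by LogstashEncoder
        # (assumes the raw line arrives in "message")
        json {
            source => "message"
        }
    }

    Either way, once the filter has run, level becomes a distinct field on the event, and Elasticsearch indexes it as such.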