I am trying to parse custom log messages that also contain error stack traces spanning multiple lines. My grok pattern fails to parse when the message has a multiline stack trace, and all I see in the Elasticsearch index is the first line of the message. Strangely, if I test the pattern in a tool like the Grok Debugger, it works for the multiline case as well. What am I missing in the Logstash config?
Following is the snippet of my grok pattern in Logstash:
grok {
  match => [
    "message", "%{TIMESTAMP_ISO8601:timestamp} \[%{SPACE}%{DATA:loglevel}\] %{DATA:class} \[%{DATA:operation}\] \(user=%{DATA:userid}\) (?m)%{GREEDYDATA:stacktrace}"
  ]
}
Sample message that gets parsed:
2018-01-09 21:38:21,414 [ INFO] abc.xyz.def:444: [Put] [Protect] (user=xyz) Random Message
Message that does not get parsed:
2018-01-09 21:38:21,415 [ ERROR] abc.xyz.def:41: [Error] (user=xyz) Unhandled exception encountered...
Traceback (most recent call last):
File "/usr/local/lib/abc/xyz.py", line 113, in some_requestrv = self.dispatch_request()
The pattern works in the Grok Debugger because you paste the whole stack trace in as a single string, but the Logstash file input emits one event per line, so the traceback lines never reach your grok filter as part of the same message. You can indeed use the multiline codec to join them back into one event; in your case:
input {
  file {
    path => "/var/log/someapp.log"
    codec => multiline {
      # Grok pattern names are valid here! :)
      pattern => "^%{TIMESTAMP_ISO8601} "
      negate => true
      what => "previous"
    }
  }
}
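For completeness, here is a minimal sketch of how the codec and your grok filter fit together in one pipeline. The Elasticsearch output, host, and index name below are placeholder assumptions for illustration, not part of your original config:

input {
  file {
    path => "/var/log/someapp.log"
    codec => multiline {
      # Any line that does NOT start with a timestamp is joined
      # onto the previous event, so a full stack trace arrives
      # as a single multi-line message.
      pattern => "^%{TIMESTAMP_ISO8601} "
      negate => true
      what => "previous"
    }
  }
}

filter {
  grok {
    # (?m) lets GREEDYDATA match across the newlines that the
    # multiline codec preserved in the joined event.
    match => [
      "message", "%{TIMESTAMP_ISO8601:timestamp} \[%{SPACE}%{DATA:loglevel}\] %{DATA:class} \[%{DATA:operation}\] \(user=%{DATA:userid}\) (?m)%{GREEDYDATA:stacktrace}"
    ]
  }
}

output {
  # Placeholder destination; point this at your own cluster and index.
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "someapp-logs"
  }
}

With this in place, the grok filter sees the entire stack trace as one message field, which is exactly the situation you were testing in the Grok Debugger.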