Basically, I would like Logstash to consume its own logs and populate fields such as @timestamp, level, etc. for use in Kibana.
My current config looks like this:
input {
  file {
    path => "/path/to/logstash/logs/*.log"
    type => "logstash"
  }
}
That seems hard to do without resorting to writing a grok filter. Is it really the case that Logstash can't consume its own logs? (This is hard to Google, and I couldn't find anything.)
Or is this the wrong approach in the first place?
Example log output from LogStash:
{
  :timestamp=>"2014-09-02T10:38:08.798000+0200",
  :message=>"Using milestone 2 input plugin 'file'. This plugin should be stable, but if you see strange behavior, please let us know! For more information on plugin milestones, see http://logstash.net/docs/1.4.2/plugin-milestones",
  :level=>:warn
}
You have to resort to using grok, because Logstash is not able to use rubydebug as an input codec. The logs are in a fixed format, though, so the grok pattern to parse them is straightforward. We then use a date filter to replace @timestamp:
filter {
  grok {
    match => ["message", '{:timestamp=>"(?<timestamp>.*)", :message=>"(?<msg>.*)", :level=>:(?<level>.*)}']
  }
  mutate {
    replace => ["message", "%{msg}"]
    remove_field => ["msg"]
  }
  date {
    match => ["timestamp", "ISO8601"]
    remove_field => ["timestamp"]
  }
}
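With that filter in place, the warning line above should come out roughly like this (a sketch of the resulting event, not exact output; note that @timestamp is stored in UTC, so the +0200 offset is folded in):

{
       "message" => "Using milestone 2 input plugin 'file'. This plugin should be stable, ...",
    "@timestamp" => "2014-09-02T08:38:08.798Z",
         "level" => "warn",
          "type" => "logstash"
}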
The mutate step is needed because if you capture directly into message in the grok match, Logstash will turn message into an array and append the extracted text to it. So capture under a different name (msg here) and then replace message with it.
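For comparison, capturing straight into message like this (a sketch of the pitfall, not a config you want to use):

grok {
  # (?<message>.*) writes into the existing message field
  match => ["message", '{:timestamp=>"(?<timestamp>.*)", :message=>"(?<message>.*)", :level=>:(?<level>.*)}']
}

would leave message as an array holding both the original log line and the extracted text, which is awkward to search and display in Kibana.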