I have the following message in my log file...
2015-05-08 12:00:00,648064070: INFO : [pool-4-thread-1] com.jobs.AutomatedJob: Found 0 suggested order events
This is what I see in Logstash/Kibana (with the Date and Message selected)...
May 8th 2015, 12:16:19.691 2015-05-08 12:00:00,648064070: INFO : [pool-4-thread-1] com.pcmsgroup.v21.star2.application.maintenance.jobs.AutomatedSuggestedOrderingScheduledJob: Found 0 suggested order events
The date on the left in Kibana is the insertion date. (May 8th 2015, 12:16:19.691)
The next date is from the log statement (2015-05-08 12:00:00,648064070)
Next is the INFO level of logging.
Then finally the message.
I'd like to split these into their components, so that the log level becomes its own field in Kibana, and to either remove the date from the message section or make it the actual event date (instead of the insertion date).
Can someone help me out, please? I presume I need a grok filter?
This is what I have so far...
input {
  file {
    debug => true
    path => "C:/office-log*"
    sincedb_path => "c:/tools/logstash-1.4.2/.sincedb"
    sincedb_write_interval => 1
    start_position => "beginning"
    tags => ["product_qa"]
    type => "log4j"
  }
}
filter {
  grok {
    match => [ "message", "%{TIMESTAMP_ISO8601}: %{LOGLEVEL}" ]
  }
}
output {
  elasticsearch {
    protocol => "http"
    host => "0.0.0.x"
  }
}
This grok filter doesn't seem to change the events shown in Kibana; I still only see host, path, type, etc.
I've been using http://grokdebug.herokuapp.com/ to work out my grok syntax.
You will need to name the results you get back from grok and then use the date filter to set @timestamp, so that the logged time is used instead of the insertion time.
Based on what you have so far, you'd do this:
filter {
  grok {
    match => [ "message", "%{TIMESTAMP_ISO8601:logdate}: %{LOGLEVEL:loglevel} (?<logmessage>.*)" ]
  }
  date {
    match => [ "logdate", "ISO8601" ]
  }
  # logdate has now been parsed into @timestamp; drop the original message and logdate fields
  mutate {
    remove_field => [ "message", "logdate" ]
  }
}
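One thing to watch: your log timestamp carries nine fractional-second digits (",648064070"), while the date filter (Joda-based in Logstash 1.4.x) only keeps millisecond precision, so the parsed @timestamp will be truncated. If the ISO8601 shorthand doesn't match your format, you can list an explicit Joda-style pattern as a fallback (a sketch, untested against your version; adjust the number of S digits to your format):

date {
  # try ISO8601 first, then an explicit pattern with comma-separated fractional seconds
  match => [ "logdate", "ISO8601", "yyyy-MM-dd HH:mm:ss,SSSSSSSSS" ]
}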