I'm new to the ELK stack and am starting out by pushing the IIS logs on my Windows Server 2012 application server to Elasticsearch using Filebeat and Logstash.
I plan on extending this so that it also pushes up custom application logs which are written by our applications (as opposed to IIS). To do this I'll need to differentiate one type of log from another in Logstash.
So in filebeat.yml I have added a custom field called "log_type":
- type: log
  enabled: true
  paths:
    - C:\inetpub\logs\LogFiles\*\*
  fields:
    log_type: iis
In my Logstash configuration I'm trying to perform some conditional logic based on the value of "log_type", but it isn't working. If I remove the conditional logic, the filter works.
filter {
  if [fields.log_type] == "iis" {
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:log_timestamp} %{IPORHOST:site} %{WORD:method} %{URIPATH:page} %{NOTSPACE:querystring} %{NUMBER:port} %{NOTSPACE:username} %{IPORHOST:clienthost} %{NOTSPACE:useragent} %{NOTSPACE:referer} %{NUMBER:response} %{NUMBER:subresponse} %{NUMBER:scstatus} %{NUMBER:timetaken:int}" }
    }
    date {
      match => [ "log_timestamp", "ISO8601" ]
      target => "@timestamp"
    }
  }
}
I've searched and searched but can't find out how to do this. Would really appreciate some help.
Custom fields added under fields in Filebeat arrive as a nested object, so in Logstash the field reference syntax is [fields][log_type] rather than [fields.log_type].
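Applied to the filter from the question, the conditional would look like this (the grok and date blocks are unchanged, only the field reference differs):

    filter {
      if [fields][log_type] == "iis" {
        grok {
          match => { "message" => "%{TIMESTAMP_ISO8601:log_timestamp} %{IPORHOST:site} %{WORD:method} %{URIPATH:page} %{NOTSPACE:querystring} %{NUMBER:port} %{NOTSPACE:username} %{IPORHOST:clienthost} %{NOTSPACE:useragent} %{NOTSPACE:referer} %{NUMBER:response} %{NUMBER:subresponse} %{NUMBER:scstatus} %{NUMBER:timetaken:int}" }
        }
        date {
          match => [ "log_timestamp", "ISO8601" ]
          target => "@timestamp"
        }
      }
    }

As an aside, if you set fields_under_root: true on the Filebeat input, the custom field is placed at the top level of the event instead, and the condition would then be written as if [log_type] == "iis".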