I am stuck in the middle of an ELK Stack configuration; any lead would be highly appreciated.
Case study: I am able to see the logs (parsed through Logstash without any filter), but I want to apply filters while parsing them. For example: system.process.cmdline: "C:\example1\example.exe" -displayname "example.run" -servicename "example.run"
I can see the above log line in the Kibana dashboard, but I only want the -servicename key and its value. Expected output in Kibana: servicename as the key and example.run as its associated value.
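To illustrate, the parsed event I am hoping to see in Kibana would carry the value in its own field, roughly like this (the field name below is just what I would like, not something I currently have):

{
  "servicename": "example.run"
}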
I am a newbie in ELK, so please help me out.
My environment: Elasticsearch 6.6, Kibana 6.6, Logstash 6.6, Filebeat 6.6, Metricbeat 6.6. Logs are coming from Windows Server 2016.
input {
  beats {
    port => "5044"
  }
}
filter {
  grok {
    # NOTSPACE captures the first run of non-whitespace characters into the hostname field
    match => { "message" => "%{NOTSPACE:hostname} " }
  }
}
output {
  file {
    path => "/var/log/logstash/out.log"
  }
}
I have tried the above Logstash pipeline, but I am not successful in getting the required result. I assume I have to add more lines to the filter, but I don't know what exactly.
Use this in your filter:
grok {
  match => { "message" => "%{GREEDYDATA:ignore}-servicename \"%{DATA:serviceName}\"" }
}
Your service name should now be in the serviceName key.
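For context, here is a minimal sketch of how that grok block slots into the pipeline you already posted (same beats input and file output; only the filter changes):

input {
  beats {
    port => "5044"
  }
}
filter {
  grok {
    # everything before "-servicename" is captured into a throwaway field,
    # and the quoted value after it lands in the serviceName field
    match => { "message" => "%{GREEDYDATA:ignore}-servicename \"%{DATA:serviceName}\"" }
  }
}
output {
  file {
    path => "/var/log/logstash/out.log"
  }
}

If a log line does not match the pattern, Logstash tags that event with _grokparsefailure by default, so you can spot unmatched lines in out.log or in Kibana.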