I want to parse the following JSON logs into Elasticsearch via Logstash. I have tried, but without success: Logstash tags the events with a _jsonparsefailure error, and the output message field shows hexadecimal / different values instead of the data from the logs (screenshot).
LOG:
{
  "rows": [
    [18446738026507172656, "cmd.exe", 1644, 2528, 0, -1, 0, 0, "2017-10-08 10:19:10 UTC+0000", "2017-10-08 10:19:11 UTC+0000"]
  ],
  "columns": ["Offset(V)", "Name", "PID", "PPID", "Thds", "Hnds", "Sess", "Wow64", "Start", "Exit"]
}
CONFIG:
input {
  tcp {
    port => 5043
    codec => "json"
  }
}
filter {
  json {
    source => "message"
  }
}
output {
  elasticsearch {
    hosts => "127.0.0.1:9200"
    index => "logs-%{+YYYY.MM.dd}"
  }
  stdout {
    codec => rubydebug
  }
}
You don't need the json filter here, because the tcp input is already using the json codec, which decodes each incoming event into fields before the filter stage runs; there is no raw JSON left in the message field for the filter to parse. Remove the json filter.
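For reference, a minimal pipeline with the json filter removed might look like this, keeping your original port, hosts, and index pattern:

input {
  tcp {
    port => 5043
    codec => "json"
  }
}
output {
  elasticsearch {
    hosts => "127.0.0.1:9200"
    index => "logs-%{+YYYY.MM.dd}"
  }
  stdout {
    codec => rubydebug
  }
}

The other option would be to drop the codec from the tcp input and keep the json filter on message instead; either way, only one of the two should be doing the JSON decoding.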