My log file contains entries with this pattern:
[Sun Oct 30 17:16:09 2016] [TRACE_HIGH] [TEST1] MessageTest1
[Sun Oct 30 17:16:10 2016] [TRACE_HIGH] [TEST2] MessageTest2
Pattern:
\A\[%{HTTPDERROR_DATE}](?<message>(.|\r|\n)*)
Filter:
filter {
  if [type] == "mycustomlog" {
    grok {
      match => { "message" => "\A\[%{HTTPDERROR_DATE}](?<message>(.|\r|\n)*)" }
    }
    date {
      # Format: Wed Jan 13 11:50:44.327650 2016 (GROK: HTTPDERROR_DATE)
      match => [ "timestamp", "EEE MMM dd HH:mm:ss yyyy" ]
    }
    multiline {
      pattern => "^%{SYSLOG5424SD}%{SPACE}"
      what => "previous"
      negate => true
    }
  }
}
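As a side note, it is easy to verify outside Logstash that the date string in these log lines is parseable with the Joda pattern "EEE MMM dd HH:mm:ss yyyy". Below is a minimal Python sketch; the regex is a simplified stand-in for grok's HTTPDERROR_DATE (my assumption, not the exact grok definition), and %a %b %d %H:%M:%S %Y is the strptime equivalent of the Joda pattern:

```python
import re
from datetime import datetime

line = "[Sun Oct 30 17:16:09 2016] [TRACE_HIGH] [TEST1] MessageTest1"

# Simplified stand-in for grok's HTTPDERROR_DATE: "Day Mon DD HH:MM:SS YYYY"
ts_re = re.compile(r"\A\[(\w{3} \w{3} \d{1,2} \d{2}:\d{2}:\d{2} \d{4})\]")

m = ts_re.match(line)
timestamp = m.group(1)  # "Sun Oct 30 17:16:09 2016"

# Joda "EEE MMM dd HH:mm:ss yyyy" corresponds to "%a %b %d %H:%M:%S %Y" in strptime
parsed = datetime.strptime(timestamp, "%a %b %d %H:%M:%S %Y")
print(parsed.isoformat())  # 2016-10-30T17:16:09
```

If the equivalent parse fails here, the date filter's match pattern would need adjusting before anything else.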
I am trying to parse the datetime from my log into the @timestamp field, but I cannot get this format into @timestamp. Why did the date filter not replace the @timestamp value?
My @timestamp is different from the one in the log row:
row[0]
I am following this tutorial:
Using:
Elasticsearch 2.2.x, Logstash 2.2.x, and Kibana 4.4.x
To apply the parsed date to the field you want, you must set the target option so the filter overwrites that field:
target => "@timestamp"
For example:
date {
  match => [ "timestamp", "dd MMM yyyy HH:mm:ss" ]
  target => "@timestamp"
  locale => "en"
  remove_field => [ "timestamp" ]
}
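One thing to double-check (my observation, not part of the answer above): the match pattern must correspond to the actual format of the timestamp field. The example above uses "dd MMM yyyy HH:mm:ss", while the log lines in the question use "EEE MMM dd HH:mm:ss yyyy". A quick Python sketch of both formats, using what I believe are the strptime equivalents of those Joda patterns:

```python
from datetime import datetime

# Joda "dd MMM yyyy HH:mm:ss" -> strptime "%d %b %Y %H:%M:%S" (assumed mapping)
a = datetime.strptime("30 Oct 2016 17:16:09", "%d %b %Y %H:%M:%S")

# Joda "EEE MMM dd HH:mm:ss yyyy" -> strptime "%a %b %d %H:%M:%S %Y",
# matching the question's log lines
b = datetime.strptime("Sun Oct 30 17:16:09 2016", "%a %b %d %H:%M:%S %Y")

print(a == b)  # True: both represent the same instant
```

Either pattern works as long as it matches the string actually stored in the timestamp field.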