I'm facing a timestamp issue in ELK. When I import an old log file, the @timestamp field is set to the current date and time instead of the event's own time. I need to set @timestamp from the custom datetime field in the log. Below is a sample log line.
{ "datetime":"2021-08-24 04:13:39,167", "servername":"vm-ws", "serverip":"(null)", "process":"4656", "thread":"4", "level":"DEBUG", "appname":"AcManager", "page":"Program.cs ","method":"ExecuteAsync","line":"63","message":"Starting AcMa Module","otherinfo":{"token":"null","clientip":"null","clientbrowserversion":"null","clienttype":"null"},"moreinfo":"null"}
I have used a grok filter with the Logstash configuration below:
input {
  stdin {
    type => "stdin-type"
  }
  file {
    type => "json"
    path => [ "/home/testuser/mylogs/*.log", "/home/testuser/mylogs/*/*.log" ]
    start_position => "beginning"
  }
}
filter {
  date {
    match => ["datetime", "yyyy-MM-dd HH:mm:ss"]
    target => ["@timestamp"]
  }
  # Step 1. Extract the JSON string, put it in a temporary field called "payload_raw"
  # Docs: https://www.elastic.co/guide/en/logstash/current/plugins-filters-grok.html
  grok {
    match => {
      "message" => [ "%{JSON:payload_raw}" ]
    }
    pattern_definitions => {
      "JSON" => "{.*$"
    }
  }
  # Step 2. Parse the temporary "payload_raw" field, put the parsed data in a field called "payload"
  # Docs: https://www.elastic.co/guide/en/logstash/current/plugins-filters-json.html
  json {
    source => "payload_raw"
    target => "payload"
  }
  # Step 3. Remove the temporary "payload_raw" field (and other fields)
  # Docs: https://www.elastic.co/guide/en/logstash/current/plugins-filters-mutate.html
  mutate {
    remove_field => [ "payload_raw", "message" ]
  }
  # Tried this but not working
  # date {
  #   match => [ "datetime", "yyyy-MM-dd HH:mm:ss" ]
  #   target => "@timestamp"
  # }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => "localhost:9200"
  }
}
Since your "datetime" field is nested inside the "payload" field (the target of your json filter), the date filter has to run after the json filter and reference the field like this, with ,SSS added to match the milliseconds:
date {
  match => [ "[payload][datetime]", "yyyy-MM-dd HH:mm:ss,SSS" ]
  target => "@timestamp"
}
This should be the correct pipeline:
input {
  stdin {
    type => "stdin-type"
  }
  file {
    type => "json"
    path => [ "/home/testuser/mylogs/*.log", "/home/testuser/mylogs/*/*.log" ]
    start_position => "beginning"
  }
}
filter {
  # Step 1. Extract the JSON string, put it in a temporary field called "payload_raw"
  # Docs: https://www.elastic.co/guide/en/logstash/current/plugins-filters-grok.html
  grok {
    match => {
      "message" => [ "%{JSON:payload_raw}" ]
    }
    pattern_definitions => {
      "JSON" => "{.*$"
    }
  }
  # Step 2. Parse the temporary "payload_raw" field, put the parsed data in a field called "payload"
  # Docs: https://www.elastic.co/guide/en/logstash/current/plugins-filters-json.html
  json {
    source => "payload_raw"
    target => "payload"
  }
  # Step 3. Remove the temporary "payload_raw" field (and other fields)
  # Docs: https://www.elastic.co/guide/en/logstash/current/plugins-filters-mutate.html
  mutate {
    remove_field => [ "payload_raw", "message" ]
  }
  # Step 4. Set @timestamp from the nested datetime field (must run after the json filter above)
  # Docs: https://www.elastic.co/guide/en/logstash/current/plugins-filters-date.html
  date {
    match => [ "[payload][datetime]", "yyyy-MM-dd HH:mm:ss,SSS" ]
    target => "@timestamp"
  }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => "localhost:9200"
  }
}
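To sanity-check the pipeline before pointing it at the real files, you can pipe your sample line through the stdin input and watch the rubydebug output. A minimal test, assuming the pipeline is saved as pipeline.conf (an assumed name) and you run it from the Logstash install directory:
echo '{ "datetime":"2021-08-24 04:13:39,167", "servername":"vm-ws", "serverip":"(null)", "process":"4656", "thread":"4", "level":"DEBUG", "appname":"AcManager", "page":"Program.cs ","method":"ExecuteAsync","line":"63","message":"Starting AcMa Module","otherinfo":{"token":"null","clientip":"null","clientbrowserversion":"null","clienttype":"null"},"moreinfo":"null"}' | bin/logstash -f pipeline.conf
If parsing succeeds, the printed event's @timestamp should reflect 2021-08-24 04:13:39.167 rather than the import time. Note that with no timezone option set, the date filter interprets the value in the Logstash host's local timezone.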