I am trying to insert this entry into Elasticsearch using Logstash:
2016-05-18 00:14:30,915 DEBUG http-bio-/158.134.18.57-8200-exec-1, HTTPReport - Saved report job 1000 for report
2016-05-18 00:14:30,937 DEBUG http-bio-/158.134.18.57-8200-exec-1, JavaReport -
************************************************************************************************
Report Job information
Job ID : 12000
Job name : 101
Job priority : 1
Job group : BACKGROUND
Report : Month End
2016-05-18 00:17:38,868 DEBUG JobsMaintenanceScheduler_Worker-1, DailyReport - System information: available processors = 12; memory status : 2638 MB of 4096 MB
I have this configuration in the Logstash conf file:
input {
  file {
    path => "/data/*.log"
    type => "app_log"
    start_position => "beginning"
  }
}

filter {
  multiline {
    pattern => "(([\s]+)20[0-9]{2}-)|20[0-9]{2}-"
    negate => true
    what => "previous"
  }
  if [type] == "app_log" {
    grok {
      patterns_dir => ["/pattern"]
      match => {"message" => "%{TIMESTAMP_ISO8601:timestamp},%{NUMBER:Num_field} %{WORD:error_level} %{GREEDYDATA:origin}, %{WORD:logger} - %{GREEDYDATA:event%}"}
    }
  }
  mutate { add_field => {"type" => "app_log"} }
  mutate { add_field => {"machine_name" => "server101"} }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "app_log-%{+YYYY.MM.dd}"
    manage_template => false
  }
}
I am getting this error:
translation missing: en.logstash.runner.configuration.file-not-found {:level=>:error}
I am not able to insert it. Any ideas what might be wrong?
Upgrade to the latest version of Logstash (2.3.2 at the time of writing), fix your grok filter as shown below, and it will work:
grok {
  add_field => {"machine_name" => "server010"}
  match => {"message" => "%{TIMESTAMP_ISO8601:timestamp} %{WORD:error_level} %{DATA:origin}, %{DATA:logger_name} - %{GREEDYDATA:EVENT}"}
}
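For reference, here is a minimal sketch of how the corrected grok could sit in the filter section, assuming the input and output sections from your question stay as they are. The patterns_dir option is dropped because TIMESTAMP_ISO8601, WORD, DATA and GREEDYDATA are all patterns that ship with Logstash, the machine_name field is now added by the grok filter itself, and type is already set on the file input:

filter {
  # Fold the multi-line report block into the preceding event
  # (same multiline settings as in your question).
  multiline {
    pattern => "(([\s]+)20[0-9]{2}-)|20[0-9]{2}-"
    negate => true
    what => "previous"
  }
  if [type] == "app_log" {
    # No patterns_dir needed: only patterns bundled with Logstash are used.
    grok {
      add_field => {"machine_name" => "server010"}
      match => {"message" => "%{TIMESTAMP_ISO8601:timestamp} %{WORD:error_level} %{DATA:origin}, %{DATA:logger_name} - %{GREEDYDATA:EVENT}"}
    }
  }
}

Note that the multiline filter is deprecated in recent Logstash versions in favour of the multiline codec on the input, so after upgrading you may need to move that logic into the file input.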
UPDATE