I read the articles below to understand the Logstash technology and set up an ELK environment: https://tpodolak.com/blog/tag/kibana/

My logstash.conf:
input {
  file {
    path => ["C:/logs/*.log"]
    start_position => beginning
    ignore_older => 0
  }
}
filter {
  grok {
    match => { "message" => "TimeStamp=%{TIMESTAMP_ISO8601:logdate} CorrelationId=%{UUID:correlationId} Level=%{LOGLEVEL:logLevel} Message=%{GREEDYDATA:logMessage}" }
  }
  # set the event timestamp from the log
  # https://www.elastic.co/guide/en/logstash/current/plugins-filters-date.html
  date {
    match => [ "logdate", "yyyy-MM-dd HH:mm:ss.SSSS" ]
    target => "@timestamp"
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
  }
  stdout {}
}
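For reference, I validated the config before starting the pipeline (a sketch; the install and config paths below are from my setup, adjust as needed):

REM hypothetical paths from my install
C:\monitoring\logstash\bin\logstash.bat -f C:\monitoring\logstash\logstash.conf --config.test_and_exit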
I added the input path C:/logs/*.log in logstash.conf. I have a test.log file that is not empty; it contains:
TimeStamp=2016-07-20 21:22:46.0079 CorrelationId=dc665fe7-9734-456a-92ba-3e1b522f5fd4 Level=INFO Message=About
TimeStamp=2016-07-20 21:22:46.0079 CorrelationId=dc665fe7-9734-456a-92ba-3e1b522f5fd4 Level=INFO Message=About
TimeStamp=2016-11-01 00:13:01.1669 CorrelationId=77530786-8e6b-45c2-bbc1-31837d911c14 Level=INFO Message=Request completed with status code: 200
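Given the grok pattern in my filter, the first line should be split into roughly these fields (my expectation, not actual output):

logdate       => 2016-07-20 21:22:46.0079
correlationId => dc665fe7-9734-456a-92ba-3e1b522f5fd4
logLevel      => INFO
logMessage    => About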
According to the article above, I should see my logs inside Elasticsearch (see the sample result at https://tpodolak.com/blog/tag/kibana/). But when I open http://localhost:9200/_cat/indices?v in my browser, I cannot see any Logstash-created indices in Elasticsearch. Where are the Logstash logs stored in Elasticsearch? My logstash.conf looks OK to me, but I get no result. In short, I want to get all logs from C:/logs/*.log into Elasticsearch via Logstash. What is the error in my logstash.conf?
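With the default settings of the elasticsearch output, I would expect events to land in daily indices named logstash-YYYY.MM.dd, so I also checked from the command line (a sketch, assuming a default local setup):

curl "http://localhost:9200/_cat/indices?v"
curl "http://localhost:9200/logstash-*/_search?pretty"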
My Logstash logs (C:\monitoring\logstash\logs.log):
[2017-03-13T10:47:17,849][INFO ][logstash.runner ] Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash
[2017-03-13T11:46:35,123][INFO ][logstash.runner ] Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash
[2017-03-13T11:48:20,023][INFO ][logstash.runner ] Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash
[2017-03-13T11:55:10,808][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2017-03-13T11:55:10,871][INFO ][logstash.pipeline ] Pipeline main started
[2017-03-13T11:55:11,316][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2017-03-13T12:00:52,188][INFO ][logstash.runner ] Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash
[2017-03-13T12:02:48,309][INFO ][logstash.runner ] Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash
[2017-03-13T12:06:33,270][ERROR][logstash.agent ] Cannot load an invalid configuration {:reason=>"Expected one of #, => at line 1, column 52 (byte 52) after output { elasticsearch { hosts "}
[2017-03-13T12:08:51,636][ERROR][logstash.agent ] Cannot load an invalid configuration {:reason=>"Expected one of #, => at line 1, column 22 (byte 22) after input { file { path "}
[2017-03-13T12:09:48,114][ERROR][logstash.agent ] Cannot load an invalid configuration {:reason=>"Expected one of #, => at line 1, column 22 (byte 22) after input { file { path "}
[2017-03-13T12:11:40,200][ERROR][logstash.agent ] Cannot load an invalid configuration {:reason=>"Expected one of #, => at line 1, column 22 (byte 22) after input { file { path "}
[2017-03-13T12:19:17,622][INFO ][logstash.runner ] Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash
First of all, you have a few configuration issues:

- hosts expects an array of hosts, i.e. hosts => ["myHost:myPort"] (see the docs)
- you need to set sincedb_path, as Logstash will not parse again a file it has already parsed: it checks a .sincedb file (by default located at $HOME/.sincedb) to see whether it already read the file, so you need to delete that file between runs when you test with the same log file.

That's why, after a bit of research (actually a lot, not being a Windows user), I came up with this config that works:
input {
  file {
    path => "C:/some/log/dir/*"
    start_position => beginning
    ignore_older => 0
    sincedb_path => "NIL" # relative path: the sincedb file is created in the directory Logstash is started from, so it is easy to delete between test runs
  }
}
filter {
  grok {
    match => { "message" => "TimeStamp=%{TIMESTAMP_ISO8601:logdate} CorrelationId=%{UUID:correlationId} Level=%{LOGLEVEL:logLevel} Message=%{GREEDYDATA:logMessage}" }
  }
  # set the event timestamp from the log; the grok above captures it as "logdate",
  # and the sample logs have four fractional-second digits, hence SSSS
  # https://www.elastic.co/guide/en/logstash/current/plugins-filters-date.html
  date {
    match => [ "logdate", "yyyy-MM-dd HH:mm:ss.SSSS" ]
    target => "@timestamp"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  stdout {}
}
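When you re-test with the same log file, delete the sincedb file first (with the sincedb_path above it is the file named NIL in the directory you started Logstash from) so the file is read again from the beginning, then verify that the default logstash-YYYY.MM.dd index shows up, e.g.:

REM run from the directory where Logstash was started
del NIL
curl "http://localhost:9200/_cat/indices?v"
curl "http://localhost:9200/logstash-*/_search?pretty"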