I have used the zip distributions to start Logstash, Kibana, and Elasticsearch. I am ingesting a CSV file into Elasticsearch through Logstash.
input {
  file {
    path => "D:\tls202_part01\tls202_part01.csv"
    start_position => "beginning"
  }
}

filter {
  csv {
    separator => ","
    columns => ["appln_id", "appln_title_lg", "appln_title"]
  }
  mutate {
    convert => {
      "appln_id"       => "integer"
      "appln_title_lg" => "string"
      "appln_title"    => "string"
    }
  }
}

output {
  elasticsearch {
    hosts => "localhost"
    index => "title"
  }
  stdout {
    codec => rubydebug
  }
}
This is my config file. When I search for the "title" index, it does not exist, and the Logstash logs are:
Sending Logstash logs to D:/logstash-6.5.4/logs which is now configured via log4j2.properties
[2018-12-26T10:22:35,672][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-12-26T10:22:35,699][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.5.4"}
[2018-12-26T10:22:41,588][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-12-26T10:22:42,051][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-12-26T10:22:42,297][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-12-26T10:22:42,370][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-12-26T10:22:42,376][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-12-26T10:22:42,417][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost"]}
[2018-12-26T10:22:42,439][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-12-26T10:22:42,473][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-12-26T10:22:43,330][INFO ][logstash.inputs.file ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"D:/logstash-6.5.4/data/plugins/inputs/file/.sincedb_bb5ff7ebd070422c5b611ac87e9e7087", :path=>["D:\\tls202_part01\\tls202_part01.csv"]}
[2018-12-26T10:22:43,390][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x389cc614 run>"}
[2018-12-26T10:22:43,499][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2018-12-26T10:22:43,532][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2018-12-26T10:22:43,842][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
The CSV file is large, about 2 GB of data. Also, Kibana shows no Elasticsearch data found when I try to create an index pattern.
It seems that Logstash did not find your file. Change the path from backslashes to forward slashes and see if it works:
path => "D:/tls202_part01/tls202_part01.csv"
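Putting that together, the input section might look like the sketch below. Note that your log already shows a generated sincedb file (`.sincedb_bb5ff7...`), which records how far Logstash has read; if an earlier run touched the file, Logstash may treat it as already read on restart. Setting `sincedb_path` to the Windows null device `NUL` is a common testing aid that discards that state (this addition is a suggestion, not part of your original config):

```
input {
  file {
    # Forward slashes work on Windows and avoid backslash-escaping issues
    path => "D:/tls202_part01/tls202_part01.csv"
    start_position => "beginning"
    # Testing only: discard read-position state so the file is
    # re-read from the beginning on every Logstash restart
    sincedb_path => "NUL"
  }
}
```

After restarting Logstash, the "title" index should show up in the output of GET _cat/indices, and Kibana should then let you create an index pattern for it.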