Push CSV file from client PC to Elasticsearch on server side
Elasticsearch has been installed and runs nicely. I can access it from my PC and use the demo data. Now I would like to learn how to push my own data into it; I've prepared a dataset from Kaggle.
I've downloaded Filebeat on the client side and extracted it, then edited filebeat.yml as follows:
filebeat.inputs:
- input_type: log
  paths:
    - C:\Users\Charles\Desktop\DATA\BrentOilPrices.csv
  document_type: test_log_csv

output.logstash:
  hosts: ["10.64.2.246:5044"]
I also tested it with
./filebeat test config
and it returned: Config OK
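For reference, the config test only validates the YAML; the connection to the Logstash host can be checked separately with:
./filebeat test output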
I edited logstash.conf as follows:
input {
  beats {
    port => 5044
  }
}

filter {
  if "test_log_csv" in [type] {
    csv {
      columns => ["Date","Price"]
      separator => ","
    }
    mutate {
      convert => ["Price","integer"]
    }
    date {
      match => ["Date","d/MMM/yy"]
    }
  }
}
output {
  if "test_log_csv" in [type] {
    elasticsearch {
      hosts => "127.0.0.1:9200"
      index => "test_log_csv%{+d/MM/yy}"
    }
  }
}
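On the server side, the pipeline file itself can be checked for syntax errors without starting Logstash (assuming the command is run from the Logstash install directory):
bin/logstash -f logstash.conf --config.test_and_exit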
I ran
Start-Service filebeat
and it returned nothing. I checked Kibana and there are no logs. What did I miss?
filebeat.inputs:
- input_type: log
  paths:
    - 'C:\Users\Charles\Desktop\DATA\BrentOilPrices.csv'
  fields:
    document_type: test_log_csv

output.logstash:
  hosts: ["10.64.2.246:5044"]
The document_type option was removed from Filebeat in version 6.X, so the type field is not created anymore. Since your conditionals are based on this field, your pipeline will not work. Also, you should use forward slashes (/) even on Windows.
Try changing your config to the one below and test again.
filebeat.inputs:
- input_type: log
  paths:
    - 'C:/Users/Charles/Desktop/DATA/BrentOilPrices.csv'
  fields:
    type: test_log_csv
  fields_under_root: true

output.logstash:
  hosts: ["10.64.2.246:5044"]
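After saving filebeat.yml, restart the service so the new config is picked up, e.g. in PowerShell:
Restart-Service filebeat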
The option fields_under_root: true will create the field type in the root of your document. If you remove this option, the field will be created as [fields][type] and you will need to change your conditionals to use that field.
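For example, without fields_under_root: true the conditionals would have to reference the nested field instead; only this line changes, in both the filter and output blocks:
if "test_log_csv" in [fields][type]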