elasticsearch · logstash · logstash-jdbc

Logstash - Got error as An unknown error occurred sending a bulk request to Elasticsearch


I am trying to move SQL Server table records to Elasticsearch via Logstash; it is basically a synchronization job. However, Logstash fails with an unknown error. My configuration file and the error log are below.

Configuration:

input {
  jdbc {
    #https://www.elastic.co/guide/en/logstash/current/plugins-inputs-jdbc.html#plugins-inputs-jdbc-record_last_run
    jdbc_connection_string => "jdbc:sqlserver://localhost-serverdb;database=Application;user=dev;password=system23$"
    jdbc_user => nil
    # The path to our downloaded JDBC driver
    jdbc_driver_library => "C:\Program Files (x86)\sqljdbc6.2\enu\sqljdbc4-3.0.jar"
    # The name of the driver class for SQL Server
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"

    # Executes every minute.
    schedule => "* * * * *"
    # Executes at minute 0 of every hour.
    #schedule => "0 * * * *"

    last_run_metadata_path => "C:\Software\ElasticSearch\logstash-6.4.0\.logstash_jdbc_last_run"
    #record_last_run => false
    #clean_run => true

    # Query for testing purpose 
    statement => "Select * from tbl_UserDetails"    
  }
}

output {
  elasticsearch {
    hosts => ["10.187.144.113:9200"]
    index => "tbl_UserDetails"
    # document_id must be unique; it has to be provided during sync, else we may get duplicate entries in the Elasticsearch index.
    document_id => "%{Login_User_Id}"
  }
}

Error Log:

[2018-09-18T21:04:32,171][ERROR][logstash.outputs.elasticsearch]
An unknown error occurred sending a bulk request to Elasticsearch. We will retry indefinitely {
:error_message=>"\"\\xF0\" from ASCII-8BIT to UTF-8",
:error_class=>"LogStash::Json::GeneratorError",
:backtrace=>[
"C:/Software/ElasticSearch/logstash-6.4.0/logstash-core/lib/logstash/json.rb:27:in `jruby_dump'",
"C:/Software/ElasticSearch/logstash-6.4.0/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:119:in `block in bulk'",
"org/jruby/RubyArray.java:2486:in `map'",
"C:/Software/ElasticSearch/logstash-6.4.0/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:119:in `block in bulk'",
"org/jruby/RubyArray.java:1734:in `each'",
"C:/Software/ElasticSearch/logstash-6.4.0/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:117:in `bulk'",
"C:/Software/ElasticSearch/logstash-6.4.0/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.0-java/lib/logstash/outputs/elasticsearch/common.rb:275:in `safe_bulk'",
"C:/Software/ElasticSearch/logstash-6.4.0/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.0-java/lib/logstash/outputs/elasticsearch/common.rb:180:in `submit'",
"C:/Software/ElasticSearch/logstash-6.4.0/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.0-java/lib/logstash/outputs/elasticsearch/common.rb:148:in `retrying_submit'",
"C:/Software/ElasticSearch/logstash-6.4.0/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.0-java/lib/logstash/outputs/elasticsearch/common.rb:38:in `multi_receive'",
"org/logstash/config/ir/compiler/OutputStrategyExt.java:114:in `multi_receive'",
"org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:97:in `multi_receive'",
"C:/Software/ElasticSearch/logstash-6.4.0/logstash-core/lib/logstash/pipeline.rb:372:in `block in output_batch'",
"org/jruby/RubyHash.java:1343:in `each'",
"C:/Software/ElasticSearch/logstash-6.4.0/logstash-core/lib/logstash/pipeline.rb:371:in `output_batch'",
"C:/Software/ElasticSearch/logstash-6.4.0/logstash-core/lib/logstash/pipeline.rb:323:in `worker_loop'",
"C:/Software/ElasticSearch/logstash-6.4.0/logstash-core/lib/logstash/pipeline.rb:285:in `block in start_workers'"]}

[2018-09-18T21:05:00,140][INFO ][logstash.inputs.jdbc     ] (0.008273s) Select * from tbl_UserDetails

Logstash version: 6.4.0
Elasticsearch version: 6.3.1
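For context, the LogStash::Json::GeneratorError above means the JSON serializer received a raw byte string that is not valid UTF-8 (the lone 0xF0 byte in the message). A rough Python analogue of the failure, using a hypothetical value and assuming a Latin-1-like source encoding purely for illustration:

```python
# Sketch (not Logstash itself): JSON serialization fails when a raw
# byte string contains bytes that are not valid UTF-8, which is what
# the '"\xF0" from ASCII-8BIT to UTF-8' message reports on the JRuby side.
import json

raw = b"user-\xf0-name"  # a lone 0xF0 is not a complete UTF-8 sequence

try:
    raw.decode("utf-8")
except UnicodeDecodeError as e:
    print("decode failed:", e.reason)

# Decoding with an explicit source encoding (e.g. a Latin-1/CP1252
# database collation -- an assumption here) succeeds, and the result
# serializes cleanly:
text = raw.decode("latin-1")
print(json.dumps({"user": text}))
```

This is only an analogue of the error class, not a claim about which column in `tbl_UserDetails` carried the offending bytes.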

Thanks in advance.


Solution

  • The above issue is resolved.

    Thanks for your support, guys.

    The change I made was to add the two properties below under input -> jdbc:

    input {
      jdbc {
        tracking_column => "login_user_id"
        use_column_value => true
      }
    }
    

    and under output -> elasticsearch I changed the two properties below:

    output {
      elasticsearch {
        document_id => "%{login_user_id}"
        document_type => "user_details"
      }
    }
    

    The main takeaway here is that all field references should be in lowercase: the jdbc input lowercases column names by default (lowercase_column_names defaults to true), so the event field is login_user_id, not Login_User_Id.
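To see why the case mismatch mattered: Logstash leaves a `%{field}` reference that does not resolve as the literal placeholder text instead of raising an error. A simplified Python sketch of that substitution behaviour (the real implementation differs):

```python
# Simplified sketch of Logstash's %{field} sprintf behaviour: a
# referenced field that does not exist in the event is left as the
# literal placeholder, so a case mismatch like %{Login_User_Id} vs.
# the lowercased column login_user_id goes unnoticed until documents
# collide on the same literal document_id or serialization fails.
import re

def sprintf(template: str, event: dict) -> str:
    # Replace %{name} with the event value when present,
    # otherwise keep the placeholder verbatim.
    return re.sub(
        r"%\{([^}]+)\}",
        lambda m: str(event.get(m.group(1), m.group(0))),
        template,
    )

event = {"login_user_id": 42}              # jdbc input lowercases column names
print(sprintf("%{Login_User_Id}", event))  # -> %{Login_User_Id} (literal!)
print(sprintf("%{login_user_id}", event))  # -> 42
```

With the mismatched reference, every row would have received the same literal document_id, which is exactly the kind of silent failure the lowercase fix removes.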