Tags: elasticsearch, logstash, logstash-jdbc

SQL Server Sync with Elasticsearch through Logstash - retransfer does not happen


Background: We are doing a POC of syncing SQL Server error-log data to Elasticsearch (ES) to build a dashboard in Kibana. I used Logstash with the JDBC input plugin to move the SQL Server table data to ES, and it succeeded. The log table had around 5000 records, and each one was moved to ES.

Problem Statement: For testing, I deleted the index from ES which was earlier synced by Logstash, and I ran Logstash again with the same input config file. But no records were moved. If I add a new record to the SQL Server table, it is reflected, but the older records (5000) are not re-synced.

Config: Below is the config file used to sync.

input {
  jdbc {
    # https://www.elastic.co/guide/en/logstash/current/plugins-inputs-jdbc.html#plugins-inputs-jdbc-record_last_run
    jdbc_connection_string => "jdbc:sqlserver://localhost:40020;database=application;user=development;password=XXX$"
    jdbc_user => nil
    # The path to the downloaded JDBC driver
    jdbc_driver_library => "C:\Program Files (x86)\sqljdbc6.2\enu\sqljdbc4-3.0.jar"
    # The name of the driver class for SQL Server
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    schedule => "* * * * *"
    last_run_metadata_path => "C:\Software\ElasticSearch\logstash-6.4.0\.logstash_jdbc_last_run"
    record_last_run => true
    clean_run => true
    # Query for testing purposes
    statement => "select * from Log"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "application_log"
    # document_id must be a unique id; it has to be provided during sync, or we may get duplicate entries in the Elasticsearch index.
    document_id => "%{Id}"
  }
}
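
For reference, `record_last_run` and `clean_run` only manage the stored `:sql_last_value` in the metadata file; with a plain `select * from Log` statement, that value is never used, so every scheduled run fetches all rows regardless. A minimal sketch of an incremental statement that would use the stored value, assuming the Log table has a numeric Id column (the other settings stay as above):

    input {
      jdbc {
        ...
        use_column_value => true
        tracking_column => "id"
        tracking_column_type => "numeric"
        statement => "select * from Log where Id > :sql_last_value"
      }
    }

With this, `:sql_last_value` is replaced by the highest Id seen so far, so each run fetches only new rows.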

Please help me out and explain what went wrong.

Logstash version: 6.4.0, Elasticsearch version: 6.3.1

Thanks in advance


Solution

  • I found and resolved this issue.

    The problem is that field names are case sensitive: the plugin only accepts them in lowercase.

    Below is the change I made, and it works fine for me.

    output {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "application_log"
        # document_id must be a unique id; it has to be provided during sync, or we may get duplicate entries in the Elasticsearch index.
        document_id => "%{id}"
      }
    }
    

    No change was needed in the input section.
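
    The lowercasing happens because the JDBC input plugin converts column names to lowercase by default (its `lowercase_column_names` option), so a SQL column named `Id` arrives in the event as `id`. A sketch of the two ways to handle this, reusing the same hosts and index as above:

    # Option 1: reference the lowercased field name
    output {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "application_log"
        document_id => "%{id}"
      }
    }

    # Option 2: keep the original column casing in the input,
    # so %{Id} would resolve as written
    input {
      jdbc {
        ...
        lowercase_column_names => false
      }
    }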

    Thanks for your support.