elasticsearch, logstash, logstash-jdbc

Elasticsearch is retaining only the last record from Logstash


Here is the select statement, which pulls transaction data. Every time data is inserted into UserProfile, the old index values get deleted from Elasticsearch:

  input {
    jdbc {
      statement => "SELECT userId,salesTeam FROM UserProfile with (nolock)"
    }
  }
  output {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "q_d"
      document_type => "cd"
      document_id => "%{userId}%"
    }
    stdout { codec => rubydebug }
  }

I want to update existing documents if there is any change, and otherwise index new documents.
What am I doing wrong here?


Solution

    input {
        jdbc {
            # Postgres jdbc connection string to our database, mydb
            jdbc_connection_string => "jdbc:postgresql://localhost:5432/bhavya"
            # The user we wish to execute our statement as
            jdbc_user => "postgres"
            # The path to our downloaded jdbc driver
            jdbc_driver_library => "/root/postgresql-42.2.2.jar"
            # The name of the driver class for Postgresql
            jdbc_driver_class => "org.postgresql.Driver"
            jdbc_password => "postgres"
            jdbc_validate_connection => true
            # You can schedule input from this plugin; the following uses cron syntax
            schedule => "* * * * *"
            # our query
            statement => "SELECT uid,email,first_name,last_name FROM contacts"
        }
    }

    output {
        elasticsearch {
            hosts => ["localhost:9200"]
            index => "contacts"
            document_type => "record"
            document_id => "%{uid}"
        }
        stdout { codec => rubydebug }
    }
    

First, add the options shown above to the input plugin, adjusted for your database. I used PostgreSQL; for another database you would need to download the corresponding JDBC driver jar and point jdbc_driver_library at its path.
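Since your statement uses with (nolock), you are presumably on SQL Server. As a rough sketch, the equivalent input section might look like the following; the connection string, credentials, and driver jar path are placeholders to adapt, not values from your setup:

    input {
        jdbc {
            # SQL Server JDBC connection string (host, port, database are placeholders)
            jdbc_connection_string => "jdbc:sqlserver://localhost:1433;databaseName=mydb"
            jdbc_user => "sa"
            jdbc_password => "your_password"
            # Path to the Microsoft JDBC driver jar you downloaded
            jdbc_driver_library => "/path/to/mssql-jdbc.jar"
            jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
            jdbc_validate_connection => true
            # Run the query every minute (cron syntax)
            schedule => "* * * * *"
            # Your original query
            statement => "SELECT userId,salesTeam FROM UserProfile with (nolock)"
        }
    }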

Second, use the schedule option in the jdbc input plugin so that it reads data from the database periodically.
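Without schedule, the jdbc input runs the statement exactly once and then stops, so later inserts into UserProfile never reach Elasticsearch. The value uses cron-style syntax; for example, inside the jdbc block:

    # run the query every minute
    schedule => "* * * * *"
    # or, as an alternative, every five minutes
    # schedule => "*/5 * * * *"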

Third, remove the extra '%' from the document_id setting in the elasticsearch output: it should be "%{userId}" rather than "%{userId}%".
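Applied to the configuration in the question (index and type names kept as-is), the output section would become:

    output {
        elasticsearch {
            hosts => ["localhost:9200"]
            index => "q_d"
            document_type => "cd"
            # a single sprintf reference, no trailing '%'
            document_id => "%{userId}"
        }
        stdout { codec => rubydebug }
    }

With a stable document_id, re-running the query indexes each row onto the same document, so existing documents are updated in place and new userIds create new documents. One thing worth double-checking: the jdbc input lowercases column names by default (lowercase_column_names defaults to true), so the field may arrive as userid rather than userId; if the sprintf reference does not resolve, every event gets the same literal document_id and only the last record survives, which matches the symptom you describe.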

For importing data into Logstash from a database, refer to this page:

https://www.elastic.co/blog/logstash-jdbc-input-plugin