I have a table with 20 GB of data and 50 million rows that I need to migrate to Elasticsearch using the Logstash JDBC input plugin. I have tried the basic implementation, but I need help migrating the data in batches, i.e. only 10,000 rows at a time. I am not sure how and where to specify this count, or how to update it the next time I run Logstash. Please help me solve this issue.
This is what I have:
input {
  jdbc {
    jdbc_driver_library => "mysql-connector-java-5.1.12-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost/db"
    jdbc_validate_connection => true
    jdbc_user => "root"
    jdbc_password => "root"
    clean_run => true
    record_last_run => true
    use_column_value => true
    jdbc_paging_enabled => true
    jdbc_page_size => 5
    tracking_column => "id"
    statement => "select * from employee"
  }
}
Thanks in advance.
You need jdbc_paging_enabled set to true in order for pagination to work, and jdbc_page_size controls how many rows each query fetches, so set it to 10000 instead of 5. You also need to change clean_run to false: with clean_run => true the plugin wipes its saved state (the :sql_last_value it records via record_last_run) at the start of every run, so each run would begin again from the first row instead of continuing where the previous one stopped.
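Putting that together, here is a minimal sketch of your input block with those changes applied. The table name employee and the tracking column id are taken from your config; the last_run_metadata_path value is a placeholder you should point at a writable location:

input {
  jdbc {
    jdbc_driver_library => "mysql-connector-java-5.1.12-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost/db"
    jdbc_validate_connection => true
    jdbc_user => "root"
    jdbc_password => "root"
    # false so the saved :sql_last_value survives between runs
    clean_run => false
    record_last_run => true
    # placeholder path: where the last seen id is persisted
    last_run_metadata_path => "/path/to/.logstash_jdbc_last_run"
    # track the numeric id column instead of a timestamp
    use_column_value => true
    tracking_column => "id"
    tracking_column_type => "numeric"
    # fetch 10,000 rows per query
    jdbc_paging_enabled => true
    jdbc_page_size => 10000
    # only rows not yet migrated; the plugin substitutes :sql_last_value
    statement => "SELECT * FROM employee WHERE id > :sql_last_value ORDER BY id ASC"
  }
}

On the first run :sql_last_value starts at 0, so the whole table is paged through 10,000 rows at a time; on every later run the plugin reads the highest id it recorded in the metadata file and only fetches rows above it.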