I use Logstash with the jdbc input plugin to insert data into Elasticsearch. The jdbc plugin periodically reads a last_modified column and imports all rows that are newer than the last run (typical jdbc/Logstash handling).
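A minimal sketch of such an incremental input, assuming a last_modified timestamp column (connection settings, table and column names are illustrative):

input {
  jdbc {
    # jdbc_connection_string, jdbc_user, jdbc_driver_class etc. omitted
    # table and column names below are placeholders for this example
    statement => "SELECT * FROM my_table WHERE last_modified > :sql_last_value"
    use_column_value => true
    tracking_column => "last_modified"
    tracking_column_type => "timestamp"
    schedule => "* * * * *"   # poll every minute
  }
}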
But sometimes I need to run a full index from scratch.
How can I trigger that full import (database -> Elasticsearch)?
My current approach is to shut down Logstash, reset the .logstash_jdbc_last_run counter, and start Logstash again. That doesn't seem very elegant to me.
Is there a different way to trigger the full import?
You simply need to add the clean_run parameter to your jdbc input configuration:
input {
  jdbc {
    ...
    clean_run => true
    ...
  }
}
With that setting, the plugin ignores the state stored in the .logstash_jdbc_last_run file and imports everything from scratch.
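Note that clean_run resets the stored sql_last_value whenever the pipeline starts, so you would typically enable it just for the one full import and set it back to false (or remove it) afterwards; otherwise every Logstash restart would trigger another full re-import.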