My Logstash pipeline has 2 outputs:
output {
  elasticsearch {
    hosts => "blabla"
  }
  jdbc {
    connection_string => "blabla"
    statement => [ "blabla" ]
  }
}
Let's imagine that one of these outputs is unavailable. What happens to the pipeline? Will it stop processing events, or continue using the available output? Is this behavior configurable?
There is a similar question, logstash multiple output doesn't work if one of outputs fails, asked 5 years ago, but Logstash has released 5 major versions since then, so the behavior may have changed.
If you have multiple outputs in a single pipeline and one of those outputs becomes unavailable, it will block all the other outputs.
There is an open issue with suggestions to solve this, but nothing has been implemented yet.
One way to avoid this is to use pipeline-to-pipeline communication, known as the output isolator pattern.
You will need something like this in your main pipeline:
output {
  pipeline {
    send_to => [outputElastic]
  }
  pipeline {
    send_to => [outputjdbc]
  }
}
Then you will need to create a downstream pipeline to receive each of those outputs, one per output, something like this:
input {
  pipeline {
    address => outputElastic
  }
}
output {
  elasticsearch {
    hosts => ["blablabla"]
  }
}
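The downstream pipeline for the jdbc output follows the same shape; as a sketch, assuming the address name matches the outputjdbc value used in send_to above (connection details are placeholders):

input {
  pipeline {
    address => outputjdbc
  }
}
output {
  jdbc {
    connection_string => "blabla"
    statement => [ "blabla" ]
  }
}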
This way your main pipeline will always deliver its events, and if one of your outputs fails it will not block the others, because each output is now isolated in its own pipeline.
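All of these pipelines need to be registered in pipelines.yml. A minimal sketch, assuming the config files live under /etc/logstash/conf.d (the pipeline ids and paths are illustrative); note that the output isolator pattern relies on persistent queues (queue.type: persisted) on the downstream pipelines, because with the default in-memory queue a blocked downstream pipeline will eventually back-pressure the main pipeline once its queue fills:

- pipeline.id: main
  path.config: "/etc/logstash/conf.d/main.conf"
- pipeline.id: output-elastic
  path.config: "/etc/logstash/conf.d/output_elastic.conf"
  queue.type: persisted
- pipeline.id: output-jdbc
  path.config: "/etc/logstash/conf.d/output_jdbc.conf"
  queue.type: persisted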
You can achieve the same thing using Kafka as your output and inputs, but that depends on your Kafka cluster always being up.
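As a sketch of the Kafka variant (broker address, topic, and group ids are placeholders): the main pipeline writes every event to a topic, and each output runs as a separate consumer pipeline. Each consumer pipeline must use a distinct group_id so that both receive every event rather than splitting them:

Main pipeline:

output {
  kafka {
    bootstrap_servers => "kafka:9092"
    topic_id => "logstash-events"
  }
}

One consumer pipeline per output, for example the Elasticsearch one:

input {
  kafka {
    bootstrap_servers => "kafka:9092"
    topics => ["logstash-events"]
    group_id => "es-consumer"
  }
}
output {
  elasticsearch {
    hosts => ["blablabla"]
  }
}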