Tags: apache-kafka, logstash

logstash output config file


In my Logstash Kafka output configuration, if one connection in the output fails, the whole pipeline blocks. How can I ensure that when one connection fails, the remaining connections keep running normally?

    input {
      stdin {}
    }
    output {
      kafka {
        bootstrap_servers => ":9091"
        security_protocol => ""
        ssl_key_password => ""
        ssl_keystore_password => ""
        ssl_truststore_password => ""
        ssl_keystore_location => "/tmp"
        ssl_truststore_location => "/tmp"
        topic_id => "test"
        ssl_endpoint_identification_algorithm => ""
      }
      stdout {}
    }

I want to send output to multiple Kafka clusters so that a single failed connection does not affect the others.
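
For example, a setup with two Kafka outputs in a single pipeline might look like the sketch below (the broker addresses and topic names are placeholders, not my real values); with this layout, one unreachable broker stalls both outputs:

    output {
      kafka {
        bootstrap_servers => "kafka-a:9092"   # placeholder broker
        topic_id          => "topic-a"
      }
      kafka {
        bootstrap_servers => "kafka-b:9092"   # placeholder broker
        topic_id          => "topic-b"
      }
    }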


Solution

  • You can use the output isolator pattern to prevent Logstash from blocking when one of multiple outputs experiences a temporary failure. By default, Logstash blocks when any single output is down; this behavior is important for guaranteeing at-least-once delivery of data.

    example code:

    # config/pipelines.yml
    - pipeline.id: intake
      config.string: |
        input { beats { port => 5044 } }
        output { pipeline { send_to => [es, http] } }
    - pipeline.id: buffered-es
      queue.type: persisted
      config.string: |
        input { pipeline { address => es } }
        output { elasticsearch { } }
    - pipeline.id: buffered-http
      queue.type: persisted
      config.string: |
        input { pipeline { address => http } }
        output { http { } }
    
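    Applied to your setup, a sketch of the same pattern with two Kafka outputs might look like this (the pipeline ids, broker addresses, and topic names are placeholders, not values from your original config):

    # config/pipelines.yml
    - pipeline.id: intake
      config.string: |
        input { stdin {} }
        output { pipeline { send_to => [kafka_a, kafka_b] } }
    - pipeline.id: buffered-kafka-a
      queue.type: persisted
      config.string: |
        input { pipeline { address => kafka_a } }
        output { kafka { bootstrap_servers => "kafka-a:9092" topic_id => "test" } }
    - pipeline.id: buffered-kafka-b
      queue.type: persisted
      config.string: |
        input { pipeline { address => kafka_b } }
        output { kafka { bootstrap_servers => "kafka-b:9092" topic_id => "test" } }

    With a persisted queue on each downstream pipeline, a failure of one Kafka connection only fills that pipeline's queue, while the other pipeline keeps delivering.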

    I hope this helps you.