Tags: amazon-web-services, elasticsearch, logstash, elastic-stack, http-status-code-403

Logstash AWS: solving code 403, trying to reconnect


I'm trying to push documents from my local machine to an Elasticsearch server in AWS, but when I do so I get a 403 error and Logstash keeps trying to establish a connection with the server, like so:

[2021-05-09T11:09:52,707][TRACE][logstash.inputs.file     ][main] Registering file input {:path=>["~/home/ubuntu/json_try/json_try.json"]}
[2021-05-09T11:09:52,737][DEBUG][logstash.javapipeline    ][main] Shutdown waiting for worker thread {:pipeline_id=>"main", :thread=>"#<Thread:0x5033269f run>"}
[2021-05-09T11:09:53,441][DEBUG][logstash.outputs.amazonelasticsearch][main] Waiting for connectivity to Elasticsearch cluster. Retrying in 4s
[2021-05-09T11:09:56,403][INFO ][logstash.outputs.amazonelasticsearch][main] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>https://my-dom.co:8001/scans, :path=>"/"}
[2021-05-09T11:09:56,461][WARN ][logstash.outputs.amazonelasticsearch][main] Attempted to resurrect connection to dead ES instance, but got an error. {:url=>"https://my-dom.co:8001/scans", :error_type=>LogStash::Outputs::AmazonElasticSearch::HttpClient::Pool::BadResponseCodeError, :error=>"Got response code '403' contacting Elasticsearch at URL 'https://my-dom.co:8001/scans/'"}
[2021-05-09T11:09:56,849][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2021-05-09T11:09:56,853][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2021-05-09T11:09:57,444][DEBUG][logstash.outputs.amazonelasticsearch][main] Waiting for connectivity to Elasticsearch cluster. Retrying in 8s
.
.
.

I'm using the following Logstash conf file:

input {
        file{
                type => "json"
                path => "~/home/ubuntu/json_try/json_try.json"
                start_position => "beginning"
                sincedb_path => "/dev/null"
        }
}

output{
        amazon_es {
                hosts => ["https://my-dom.co/scans"]
                port => 8001
                ssl => true
                region => "us-east-1b"
                index => "snapshot-%{+YYYY.MM.dd}"
        }
}

I've also exported my AWS keys so that SSL works. Is there anything I'm missing for the connection to succeed?
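
For context, by "exported AWS keys" I mean setting the standard AWS credential environment variables in the shell before starting Logstash; the values below are placeholders:

export AWS_ACCESS_KEY_ID="<access-key-id>"
export AWS_SECRET_ACCESS_KEY="<secret-access-key>"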


Solution

  • I've been able to solve this by using elasticsearch as my output plugin instead of amazon_es.

    This requires the cloud_id of the target AWS node, its cloud_auth, and the target Elasticsearch index the data should be sent to. So the conf file will look something like this:

    input {
            file{
                    type => "json"
                    path => "~/home/ubuntu/json_try/json_try.json"
                    start_position => "beginning"
                    sincedb_path => "/dev/null"
            }
    }
    
    output{
            elasticsearch {
                    cloud_id => "node_name:node_hash"
                    cloud_auth => "auth_hash"
                    index => "snapshot-%{+YYYY.MM.dd}"
            }
    }
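
    The cloud_id and cloud_auth values are the Elastic Cloud deployment credentials for the cluster: cloud_id is shown on the deployment overview page, and cloud_auth is a "user:password" string. With the conf saved locally, Logstash can then be started against it, e.g. (the file name here is just an example):

        bin/logstash -f json_try.conf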