My project stack is currently as follows: a Python app running Flask (on AWS Elastic Beanstalk or AWS Lambda, but that part doesn't matter much). Each route does something for a user and writes a log directly to Elasticsearch (using Elastic Cloud).
I have a problem: when Elasticsearch is slow or down, my Flask app is affected, and worse, logs can be lost.
I'm looking for a solution in which Flask would write the log to some queue/pub-sub, and SOMETHING would read the logs from there and write them to Elasticsearch.
Do you have a suggestion for what that SOMETHING could be? I know it could be Logstash, but I want something that is easy to deploy on AWS or Elastic Cloud, scales well, is easy to write to, and can easily forward to Elasticsearch.
Best, Shahar
Try some kind of queue (AWS SQS?) between Elasticsearch and your application. Your app writes to the queue, and some service/Lambda/Logstash/... reads the queue and stores the docs in Elasticsearch. Kafka, for example, already has an Elasticsearch sink connector for exactly that purpose.
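A minimal sketch of the SQS variant, assuming boto3 on the Flask side and elasticsearch-py in an SQS-triggered Lambda consumer. The queue URL, Elasticsearch endpoint, API key, and index name are all placeholders, not anything from your setup:

```python
import json

# --- Flask side: push log entries to SQS instead of Elasticsearch ---

def serialize_log(event: dict) -> str:
    """Turn a log event into an SQS message body (pure, easy to test)."""
    return json.dumps(event)

def log_event(sqs_client, queue_url: str, event: dict) -> None:
    # Fire-and-forget from the route's point of view: the request no
    # longer blocks on (or fails with) Elasticsearch.
    # sqs_client would be boto3.client("sqs"); queue_url is a placeholder.
    sqs_client.send_message(QueueUrl=queue_url, MessageBody=serialize_log(event))

# --- Consumer side: an SQS-triggered Lambda bulk-indexing into Elasticsearch ---

def records_to_actions(records, index: str = "app-logs"):
    """Map a batch of SQS records to Elasticsearch bulk actions."""
    return [{"_index": index, "_source": json.loads(r["body"])} for r in records]

def handler(event, context):
    # elasticsearch-py is imported lazily here; endpoint and key are placeholders.
    from elasticsearch import Elasticsearch, helpers
    es = Elasticsearch("https://YOUR-DEPLOYMENT.es.io:9243", api_key="...")
    helpers.bulk(es, records_to_actions(event["Records"]))
```

One nice property of the SQS-to-Lambda wiring: the event source mapping delivers records in batches, retries failed batches, and can route poison messages to a dead-letter queue, so a slow or down Elasticsearch no longer loses logs — they just wait in the queue.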