I am writing a Kafka consumer. Its job is primarily to create multiple DB entities and save them after processing the payload. I am trying to handle errors that can occur while consuming the data. I can think of two options (in the Spring ecosystem): publish failed messages to a dead-letter topic, or save failed messages to a database.
The failed messages need to be processed again.
In Case 1 (dead-letter topic): I have to write another @KafkaListener, which listens to the dead-letter topic and processes the message. The problem is that I have no control over when to initiate the re-processing flow (e.g., via a scheduler), because the KafkaListener will start processing the data as soon as it is published to the dead-letter topic.
In Case 2 (database): I have more control over the re-process flow, since I can write a REST endpoint or a scheduler that tries to re-process the failed messages. (Here I have a dilemma over which DB to use: relational or some key-value store.)
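A minimal sketch of what Case 2 might look like with a relational store, assuming Spring Data JPA; `FailedMessage`, `FailedMessageRepository`, and `MessageProcessor` are all hypothetical names, not an existing API:

```java
import jakarta.persistence.Entity;
import jakarta.persistence.GeneratedValue;
import jakarta.persistence.Id;
import jakarta.persistence.Lob;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

// Hypothetical entity holding a failed payload for later retry.
@Entity
class FailedMessage {
    @Id @GeneratedValue
    Long id;
    String topic;
    @Lob
    String payload;
    int attempts;
}

interface FailedMessageRepository extends JpaRepository<FailedMessage, Long> {
}

// MessageProcessor stands in for whatever processing logic the
// original listener already runs; it is an assumption here.
interface MessageProcessor {
    void process(String payload) throws Exception;
}

@Component
class FailedMessageRetrier {

    private final FailedMessageRepository repository;
    private final MessageProcessor processor;

    FailedMessageRetrier(FailedMessageRepository repository, MessageProcessor processor) {
        this.repository = repository;
        this.processor = processor;
    }

    // Retry everything in the table once a minute; a REST endpoint
    // could trigger the same method on demand instead.
    @Scheduled(fixedDelay = 60_000)
    public void retryFailedMessages() {
        for (FailedMessage msg : repository.findAll()) {
            try {
                processor.process(msg.payload);
                repository.delete(msg);
            } catch (Exception e) {
                msg.attempts++;
                repository.save(msg);
            }
        }
    }
}
```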
I basically have a design dilemma and am unable to determine which approach is better in the Spring ecosystem.
I'd appreciate any response.
I think using Kafka (the dead-letter topic) is the best solution.

"Because KafkaListener will start processing the data as soon as the data is published in the dead letter topic."

You can control that behavior by setting autoStartup to false on that listener, then starting and stopping the listener through the KafkaListenerEndpointRegistry as needed:

registry.getListenerContainer("myListenerId").start();
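A short sketch of how that could look; the listener id "dltListener" and topic "my-topic.DLT" are made-up names, and the container could just as well be started from a @Scheduled method instead of a REST endpoint:

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.config.KafkaListenerEndpointRegistry;
import org.springframework.stereotype.Component;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RestController;

@Component
class DltRetryListener {

    // autoStartup = "false" keeps this listener idle until it is
    // started explicitly through the registry.
    @KafkaListener(id = "dltListener", topics = "my-topic.DLT", autoStartup = "false")
    public void reprocess(String payload) {
        // re-run the normal processing logic here
    }
}

@RestController
class DltRetryController {

    private final KafkaListenerEndpointRegistry registry;

    DltRetryController(KafkaListenerEndpointRegistry registry) {
        this.registry = registry;
    }

    // Start re-processing the dead-letter topic on demand.
    @PostMapping("/retry-dlt/start")
    public void startRetry() {
        registry.getListenerContainer("dltListener").start();
    }

    // Stop again once the backlog has been drained.
    @PostMapping("/retry-dlt/stop")
    public void stopRetry() {
        registry.getListenerContainer("dltListener").stop();
    }
}
```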
Alternatively, you can create your own KafkaConsumer (via the consumer factory), poll for as many records as you want, and close the consumer when you are done.
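A rough sketch of that approach, assuming String keys/values and made-up group, client-id, and topic names:

```java
import java.time.Duration;
import java.util.List;
import org.apache.kafka.clients.consumer.Consumer;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.springframework.kafka.core.ConsumerFactory;

class DltDrainer {

    private final ConsumerFactory<String, String> consumerFactory;

    DltDrainer(ConsumerFactory<String, String> consumerFactory) {
        this.consumerFactory = consumerFactory;
    }

    // Poll the dead-letter topic once, process whatever is there,
    // then close the consumer (try-with-resources handles the close).
    public void drain() {
        try (Consumer<String, String> consumer =
                consumerFactory.createConsumer("dlt-retry-group", "retry")) {
            consumer.subscribe(List.of("my-topic.DLT"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> record : records) {
                // re-process record.value() here
            }
            // Commit so the same records are not re-read on the next drain.
            consumer.commitSync();
        }
    }
}
```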