spring-kafka

Out-of-the-box capabilities for Spring-Kafka consumer to avoid duplicate message processing


I stumbled over Handling duplicate messages using the Idempotent Consumer pattern.

Similar, but slightly different, is the Transactional Inbox pattern: it acknowledges the Kafka message receipt only after a transactional INSERT into a messages table (no business logic yet) has committed, and then relies on a background poller to detect new rows in that table and trigger the real business logic (i.e. the message listener).
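To make that flow concrete, here is a minimal in-memory sketch of the inbox idea. All names are made up for illustration, a `BlockingQueue` stands in for the database messages table, and no real Kafka or database is involved:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Minimal in-memory sketch of the Transactional Inbox pattern. In a real
// system the INSERT and the business logic would run against the same
// relational database, each in its own transaction.
public class InboxSketch {

    // "Inbox table": rows inserted by the Kafka listener, drained by the poller.
    private final BlockingQueue<String> inboxTable = new LinkedBlockingQueue<>();
    private final List<String> processed = new ArrayList<>();

    // Step 1: the Kafka listener only INSERTs the raw message, then acks.
    public void onKafkaMessage(String payload) {
        inboxTable.add(payload); // a transactional INSERT in the real pattern
        // <- acknowledge the Kafka record here, after the INSERT commits
    }

    // Step 2: a background poller picks up new rows and runs the business logic.
    public void pollOnce() {
        String row;
        while ((row = inboxTable.poll()) != null) {
            processed.add("handled:" + row); // the real business transaction
        }
    }

    public List<String> processed() {
        return processed;
    }

    public static void main(String[] args) {
        InboxSketch sketch = new InboxSketch();
        sketch.onKafkaMessage("order-1");
        sketch.onKafkaMessage("order-2");
        sketch.pollOnce();
        System.out.println(sketch.processed()); // [handled:order-1, handled:order-2]
    }
}
```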

Now I wonder: is there some Spring magic where I just provide a special DataSource config to track all received messages and discard duplicate deliveries? Otherwise, the application itself would need to take care of acknowledging the Kafka message receipt, message state changes, cleanup of the event table, retries after failure, and probably a lot of other difficult things that I have not yet thought about.


Solution

  • The framework does not provide this out of the box (there is no general solution that works for all use cases), but you can implement it via a filter to avoid putting this logic in your listener:

    https://docs.spring.io/spring-kafka/docs/2.7.9/reference/html/#filtering-messages
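For illustration, such a filter could delegate to a small store of already-processed message ids. The store class and the `messageId` header below are assumptions for the sketch, not Spring Kafka APIs; only the wiring via `setRecordFilterStrategy` (shown in a comment, since it needs a running Spring context) comes from the linked documentation, where returning `true` discards the record before it reaches the `@KafkaListener`:

```java
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical store of already-processed message ids. In production this
// would typically be a database table with a unique constraint (plus periodic
// cleanup), not an unbounded in-memory set.
public class ProcessedMessageStore {

    private final Set<String> seen = ConcurrentHashMap.newKeySet();

    // Returns true if the id was seen before; otherwise records it as seen.
    public boolean seenBefore(String messageId) {
        return !seen.add(messageId);
    }

    /*
     * Wiring it into Spring Kafka (sketch; assumes producers set a unique
     * "messageId" header on each record):
     *
     * factory.setRecordFilterStrategy(record -> {
     *     var header = record.headers().lastHeader("messageId");
     *     String id = header == null ? null : new String(header.value());
     *     return id != null && store.seenBefore(id); // true = discard record
     * });
     */
}
```

One caveat with this approach: marking an id as seen in the filter, before the listener runs, means a listener failure would silently drop that message on redelivery. In practice the "mark as processed" step should be part of the business transaction, which is exactly what the Idempotent Consumer pattern from the question does.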