I have an application that publishes a lot of messages to a Kafka topic. Currently, I'm consuming the messages one by one with a Kafka listener configured like this:
@KafkaListener(id = "groupId", topics = "topic-name")
public void consumeEvent(MyPojo myPojo) {
    // Process the messages one at a time
}
The problem with handling messages this way is throughput: the consumer processes one record per invocation, even though it could handle many events at once. I'm looking for a configuration that lets the listener receive a batch of messages, say 500 at a time, like this. Is there any way to achieve that with Spring Boot? And is there anything we need to watch out for or handle differently when processing messages in bulk?
@KafkaListener(id = "groupId", topics = "topic-name")
public void consumeEvents(List<MyPojo> myPojoItems) {
    // Process the messages in bulk
}
I have used the @KafkaListener batch property in my project.
See https://docs.spring.io/spring-kafka/api/org/springframework/kafka/annotation/KafkaListener.html for more details.
@KafkaListener(topics = TOPIC, groupId = GROUP_ID, batch = "true")
void messageListener(ConsumerRecords<String, Message> records) {
    for (ConsumerRecord<String, Message> cr : records) {
        Message message = cr.value();
        // Process each message in the batch
    }
}
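As an alternative to the annotation attribute, batch mode can also be enabled application-wide through Spring Boot configuration. A sketch (property names as exposed by Spring Boot's spring-kafka auto-configuration; the 500 matches the batch size asked about in the question):

```properties
# Make all @KafkaListener methods batch listeners
spring.kafka.listener.type=batch
# Cap how many records a single poll (and therefore a single batch) can return
spring.kafka.consumer.max-poll-records=500
```

Note that max.poll.records is an upper bound, not a guarantee: a batch may contain anywhere from 1 to 500 records depending on how many are available when the consumer polls, so the listener should not assume a fixed batch size.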