I want to fetch a batch of messages (say, 10,000) from a Kafka topic at regular intervals, so I can bulk-process all 10k records at once instead of one-by-one.
Is there a way to achieve this in a Spring Cloud Stream processor? If so, are there any snippets or any examples that I can refer to?
Thanks
Well, unfortunately there is still no KafkaMessageDrivenChannelAdapter.ListenerMode.batch support at the spring.cloud.stream.kafka.bindings.TARGET.consumer level. Feel free to raise an issue on the matter.
Meanwhile, as a workaround, I can suggest using a Spring Integration Aggregator downstream of the Kafka topic consumer to batch records at the application level, and only after that send them on for processing.
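As a rough sketch of that aggregator idea using the Spring Integration Java DSL (the channel name, batch size, timeout, and the bulkProcessor bean are all illustrative assumptions, not from the question):

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.config.EnableIntegration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;

@Configuration
@EnableIntegration
public class BatchingConfig {

    @Bean
    public IntegrationFlow batchingFlow() {
        // "kafkaInputChannel" is assumed to be the channel the Kafka consumer writes to
        return IntegrationFlows.from("kafkaInputChannel")
                .aggregate(a -> a
                        .correlationStrategy(m -> "batch")        // collect all records into one group
                        .releaseStrategy(g -> g.size() >= 10000)  // release once 10k records are buffered
                        .groupTimeout(5000)                       // or after 5s, whichever comes first
                        .sendPartialResultOnExpiry(true)          // emit a partial batch on timeout
                        .expireGroupsUponCompletion(true))        // allow the group key to be reused
                .handle("bulkProcessor", "process")               // hypothetical bean receiving a List of payloads
                .get();
    }
}
```

The release strategy plus group timeout gives the "10k records or N seconds, whichever comes first" behavior typical for this kind of bulk processing.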
Well, another workaround might be to use Spring Integration Kafka (KafkaMessageDrivenChannelAdapter) manually, instead of the Kafka Binder. Or just use Spring Kafka directly, with its @KafkaListener and a ConcurrentKafkaListenerContainerFactory bean with the batchListener option enabled.
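A minimal sketch of that Spring Kafka option (topic and group names are placeholders; the consumer factory is assumed to be configured elsewhere, with max.poll.records set to cap the batch size):

```java
import java.util.List;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.stereotype.Component;

@Configuration
@EnableKafka
public class KafkaBatchConfig {

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory(
            ConsumerFactory<String, String> consumerFactory) {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory);
        factory.setBatchListener(true); // deliver each poll's records as a List
        return factory;
    }
}

@Component
class BulkListener {

    // With batchListener enabled, the method receives the whole poll at once;
    // max.poll.records=10000 in the consumer properties bounds the batch size.
    @KafkaListener(topics = "my-topic", groupId = "bulk-group")
    public void onBatch(List<String> records) {
        // bulk-process all records from this poll in one shot
    }
}
```

Note that the batch size is an upper bound per poll, not a guarantee: the listener is invoked with whatever records the poll returned.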