Tags: apache-kafka, spring-cloud-stream, spring-kafka

Spring Cloud Stream with Kafka binder and how to queue data from source


I'm new to Spring Cloud Stream with Kafka and have a question about one of its concepts. In my application, when the source streams messages to the Kafka binder, the data doesn't seem to pile up ... I can see that the source is working by watching a Kafka consumer that doesn't actually consume for the application; it's just there to confirm that data really is being produced.

However, the receiver (the listener of the processor) cannot consume the data that piled up before it started. It can only consume the data streamed from the source in near real time.

Let me give you an example. For producer,

data1, data2, data3, data4, data5, ... (streaming for producer)

The consumer is started while data4 is being produced, so my application only receives data from data4 onward:

data4, data5, data6, .... (streaming for consumer)

As I understand Kafka, data1, data2, and data3 should be waiting in the topic for a consumer, but that's not what happens for me. What am I misunderstanding, and is there any way to resolve this?


Solution

  • You need to show your configuration.

    Anonymous consumers (those without a spring.cloud.stream.bindings.xxx.group) start consuming from the end of the topic (as it is at the time they are started), so they will likely "miss" any messages produced earlier.

    Consumers with a group (that have never consumed) start at the beginning; consumers with a group (that have previously consumed) start from where they left off.
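
    As a minimal sketch of the fix, you can assign a consumer group to the input binding in application.yml. The binding name `input` and group name `myGroup` here are just placeholders; substitute your actual binding name:

    ```yaml
    # Assigning a group makes the consumer durable: a new group starts
    # from the beginning of the topic, and a known group resumes from
    # its last committed offset instead of skipping to the end.
    spring:
      cloud:
        stream:
          bindings:
            input:                # your input binding name
              destination: my-topic
              group: myGroup      # any stable group name
    ```

    With this in place, messages such as data1, data2, and data3 produced before the consumer starts will still be delivered once it comes up, because Kafka retains them and the group's offset tracking picks them up from the beginning.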