I'm using the spring-kafka dependency version 2.8.1 with the following config:
kafka:
  bootstrap-servers: localhost:9092
  producer:
    acks: all
    key-serializer: org.apache.kafka.common.serialization.StringSerializer
    value-serializer: org.apache.kafka.common.serialization.StringSerializer
    buffer-memory: 16384
I'm trying to send about 65,000 messages to a topic, but my server crashes with the following exception:
org.springframework.kafka.KafkaException: Send failed; nested exception is org.apache.kafka.clients.producer.BufferExhaustedException: Failed to allocate memory within the configured max blocking time 60000 ms.
This is how I'm sending all of these messages to my topic:
public void processMessages(List<Message> messages) {
    for (Message msg : messages) {
        kafkaTemplate.send("prepared-messages", msg.toJson());
    }
}
I tried setting the batch size to 0, but that didn't work either.
Have you tried increasing buffer.memory?
https://kafka.apache.org/documentation/#producerconfigs_buffer.memory
The total bytes of memory the producer can use to buffer records waiting to be sent to the server. If records are sent faster than they can be delivered to the server the producer will block for max.block.ms after which it will throw an exception.
This setting should correspond roughly to the total memory the producer will use, but is not a hard bound since not all memory the producer uses is used for buffering. Some additional memory will be used for compression (if compression is enabled) as well as for maintaining in-flight requests.
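Your config caps the buffer at 16384 bytes (16 KB), which is far below Kafka's default of 32 MB, so the producer runs out of buffer space almost immediately when tens of thousands of sends are queued in a tight loop. As a minimal sketch (assuming the snippet in your question lives in a Spring Boot application.yml), you could raise it back to the default:

kafka:
  producer:
    buffer-memory: 33554432   # Kafka's default buffer.memory (32 MB)

Removing the buffer-memory line entirely would fall back to the same default; the right value ultimately depends on your message size and send rate.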