Tags: java, spring-boot, apache-kafka, spring-kafka

Kafka message compression not working as expected


My project uses Apache Kafka for sending messages. Over time, the payload has grown and now exceeds the default size limit of 1 MB. We tried compressing the messages on the producer end, but it is not working as expected.

Here is the producer config:

spring:
  kafka:
    consumer:
      group-id: group_name
      bootstrap-servers: broker-server:9343
    producer:
      bootstrap-servers: ${spring.kafka.consumer.bootstrap-servers}
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: io.confluent.kafka.serializers.KafkaAvroSerializer
      compression-type: gzip

But with the above configuration, if I send a message larger than 1 MB, I get the error below:

org.springframework.kafka.KafkaException: Send failed; nested exception is org.apache.kafka.common.errors.RecordTooLargeException: The message is 1909152 bytes when serialized which is larger than 1048576, which is the value of the max.request.size configuration

Is there any other configuration required to enable compression?


Solution

  • The Kafka producer setting max.request.size will help in this case. It specifies the maximum size of a request; by default it is 1 MB (1048576 bytes), which is why you are seeing the error. Note that the producer checks each record's serialized, uncompressed size against max.request.size, so setting compression-type alone does not avoid RecordTooLargeException. Spring Boot does not expose this setting as a dedicated producer property, so pass it through the producer's properties map in your configuration file:

    spring:
      kafka:
        producer:
          properties:
            max.request.size: 2097152
    

    Note: 2 MB = 2097152 bytes (you can change it according to your requirement).
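
    If you configure the producer programmatically instead of through YAML, the same two settings go on the ProducerFactory. The sketch below is a minimal illustration, assuming the standard spring-kafka DefaultKafkaProducerFactory; the class name, bean wiring, and schema-registry URL are placeholders, while the serializers and broker address come from the question's configuration:

    import java.util.HashMap;
    import java.util.Map;

    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.common.serialization.StringSerializer;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.kafka.core.DefaultKafkaProducerFactory;
    import org.springframework.kafka.core.KafkaTemplate;
    import org.springframework.kafka.core.ProducerFactory;

    @Configuration
    public class KafkaProducerConfig {

        @Bean
        public ProducerFactory<String, Object> producerFactory() {
            Map<String, Object> props = new HashMap<>();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker-server:9343");
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                    io.confluent.kafka.serializers.KafkaAvroSerializer.class);
            // Placeholder URL; point this at your actual Schema Registry.
            props.put("schema.registry.url", "http://localhost:8081");
            // Compress record batches with gzip.
            props.put(ProducerConfig.COMPRESSION_TYPE_CONFIG, "gzip");
            // Allow requests up to 2 MB; the default is 1 MB (1048576 bytes).
            props.put(ProducerConfig.MAX_REQUEST_SIZE_CONFIG, 2097152);
            return new DefaultKafkaProducerFactory<>(props);
        }

        @Bean
        public KafkaTemplate<String, Object> kafkaTemplate() {
            return new KafkaTemplate<>(producerFactory());
        }
    }

    Also keep in mind that the broker enforces its own limit (message.max.bytes) on the compressed batch, so very large payloads may additionally need a broker-side change.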