I'm using Confluent Schema Registry and Avro. Data was ingested into Kafka using a JDBC connector, which uses an SMT to create proper Avro schemas. The problem occurs during deserialization with SpecificAvroSerde. I have plenty of similar cases and they all work fine, so in general the approach of ingesting data, generating an Avro schema, and consuming it in stream processors via Avro works. The difference in this case is that the record contains an array (a kind of master/detail record). Below is a simplified version of the schema:
{
  "namespace": "io.confluent.base.model",
  "type": "record",
  "name": "Test1",
  "fields": [
    { "name": "opt_identifier", "type": [ "null", "string" ], "default": null },
    { "name": "opt_amount", "type": [ "null", { "type": "bytes", "logicalType": "decimal", "precision": 31, "scale": 8 } ], "default": null },
    { "name": "arr_field", "type": [ "null", {
        "type": "array",
        "items": {
          "name": "TestTest1",
          "type": "record",
          "fields": [
            { "name": "opt_identifier_", "type": [ "null", "string" ], "default": null },
            { "name": "opt_amount_", "type": [ "null", { "type": "bytes", "logicalType": "decimal", "precision": 31, "scale": 8 } ], "default": null }
          ]
        },
        "default": []
      } ],
      "default": null }
  ]
}
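The schema is compiled using the Avro Maven plugin, so the decimal fields come out as java.math.BigDecimal in the generated classes. Below is a trimmed paraphrase of the generated members (this assumes enableDecimalLogicalType is switched on in the plugin; builders, schema constants and the SpecificRecordBase plumbing are omitted):

import java.math.BigDecimal;
import java.util.List;

// Trimmed paraphrase of what the avro-maven-plugin generates for this schema
// (sketch only, assuming enableDecimalLogicalType=true; the real sources are far longer).
public class GeneratedModelSketch {
    static class Test1 {
        CharSequence opt_identifier;
        BigDecimal opt_amount;        // bytes + decimal logical type -> BigDecimal
        List<TestTest1> arr_field;    // nullable array of nested records
    }

    static class TestTest1 {
        CharSequence opt_identifier_;
        BigDecimal opt_amount_;       // same mapping inside the nested record
    }
}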
Both the connector and the consumer are using the same Avro jar versions. The exception I receive is:
org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id 79
at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer$DeserializationContext.read(AbstractKafkaAvroDeserializer.java:409) ~[kafka-avro-serializer-7.0.1.jar:na]
at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer.deserialize(AbstractKafkaAvroDeserializer.java:114) ~[kafka-avro-serializer-7.0.1.jar:na]
at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer.deserialize(AbstractKafkaAvroDeserializer.java:88) ~[kafka-avro-serializer-7.0.1.jar:na]
at io.confluent.kafka.serializers.KafkaAvroDeserializer.deserialize(KafkaAvroDeserializer.java:55) ~[kafka-avro-serializer-7.0.1.jar:na]
at io.confluent.kafka.streams.serdes.avro.SpecificAvroDeserializer.deserialize(SpecificAvroDeserializer.java:66) ~[kafka-streams-avro-serde-7.0.1.jar:na]
at io.confluent.kafka.streams.serdes.avro.SpecificAvroDeserializer.deserialize(SpecificAvroDeserializer.java:38) ~[kafka-streams-avro-serde-7.0.1.jar:na]
at org.apache.kafka.common.serialization.Deserializer.deserialize(Deserializer.java:60) ~[kafka-clients-3.0.0.jar:na]
at org.apache.kafka.streams.processor.internals.SourceNode.deserializeValue(SourceNode.java:58) ~[kafka-streams-3.0.0.jar:na]
at org.apache.kafka.streams.processor.internals.RecordDeserializer.deserialize(RecordDeserializer.java:66) ~[kafka-streams-3.0.0.jar:na]
at org.apache.kafka.streams.processor.internals.RecordQueue.updateHead(RecordQueue.java:176) ~[kafka-streams-3.0.0.jar:na]
at org.apache.kafka.streams.processor.internals.RecordQueue.addRawRecords(RecordQueue.java:112) ~[kafka-streams-3.0.0.jar:na]
at org.apache.kafka.streams.processor.internals.PartitionGroup.addRawRecords(PartitionGroup.java:304) ~[kafka-streams-3.0.0.jar:na]
at org.apache.kafka.streams.processor.internals.StreamTask.addRecords(StreamTask.java:960) ~[kafka-streams-3.0.0.jar:na]
at org.apache.kafka.streams.processor.internals.TaskManager.addRecordsToTasks(TaskManager.java:1000) ~[kafka-streams-3.0.0.jar:na]
at org.apache.kafka.streams.processor.internals.StreamThread.pollPhase(StreamThread.java:914) ~[kafka-streams-3.0.0.jar:na]
at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:720) ~[kafka-streams-3.0.0.jar:na]
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:583) ~[kafka-streams-3.0.0.jar:na]
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:555) ~[kafka-streams-3.0.0.jar:na]
Caused by: java.lang.ClassCastException: class java.nio.HeapByteBuffer cannot be cast to class java.math.BigDecimal (java.nio.HeapByteBuffer and java.math.BigDecimal are in module java.base of loader 'bootstrap')
at io.confluent.base.model.TestTest1.put(TestTest1.java:416) ~[classes/:na]
at org.apache.avro.generic.GenericData.setField(GenericData.java:818) ~[avro-1.10.1.jar:1.10.1]
at org.apache.avro.specific.SpecificDatumReader.readField(SpecificDatumReader.java:139) ~[avro-1.10.1.jar:1.10.1]
at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:247) ~[avro-1.10.1.jar:1.10.1]
at org.apache.avro.specific.SpecificDatumReader.readRecord(SpecificDatumReader.java:123) ~[avro-1.10.1.jar:1.10.1]
at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:179) ~[avro-1.10.1.jar:1.10.1]
at org.apache.avro.generic.GenericDatumReader.readArray(GenericDatumReader.java:298) ~[avro-1.10.1.jar:1.10.1]
at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:183) ~[avro-1.10.1.jar:1.10.1]
at org.apache.avro.specific.SpecificDatumReader.readField(SpecificDatumReader.java:136) ~[avro-1.10.1.jar:1.10.1]
at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:247) ~[avro-1.10.1.jar:1.10.1]
at org.apache.avro.specific.SpecificDatumReader.readRecord(SpecificDatumReader.java:123) ~[avro-1.10.1.jar:1.10.1]
at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:179) ~[avro-1.10.1.jar:1.10.1]
at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:160) ~[avro-1.10.1.jar:1.10.1]
at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:187) ~[avro-1.10.1.jar:1.10.1]
at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:160) ~[avro-1.10.1.jar:1.10.1]
at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:153) ~[avro-1.10.1.jar:1.10.1]
at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer$DeserializationContext.read(AbstractKafkaAvroDeserializer.java:400) ~[kafka-avro-serializer-7.0.1.jar:na]
... 17 common frames omitted
I can read the same message using GenericRecord and all fields are there, so the Avro record itself was serialized correctly.
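For completeness, this is roughly how I compared the two paths (a sketch only; topic name and schema registry URL are placeholders, and the specific-record stream is the one that fails):

import io.confluent.base.model.Test1;
import io.confluent.kafka.serializers.AbstractKafkaSchemaSerDeConfig;
import io.confluent.kafka.streams.serdes.avro.GenericAvroSerde;
import io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;

import java.util.Map;

public class GenericVsSpecificSketch {
    public static void main(String[] args) {
        Map<String, String> serdeConfig =
            Map.of(AbstractKafkaSchemaSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, "http://localhost:8081");

        GenericAvroSerde genericSerde = new GenericAvroSerde();
        genericSerde.configure(serdeConfig, false);      // false = value serde

        SpecificAvroSerde<Test1> specificSerde = new SpecificAvroSerde<>();
        specificSerde.configure(serdeConfig, false);

        StreamsBuilder builder = new StreamsBuilder();

        // Works: every field, including the nested decimals, is readable as GenericRecord.
        KStream<String, GenericRecord> generic =
            builder.stream("test1-topic", Consumed.with(Serdes.String(), genericSerde));
        generic.foreach((key, value) -> System.out.println(value.get("arr_field")));

        // Fails with the ClassCastException above while deserializing the nested record:
        // KStream<String, Test1> specific =
        //     builder.stream("test1-topic", Consumed.with(Serdes.String(), specificSerde));
    }
}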
My current understanding: the master record's fields, including opt_amount, are deserialized without problems. The nested field opt_amount_, however, throws the exception, hence I suspect that this nested detail record TestTest1 is not used in the same way as the master record Test1.
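The failing frame (TestTest1.put) corresponds to roughly the following generated code — a trimmed paraphrase, not the exact avro-maven-plugin output — and the cast to BigDecimal is exactly where the raw ByteBuffer blows up:

import java.math.BigDecimal;

// Rough paraphrase of the generated TestTest1.put(...) (sketch, not the exact generated source).
public class TestTest1PutSketch {
    private CharSequence opt_identifier_;
    private BigDecimal opt_amount_;   // decimal logical type compiled to BigDecimal

    public void put(int field$, Object value$) {
        switch (field$) {
            case 0: opt_identifier_ = (CharSequence) value$; break;
            // The reader hands in the raw HeapByteBuffer here — apparently the decimal
            // conversion is not applied for the record nested inside the array:
            case 1: opt_amount_ = (BigDecimal) value$; break;
            default: throw new IndexOutOfBoundsException("Invalid index: " + field$);
        }
    }
}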
The issue is with Avro internals. This PR will solve your issue once merged: https://github.com/apache/avro/pull/1721