Is there a limit on the number of fields you can put in an Avro schema?
For example, can we have an Avro schema with 1,000,000 fields? Will there be any impact on performance?
Thank you in advance!
The only limit I found in the Confluent documentation is on versions: 1,000 total schema versions across all subjects (https://docs.confluent.io/cloud/current/stream-governance/packages.html).
Avro itself does not care; the specification places no limit on the number of fields in a record schema.
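To illustrate, here is a minimal sketch using the Apache Avro Java library (the record name and field count are arbitrary); `SchemaBuilder` will assemble a record with as many fields as memory allows:

```java
import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;

public class WideSchemaDemo {
    public static void main(String[] args) {
        // Build a record with a very large number of fields. This is slow
        // and memory-heavy, but Avro itself does not reject it.
        SchemaBuilder.FieldAssembler<Schema> fields =
            SchemaBuilder.record("Wide").fields();
        for (int i = 0; i < 1_000_000; i++) {
            fields = fields.optionalString("field_" + i);
        }
        Schema schema = fields.endRecord();

        System.out.println("Fields: " + schema.getFields().size());
        System.out.println("Schema JSON bytes: " + schema.toString().length());
    }
}
```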
Kafka itself has a default 1 MB record size limit (`message.max.bytes`), so that will apply both at the Schema Registry (if you use it, since it stores each schema as a record in a Kafka topic) and to the serialized Kafka records themselves.
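So the practical ceiling is the size of the schema's JSON text, not the field count. A rough check, assuming the default broker limit and that the registry persists each schema as a single record:

```java
import java.nio.charset.StandardCharsets;
import org.apache.avro.Schema;

public class SchemaSizeCheck {
    // Rough check: does the schema's JSON representation fit under Kafka's
    // default message.max.bytes (1048588 bytes)? A 1,000,000-field schema
    // will not, since every field contributes at least its name and type
    // to the JSON text.
    static boolean fitsDefaultKafkaLimit(Schema schema) {
        final int DEFAULT_MESSAGE_MAX_BYTES = 1_048_588;
        int jsonBytes = schema.toString()
                              .getBytes(StandardCharsets.UTF_8).length;
        return jsonBytes <= DEFAULT_MESSAGE_MAX_BYTES;
    }
}
```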
Confluent Cloud (or any hosted Kafka solution) may also enforce a payload size limit on its APIs.
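On a self-managed cluster, by contrast, the record size cap is configurable. A sketch of raising the topic-level `max.message.bytes` with the Kafka `AdminClient` (the broker address, topic name, and 5 MB value are all illustrative):

```java
import java.util.List;
import java.util.Map;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AlterConfigOp;
import org.apache.kafka.clients.admin.ConfigEntry;
import org.apache.kafka.common.config.ConfigResource;

public class RaiseTopicLimit {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // illustrative address

        try (AdminClient admin = AdminClient.create(props)) {
            // Hypothetical topic; raise its per-record size cap to 5 MB.
            ConfigResource topic =
                new ConfigResource(ConfigResource.Type.TOPIC, "wide-records");
            AlterConfigOp raiseLimit = new AlterConfigOp(
                new ConfigEntry("max.message.bytes", "5242880"),
                AlterConfigOp.OpType.SET);
            admin.incrementalAlterConfigs(Map.of(topic, List.of(raiseLimit)))
                 .all().get();
        }
    }
}
```

Note that producers writing records that large must also raise their own `max.request.size`; none of this is possible on a hosted platform that fixes those limits for you.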