I just finished this tutorial on using Kafka and Schema Registry: http://cloudurable.com/blog/kafka-avro-schema-registry/index.html I also played with the Confluent Platform: https://docs.confluent.io/current/installation/installing_cp.html
Everything worked fine until I rebooted my virtual machine (VirtualBox): all schemas/subjects were deleted (or disappeared) after the reboot.
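For example, listing the subjects through Schema Registry's REST API (assuming the default port 8081) is how I check, and after the reboot the list comes back empty:

    curl http://localhost:8081/subjects
    # after the reboot this returns: []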
I read that Schema Registry does not store the data itself but uses Kafka to do so. Of course, since for the moment I am working only on my laptop, Kafka was also shut down during the machine reboot.
Is this normal behavior? Do we have to expect to re-register all schemas every time we reboot? (Maybe only the latest version, then!)
Does anybody have good best practices for this?
How can the persistence of schemas be managed to avoid this problem?
Environment: Ubuntu 16..., Kafka 2.11-1.0.0, Confluent Platform 4.0
Thanks a lot
Note: I have already read this topic, which discusses keeping the schema IDs, but since I don't recover any schemas at all, it is not a problem of IDs: Confluent Schema Registry Persistence
Schema Registry persists its data in Kafka.
Therefore your question becomes: why did you lose your Kafka data on reboot?
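You can verify this yourself by consuming the Kafka topic that backs the registry; by default it is named _schemas (configurable via kafkastore.topic in schema-registry.properties). A quick check, assuming a broker on localhost:9092:

    kafka-console-consumer --bootstrap-server localhost:9092 \
        --topic _schemas --from-beginning

Every registered schema shows up there as a message; if that topic is empty (or gone) after the reboot, the registry has nothing to rebuild its state from on startup.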
My guess would be that you've inadvertently used /tmp as the data folder. Are you using the Confluent CLI in your experiments?
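If so, the fix is to point the data directories at a persistent location. A minimal sketch, with example paths only: for plain Kafka/ZooKeeper, edit the properties files; if you start everything with the Confluent CLI, its data lives under CONFLUENT_CURRENT (a directory under /tmp by default), so it is wiped on reboot:

    # server.properties -- the quickstart config defaults to /tmp/kafka-logs
    log.dirs=/var/lib/kafka-logs

    # zookeeper.properties -- the default is /tmp/zookeeper
    dataDir=/var/lib/zookeeper

    # Confluent CLI: set a persistent directory before running `confluent start`
    export CONFLUENT_CURRENT=/var/lib/confluent

Keep in mind the Confluent CLI is intended for development; for anything longer-lived, run the services directly from the properties files with persistent directories like the ones above.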