I have two tables, let's say Emp and Courses. Emp has 30k rows and Courses has 100k rows.
One employee can have many courses, i.e. a one-to-many relation. I need to fetch the records from the tables and send them to Kafka.
Data From Table -----> Convert To JSON -----> Send To Kafka
I don't want to load all the rows into memory at once, as that can cause an OutOfMemoryError.
How can I achieve this? I will probably be using JdbcTemplate or Spring Data JPA.
I am using Spring Boot 2+ and Java 8.
For example, in the Emp table I have emp_id = 1, which has 5 corresponding rows in the Courses table. I will convert these 5 rows into one Java object and then into one JSON object.
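Roughly, this is the shape I have in mind with JdbcTemplate (just a sketch, not working code; the emp/courses column names, the employee-courses topic, and the fetch size of 500 are placeholders I made up): stream the join ordered by emp_id, group the contiguous rows per employee, and publish each employee as one JSON message.

```java
import java.util.ArrayList;
import java.util.List;

import com.fasterxml.jackson.databind.ObjectMapper;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class EmployeeCourseExporter {

    private final JdbcTemplate jdbcTemplate;
    private final KafkaTemplate<String, String> kafkaTemplate;
    private final ObjectMapper objectMapper = new ObjectMapper();

    public EmployeeCourseExporter(JdbcTemplate jdbcTemplate,
                                  KafkaTemplate<String, String> kafkaTemplate) {
        this.jdbcTemplate = jdbcTemplate;
        this.kafkaTemplate = kafkaTemplate;
        // A modest fetch size asks the driver to stream the result set
        // instead of buffering it all (MySQL notably needs Integer.MIN_VALUE).
        this.jdbcTemplate.setFetchSize(500);
    }

    public void export() {
        // ORDER BY emp_id keeps each employee's course rows contiguous,
        // so at most one employee's rows are held in memory at a time.
        String sql = "SELECT e.emp_id, e.emp_name, c.course_name "
                   + "FROM emp e JOIN courses c ON c.emp_id = e.emp_id "
                   + "ORDER BY e.emp_id";

        // Single-element arrays as mutable holders, since the row
        // callback below can only capture effectively final variables.
        long[] currentId = { -1 };
        String[] currentName = { null };
        List<String> courses = new ArrayList<>();

        jdbcTemplate.query(sql, rs -> {
            long empId = rs.getLong("emp_id");
            if (currentId[0] != -1 && empId != currentId[0]) {
                publish(currentId[0], currentName[0], courses); // employee finished
                courses.clear();
            }
            currentId[0] = empId;
            currentName[0] = rs.getString("emp_name");
            courses.add(rs.getString("course_name"));
        });

        if (currentId[0] != -1) {
            publish(currentId[0], currentName[0], courses);     // flush the last one
        }
    }

    private void publish(long empId, String empName, List<String> courses) {
        try {
            EmployeeMessage message =
                new EmployeeMessage(empId, empName, new ArrayList<>(courses));
            kafkaTemplate.send("employee-courses", String.valueOf(empId),
                               objectMapper.writeValueAsString(message));
        } catch (Exception e) {
            throw new RuntimeException("Failed to publish employee " + empId, e);
        }
    }

    // Plain payload object; Jackson serializes the public fields.
    public static class EmployeeMessage {
        public final long empId;
        public final String empName;
        public final List<String> courses;

        public EmployeeMessage(long empId, String empName, List<String> courses) {
            this.empId = empId;
            this.empName = empName;
            this.courses = courses;
        }
    }
}
```

The ORDER BY is what keeps memory flat here, and keying each message by emp_id keeps any re-exports of the same employee on the same partition.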
Importing data from a database into Apache Kafka is a very common use case. Kafka Connect lets you stream data to and from Kafka in a reliable, scalable, and fault-tolerant way. Specifically, the JDBC source connector does what you are trying to do; if you build a custom solution, you will probably end up with a partial reimplementation of what the connector already provides.
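For example, a minimal standalone connector config might look like the sketch below (the connector class is the Confluent JDBC source connector; the connection URL, credentials, table names, and topic prefix are placeholders you would adapt):

```properties
name=jdbc-emp-courses-source
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
tasks.max=1
# Placeholder connection details -- adapt to your database
connection.url=jdbc:postgresql://localhost:5432/hr
connection.user=hr_user
connection.password=secret
# Only pull the two tables in question
table.whitelist=emp,courses
# bulk re-reads the whole table on each poll; for continuous loading,
# use mode=incrementing or mode=timestamp with a suitable column
mode=bulk
poll.interval.ms=60000
# Rows land on topics db-emp and db-courses
topic.prefix=db-
```

One caveat: the connector emits one record per table row, so producing a single nested JSON object per employee (the 5-courses-into-1-message shape you described) would still need a downstream join, e.g. with Kafka Streams. That trade-off is worth weighing against a small custom exporter.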