I am trying to store the last read record of my source in a separate Kafka topic for a stream source. How can I achieve this with a Spring Cloud Data Flow stream app? Any suggestion would be of great help.
Spring Cloud Stream applications can support multiple destinations.
You can add a second output destination and send a message to it.
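With the functional binding model, for instance, an extra output binding can be declared in configuration and written to with `StreamBridge` (the binding and topic names below are illustrative, not part of any starter):

```properties
# Primary output of the processor function
spring.cloud.stream.bindings.process-out-0.destination=data
# Extra binding used only for the last-read tracking record
spring.cloud.stream.bindings.lastRead-out-0.destination=last-read
```

In code, something like `streamBridge.send("lastRead-out-0", lastReadTimestamp)` would then publish the tracking record without affecting the main flow.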
I want to use an RDBMS as a source. The current JDBC app starter needs an extra flag column in the source table to mark a row as read, but in most scenarios that won't be possible. So I am trying to build it based on a timestamp: I will store the last read timestamp in a separate topic, and on each poll I will use that timestamp to continue reading from the RDBMS (incremental load).
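The core of that approach is a timestamp watermark: each poll fetches only rows newer than the last published timestamp, then advances the watermark to the newest row seen. A minimal plain-Java sketch of that logic (class and column names are hypothetical; a real source would run a parameterized query like `SELECT * FROM orders WHERE updated_at > ?` and publish the new watermark to the tracking topic):

```java
import java.time.Instant;
import java.util.ArrayList;
import java.util.List;

public class IncrementalLoad {
    // Watermark: the last-read timestamp, as it would be stored in the
    // tracking topic between polls.
    private Instant lastRead;

    public IncrementalLoad(Instant start) {
        this.lastRead = start;
    }

    // One simulated poll: keep only rows strictly newer than the current
    // watermark, then advance the watermark to the newest row seen.
    public List<Row> poll(List<Row> table) {
        Instant cutoff = lastRead;          // compare against the old watermark
        List<Row> fresh = new ArrayList<>();
        for (Row r : table) {
            if (r.updatedAt().isAfter(cutoff)) {
                fresh.add(r);
                if (r.updatedAt().isAfter(lastRead)) {
                    lastRead = r.updatedAt();
                }
            }
        }
        return fresh;
    }

    // The value that would be published to the tracking topic after the poll.
    public Instant watermark() {
        return lastRead;
    }

    // Stand-in for a source row with an "updated_at" style column.
    public record Row(long id, Instant updatedAt) {}
}
```

One caveat with this pattern: rows that commit with a timestamp equal to or earlier than an already-published watermark can be missed, so a strictly increasing timestamp column (or a small overlap window) is worth considering.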
You can consume from the topic during startup to get the initial starting value.
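The startup step only needs the most recent record in the tracking topic, falling back to a default when the topic is empty. A sketch of that recovery logic, with the "topic" modeled as a list of previously published ISO-8601 timestamps (a real app would read the latest record with a Kafka consumer before polling begins):

```java
import java.time.Instant;
import java.util.List;

public class WatermarkRecovery {
    // Return the most recently published watermark, or the fallback if the
    // tracking topic has no records yet (e.g. on first deployment).
    public static Instant recover(List<String> trackingTopic, Instant fallback) {
        if (trackingTopic.isEmpty()) {
            return fallback;
        }
        // Only the last record matters; earlier ones are superseded.
        return Instant.parse(trackingTopic.get(trackingTopic.size() - 1));
    }
}
```

Using a single-partition, compacted topic with a fixed key keeps this cheap: compaction eventually retains only the latest watermark, so startup reads stay small.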