As far as I know, there is no state-sharing mechanism in Flink at the moment, but I suppose something similar can be achieved. Suppose we have a Flink job (with a single input source) and we want to know what happened at the end of it, in order to adjust the job's processing steps. I have thought of the following options:
I think the first one would be the most suitable, as the others require extra setup and push complexity onto other systems. What's your opinion on these options? Are there other ways to do this?
Thanks
If you use Stateful Functions, then it's easy to send a message from the final processing step back to the upstream operator(s).
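A rough sketch of what that could look like with the embedded Java SDK (the function types, the `"config"` id, and the hint payload are all made up for illustration, not part of any existing job):

```java
import org.apache.flink.statefun.sdk.Context;
import org.apache.flink.statefun.sdk.FunctionType;
import org.apache.flink.statefun.sdk.StatefulFunction;

public class FinalStepFunction implements StatefulFunction {
  // Hypothetical identifiers for this function and the upstream one.
  public static final FunctionType TYPE = new FunctionType("example", "final-step");
  public static final FunctionType UPSTREAM_TYPE = new FunctionType("example", "upstream-step");

  @Override
  public void invoke(Context context, Object input) {
    // ... finish processing the record ...

    // Send a feedback message back to an upstream function instance.
    // Stateful Functions supports arbitrary (including cyclic) messaging
    // between functions, so no external system is needed for the loop.
    context.send(UPSTREAM_TYPE, "config", "adjust-processing");
  }
}
```

The upstream function would receive that message in its own `invoke` and update whatever (persisted) state it uses to drive its processing logic.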
If you're OK with potentially losing this state when it's in flight and your job restarts (i.e. it's a hint for adjusting job processing, not a hard requirement), then you can use an IterativeStream to send it back upstream. That would remove the need for Kafka or some other external feedback system. See also How does Flink treat checkpoints and state within an IterativeStream?
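A minimal sketch of that approach (the record types and the `"hint:"` convention are invented for illustration): the body of the iteration sees both fresh input and fed-back hints, and `closeWith()` routes the hint stream back to the iteration head.

```java
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.datastream.IterativeStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class FeedbackHintJob {
  public static void main(String[] args) throws Exception {
    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
    // Keep parallelism uniform so the feedback stream matches the iteration head.
    env.setParallelism(1);

    // Stand-in for the real input source.
    DataStream<String> source = env.fromElements("a", "b", "c");

    // Open the iteration: records passed to closeWith() reappear here.
    IterativeStream<String> iteration = source.iterate();

    // Job body: reacts differently once a hint has been fed back.
    DataStream<String> processed = iteration.map(value -> {
      if (value.startsWith("hint:")) {
        // Adjust processing based on the hint (application-specific).
        return "adjusted:" + value;
      }
      return "processed:" + value;
    });

    // Final step: decide what to feed back upstream as a hint.
    DataStream<String> feedback = processed
        .filter(v -> v.startsWith("processed:c"))
        .map(v -> "hint:reconfigure");

    // Close the loop; this stream flows back to the iteration head.
    iteration.closeWith(feedback);

    processed.print();
    env.execute("iterative feedback example");
  }
}
```

Keep in mind that records sitting on the feedback edge are not part of a checkpoint, which is why this only works for best-effort hints, as noted above.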