Is it possible to implement a complex parallel processing flow in Spring Cloud Data Flow using the preset modules?
For example:
Processors 1, 2, and 3 are all preset modules (httpclient, etc.). Processors 1 and 3 receive the same message from the source at the same time; processor 1 then pipes its output to processor 2. When both processor 2 and processor 3 complete, they pipe their output to processor 4 at the same time. The output from processor 2 is irrelevant; it is fine if processor 4 only receives the output from processor 3. However, processor 3 should only send messages to processor 4 once processor 2 has completed.
Is this something that can be accomplished using Spring Cloud Data Flow?
Using taps and named destinations you can get most of the way there.
The DSL looks like:
main-stream=source: http | processor1: script | processor2: script > :destination
second-stream=:main-stream.source > processor3: script > :destination
sink-stream=:destination > sink: log
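For completeness, those definitions could be registered from the Data Flow shell roughly as follows. This is just a sketch: module options and any required app registration are omitted, and deploying the downstream consumers before the producer (so the named destination has listeners first) is a choice, not a requirement.

```
dataflow:>stream create --name main-stream --definition "source: http | processor1: script | processor2: script > :destination"
dataflow:>stream create --name second-stream --definition ":main-stream.source > processor3: script > :destination"
dataflow:>stream create --name sink-stream --definition ":destination > sink: log"
dataflow:>stream deploy --name sink-stream
dataflow:>stream deploy --name second-stream
dataflow:>stream deploy --name main-stream
```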
Synchronizing processor2 and processor3 sounds like an anti-pattern in a streaming system. It sounds more like processor3 should receive from processor2 rather than run in parallel (it just also needs a copy of the original message).
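If that restructuring works for your case, the whole flow collapses to a single linear stream. A sketch, reusing the labels and script processors assumed above (processor2 would need to pass the original payload along, e.g. in a message header, so processor3 still sees it):

```
linear-stream=source: http | processor1: script | processor2: script | processor3: script | sink: log
```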