We have a requirement in our project where we use a parent-child pipeline design: the parent pipeline dynamically generates a set of queries, one per table in a database, and a ForEach loop passes each query to a child pipeline (one child pipeline instance per table to be queried). Each child pipeline sends its data to a Logic App endpoint, which publishes it to Event Hubs.
Parent pipeline:
Child pipeline:
These pipelines run every 2 minutes. There is a significant cost associated with these runs, and we want to reduce the number of orchestration runs. (A 2-minute trigger alone means 720 parent runs per day, plus one child run per table per trigger.)
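For context, here is a minimal sketch of what each child pipeline effectively does today: post one table's rows to the Logic App's HTTP trigger, which then publishes to Event Hubs. The URL, payload shape, and function name are placeholders I've assumed for illustration, not details from our actual pipelines.

```python
import requests  # pip install requests

# Hypothetical Logic App HTTP trigger URL; the real one comes from the
# Logic App's "When a HTTP request is received" trigger (assumed here).
LOGIC_APP_URL = "https://prod-00.eastus.logic.azure.com/workflows/<workflow-id>/triggers/manual/paths/invoke"

def push_table_to_logic_app(table_name: str, rows: list[dict]) -> None:
    """Send one table's query results to the Logic App endpoint.
    In the current design, one call like this happens per table, per run."""
    payload = {"table": table_name, "rows": rows}
    resp = requests.post(LOGIC_APP_URL, json=payload, timeout=30)
    resp.raise_for_status()

# The parent pipeline's ForEach fans out one such call per table.
push_table_to_logic_app("orders", [{"id": 1, "amount": 42.0}])
```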
The proposed solution is to run a single query at the source, bundle all the result sets into a single set, parse it out in the Logic App, and then publish (all entities have different structures). A sketch of the parsing step this would require follows.
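To make the proposal concrete, the Logic App would have to split the bundled set back into per-entity groups before publishing. This sketch assumes (my assumption, not stated above) that each row in the bundled result carries a discriminator column such as `entity_type`:

```python
import json
from collections import defaultdict

# Hypothetical bundled result set: one query returns rows from several
# tables, each tagged with an assumed discriminator column.
bundled = [
    {"entity_type": "orders", "id": 1, "amount": 42.0},
    {"entity_type": "customers", "id": 7, "name": "Contoso"},
    {"entity_type": "orders", "id": 2, "amount": 13.5},
]

# Regroup the single result set by entity so each group can be
# published with its own structure.
by_entity: dict[str, list[dict]] = defaultdict(list)
for row in bundled:
    by_entity[row["entity_type"]].append(row)

for entity, rows in by_entity.items():
    print(entity, json.dumps(rows))
```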
Is there a way to handle this scenario?
It might not be possible to achieve your requirement using only a single Copy activity. A Copy activity applies one source-to-sink schema mapping per run, and since your entities all have different structures, a single bundled result set cannot be mapped (or later parsed apart reliably) in one copy.
So, your current solution is a better approach than your proposed one. That said, you can make some changes within your existing pipeline structure to reduce the number of orchestration runs; one possible adjustment is sketched below.
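As an illustration of one such change (an assumption on my part, not a prescription): instead of one Logic App call per table per trigger, a single publishing step could send every entity's rows in one Event Hubs batch, tagging each event with its entity type so consumers can still apply the right schema. A minimal sketch with the azure-eventhub SDK, using placeholder connection details:

```python
import json

from azure.eventhub import EventData, EventHubProducerClient  # pip install azure-eventhub

# Placeholder connection details (assumptions, not from this thread).
CONN_STR = "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<name>;SharedAccessKey=<key>"
EVENT_HUB = "<event-hub-name>"

def publish_all_entities(results: dict[str, list[dict]]) -> None:
    """Publish rows from all tables in one batch, tagging each event with
    its entity type so downstream consumers can pick the right schema.
    This collapses N per-table publishes into a single send."""
    producer = EventHubProducerClient.from_connection_string(
        CONN_STR, eventhub_name=EVENT_HUB
    )
    try:
        batch = producer.create_batch()
        for entity, rows in results.items():
            for row in rows:
                event = EventData(json.dumps(row))
                # Application property on the event, not part of the body.
                event.properties = {"entity_type": entity}
                batch.add(event)  # raises ValueError if the batch is full
        producer.send_batch(batch)
    finally:
        producer.close()

publish_all_entities({
    "orders": [{"id": 1, "amount": 42.0}],
    "customers": [{"id": 7, "name": "Contoso"}],
})
```

For a production version you would start a new batch when `batch.add` raises `ValueError`, but the core idea is the same: one orchestration run fans events out by property rather than by pipeline instance.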