In my ADF instance I have two pipelines, as described below:
I have a pipeline 'Pipeline A' which alters a specific data blob 'Blob X'. It runs once a day with a dataset dependency (the dataset is dated, and the pipeline runs only when that file is created).
There is another pipeline 'Pipeline B' which performs cleanup on multiple data blobs, including 'Blob X', and runs once every 6 hours.
As both pipelines modify the same blob, when they execute in parallel (which happens rarely, but it does happen) it leads to strange failures. How can I add a dependency to 'Pipeline B' so that it does not execute while 'Pipeline A' is running?
Note: 'Pipeline B' uses a tumbling window trigger, so if a run fails, the trigger will retry that window until it completes successfully.
You can use a Web activity to check whether the pipeline is in progress or not, based on the Pipelines - Get REST API:
https://learn.microsoft.com/en-us/rest/api/datafactory/pipelines/get?tabs=HTTP
References:
mrpaulandrew.com/2019/11/21/get-any-azure-data-factory-pipeline-run-status-with-azure-functions/
https://learn.microsoft.com/en-us/answers/questions/60424/get-status-of-pipeline-triggered.html
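One caveat: Pipelines - Get returns the pipeline *definition*, not its run status. To find out whether 'Pipeline A' currently has a run in progress, the Pipeline Runs - Query By Factory endpoint is typically used instead. A sketch of the request a Web activity might issue (the subscription, resource group, factory, and timestamp values are placeholders you would substitute):

```
POST https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DataFactory/factories/{factoryName}/queryPipelineRuns?api-version=2018-06-01

{
  "lastUpdatedAfter":  "2024-01-01T00:00:00Z",
  "lastUpdatedBefore": "2024-01-02T00:00:00Z",
  "filters": [
    { "operand": "PipelineName", "operator": "Equals", "values": ["Pipeline A"] },
    { "operand": "Status",       "operator": "Equals", "values": ["InProgress"] }
  ]
}
```

If the response's `value` array is empty, no run of 'Pipeline A' is in progress and 'Pipeline B' can proceed.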
You can place this Web activity inside an Until loop; once 'Pipeline A' has finished, proceed with the next set of tasks to avoid any conflicts.
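The Until-loop logic described above can be sketched as plain code. This is only an illustration of the control flow, not ADF pipeline JSON: `get_run_status` stands in for the Web activity's REST call (a hypothetical helper that returns the latest run status of 'Pipeline A', e.g. `"InProgress"` or `"Succeeded"`), and here it is stubbed so the sketch is self-contained.

```python
import time

def wait_until_not_running(get_run_status, poll_seconds=30, max_polls=20):
    """Analogue of an ADF Until activity: keep polling until 'Pipeline A'
    is no longer InProgress, then let 'Pipeline B' proceed."""
    for _ in range(max_polls):
        status = get_run_status()  # stands in for the Web activity REST call
        if status != "InProgress":
            return status          # safe for Pipeline B to touch 'Blob X'
        time.sleep(poll_seconds)   # Wait activity between polls
    raise TimeoutError("Pipeline A still running after maximum wait")

# Stubbed status sequence standing in for successive REST responses.
statuses = iter(["InProgress", "InProgress", "Succeeded"])
print(wait_until_not_running(lambda: next(statuses), poll_seconds=0))
# → Succeeded
```

In ADF terms, the loop body would be the Web activity plus a Wait activity, and the Until activity's expression would check the status returned by the Web activity's output.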