I have a pipeline that calls another pipeline, which has a ForEach
activity that passes each value to a Data Flow
activity.
My problem is that I need to see whether a particular iteration of the ForEach succeeded or not, so that I can pass this information to a colleague who will work on his Synapse notebooks. I don't know the best way to do this, or what the best practices are.
I know I can get this information through the Azure API, from Azure Metrics (Monitor), but I still haven't found a way to retrieve it and save it as a file or ingest it into a Kusto table, for example.
I saw that it is now possible to get the Pipeline Return Value from a pipeline, but I don't understand how I could save this info for each iteration to an output file (the file format doesn't matter for now).
So I would have the pipeline name, date, status (success/failed) and the loop value.
This is my parent pipeline:
This is the child pipeline:
So every time the ForEach
activity above succeeds or fails, I want to add the required values to a file.
I don't have any SQL or Synapse database, nor an API I could POST to, which is why my goal is to save this into a file and then ingest it into a Kusto table, or, if possible, to ingest it directly into a Kusto table. But if it's easy to get that information with a Kusto query against Azure Logs/Monitor/API (I don't know which one) for each pipeline run, I would actually prefer that, since there would be no ingestion, only retrieval.
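To make the target concrete, one line per iteration along these lines would be enough (purely illustrative; the values are made up):

MyChildPipeline,2024-01-15T10:30:00Z,Succeeded,<loop item value>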
To get the status of each iteration, store the status based on the outcome of the Data Flow activity: connect an activity to the Data Flow's success path that writes "Succeeded", and another to its failure path that writes "Failed". Give the sink a dynamic file name so each iteration writes its own file, for example:
@concat('file_',item())
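A minimal sketch of how the four values could be captured, assuming you use a Copy activity on each branch with "Additional columns" configured on its source (the column names are just examples; hard-code Failed instead of Succeeded on the failure branch):

PipelineName : @pipeline().Pipeline
RunDate      : @utcNow()
Status       : Succeeded
LoopValue    : @string(item())

Any activity that can write to your storage works the same way; the key point is that the success and failure branches write the same columns with a different Status value.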
You can also check this similar SO thread on how to combine all the files.