I am in Data Factory and I need help parameterizing a dataset so that I can process multiple files from a blob folder through Power Query and then send the output to an Azure SQL table as the sink. I do not want to create 100 datasets just to process 100 Excel files in the same way with Power Query.
I have successfully executed a pipeline as follows:
A Get Metadata activity reads all files (.xlsx) in a blob folder, a ForEach loop iterates over each file, and inside the loop a Copy activity loads the file into an Azure SQL DB.
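Roughly, that part of the pipeline looks like the fragment below (a simplified sketch; the activity and dataset names such as GetMetadata1, ForEachFile, BlobFolderDataset, ExcelSourceDataset, and AzureSqlTable are placeholders rather than my real names):

    [
        {
            "name": "GetMetadata1",
            "type": "GetMetadata",
            "typeProperties": {
                "dataset": { "referenceName": "BlobFolderDataset", "type": "DatasetReference" },
                "fieldList": [ "childItems" ]
            }
        },
        {
            "name": "ForEachFile",
            "type": "ForEach",
            "dependsOn": [ { "activity": "GetMetadata1", "dependencyConditions": [ "Succeeded" ] } ],
            "typeProperties": {
                "items": { "value": "@activity('GetMetadata1').output.childItems", "type": "Expression" },
                "activities": [
                    {
                        "name": "CopyFileToSql",
                        "type": "Copy",
                        "inputs": [
                            {
                                "referenceName": "ExcelSourceDataset",
                                "type": "DatasetReference",
                                "parameters": { "fileName": { "value": "@item().name", "type": "Expression" } }
                            }
                        ],
                        "outputs": [ { "referenceName": "AzureSqlTable", "type": "DatasetReference" } ],
                        "typeProperties": {
                            "source": { "type": "ExcelSource" },
                            "sink": { "type": "AzureSqlSink" }
                        }
                    }
                ]
            }
        }
    ]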
Now I want to see if I can do the following or something similar:
Get a list of files in a blob folder (or the full blob addresses of the files), and then use a loop to pass one file/file address per iteration and process it through Power Query.
Is this even possible right now? I'm really stuck on this. Basically, I want to process data in Power Query before loading it into a sink.
Alternative ideas encouraged!
You can pass the filename dynamically to the Power Query dataset as well. The steps are outlined below: add a string parameter to the source dataset, use it in the dataset's file path, and then, inside the ForEach loop, set that parameter to
@item().name
on the Power Query activity's source dataset settings. This way, you can achieve the requirement of using the same Power Query pipeline for all files.
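As a minimal sketch of the parameterized source dataset (assuming an Excel dataset on Azure Blob Storage; the names ExcelSourceDataset, AzureBlobStorageLinkedService, source-files, incoming, and fileName are example placeholders):

    {
        "name": "ExcelSourceDataset",
        "properties": {
            "type": "Excel",
            "linkedServiceName": {
                "referenceName": "AzureBlobStorageLinkedService",
                "type": "LinkedServiceReference"
            },
            "parameters": {
                "fileName": { "type": "string" }
            },
            "typeProperties": {
                "location": {
                    "type": "AzureBlobStorageLocation",
                    "container": "source-files",
                    "folderPath": "incoming",
                    "fileName": { "value": "@dataset().fileName", "type": "Expression" }
                },
                "sheetName": "Sheet1",
                "firstRowAsHeader": true
            }
        }
    }

The Get Metadata activity lists the folder with the childItems field, the ForEach iterates over @activity('Get Metadata1').output.childItems, and the Power Query activity inside the loop receives @item().name through this dataset parameter (set under the activity's source dataset settings), the same way the Copy activity did in your working pipeline.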