Tags: file, dynamic, azure-data-factory, azure-blob-storage, dataflowtask

I need to provide a dynamic file name from Azure Blob Storage to a data activity in Azure Data Factory. Please share a link or example.


I created the integration runtime and the ADF instance, and I'm trying to transform a blob CSV file into an Oracle table. The blob storage will receive a new CSV file every hour.

I watched a few videos, but I'm not able to get it done. I'd appreciate any help.


Solution

  • You need to use a storage event trigger to achieve your requirement. You can supply the file name dynamically by using the trigger parameter @triggerBody().fileName.

    First, create a string parameter in the pipeline, as in the sketch below.

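    A minimal sketch of how this might look in the pipeline's JSON definition (the pipeline name "CopyCsvToOracle" and the parameter name "filename" are placeholders used throughout this answer):

    ```json
    {
      "name": "CopyCsvToOracle",
      "properties": {
        "parameters": {
          "filename": { "type": "string" }
        },
        "activities": []
      }
    }
    ```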

    Now create a storage event trigger and give the container name and the folder path of the file in the trigger, like below. Here my file is a CSV, so the trigger filters on .csv; give your own file type.

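    Roughly, the resulting trigger definition has type properties like the following; the container, folder path, and storage account scope below are placeholders for your own values:

    ```json
    {
      "name": "NewCsvFileTrigger",
      "properties": {
        "type": "BlobEventsTrigger",
        "typeProperties": {
          "blobPathBeginsWith": "/input/blobs/hourly/",
          "blobPathEndsWith": ".csv",
          "ignoreEmptyBlobs": true,
          "events": [ "Microsoft.Storage.BlobCreated" ],
          "scope": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
        }
      }
    }
    ```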

    Click on Continue; at the end of the trigger creation, it will ask you to provide a value for the pipeline parameter. Here, give the trigger parameter @triggerBody().fileName, like below.

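    In the trigger JSON, that mapping ends up under the pipelines section, roughly like this (the pipeline and parameter names are the placeholders used above):

    ```json
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "CopyCsvToOracle",
          "type": "PipelineReference"
        },
        "parameters": {
          "filename": "@triggerBody().fileName"
        }
      }
    ]
    ```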

    Click on OK and the trigger will be created. Now we need to use this pipeline parameter as the file name of the source dataset. For that, create a dataset parameter and use it in the file name, like below.


    Give your folder path and use the dataset parameter for the file name, as in the sketch below.

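    Assuming a delimited-text dataset on an Azure Blob Storage linked service (the dataset, linked service, container, and folder names below are placeholders), the dataset parameter is plugged into the file name roughly like this:

    ```json
    {
      "name": "SourceCsv",
      "properties": {
        "type": "DelimitedText",
        "linkedServiceName": {
          "referenceName": "AzureBlobStorageLS",
          "type": "LinkedServiceReference"
        },
        "parameters": {
          "filename": { "type": "string" }
        },
        "typeProperties": {
          "location": {
            "type": "AzureBlobStorageLocation",
            "container": "input",
            "folderPath": "blobs/hourly",
            "fileName": {
              "value": "@dataset().filename",
              "type": "Expression"
            }
          },
          "columnDelimiter": ",",
          "firstRowAsHeader": true
        }
      }
    }
    ```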

    Use a copy activity and give the above dataset as the source of the copy activity. Here, pass the pipeline parameter @pipeline().parameters.filename to the dataset parameter, like below.

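    A rough sketch of the copy activity inside the pipeline, assuming the source dataset above and an Oracle table dataset named "OracleSinkTable" (a placeholder):

    ```json
    {
      "name": "CopyBlobCsvToOracle",
      "type": "Copy",
      "inputs": [
        {
          "referenceName": "SourceCsv",
          "type": "DatasetReference",
          "parameters": {
            "filename": "@pipeline().parameters.filename"
          }
        }
      ],
      "outputs": [
        {
          "referenceName": "OracleSinkTable",
          "type": "DatasetReference"
        }
      ],
      "typeProperties": {
        "source": { "type": "DelimitedTextSource" },
        "sink": { "type": "OracleSink" }
      }
    }
    ```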

    Give your Oracle table dataset as the sink of the copy activity. If your target table is the same for every file, just give the table name. But if your target table is different for every file, then you need to make sure the correct table is used for each file (for example, by parameterizing the sink dataset's table name as well).

    Make sure you publish both the pipeline and the trigger before using them.

    This storage event trigger will run the pipeline whenever a new file is uploaded or modified in the given location of the storage account, pass that file's name dynamically to the copy activity, and the file will be copied to the destination.