Tags: azure, azure-data-factory, azure-blob-storage, azure-storage

Unable to get the proper steps for a blob container Copy data activity in ADF across two different subscriptions in the same tenant


how to copy the fileshare data between the subscriptions of storage accounts - scenario given in description

Using the solution given in the above question, I tried to implement the same for blob containers across storage accounts in different subscriptions that have public access enabled.

When coming to the linked service step from that solution, I cannot do the @{linkedService().filesharename} step, because here it is a Blob storage linked service.

Can anyone help me with this?

Scenario:

In the same Azure tenant, we have 2 storage accounts in 2 subscriptions. In the 1st subscription's storage account, we have data in different blob containers such as:

  • blobcontainer1uat
  • blobcontainer2uat

We need to move the data to the 2nd subscription's storage account, which has blob containers like:

  • blobcontainer1stg
  • blobcontainer2stg

In simple terms, we have to move the data from blobcontainer1uat to blobcontainer1stg, and the same for the next blob containers.


Solution

  • In the case of a file share, the file share name has to be specified in the linked service itself, which is why linked service parameters are needed in that scenario.

    In this case, as you want to copy data between containers, the blob containers can be specified at the dataset level itself, so there is no need for linked service parameters. To specify the container, you only need dataset parameters.

    First, create two Blob Storage linked services for your storage accounts in the different subscriptions. There is no need to use any parameters here; create the linked services like below.

    [screenshot: source Blob Storage linked service configuration]

    Similarly, create another one for the target blob storage account.
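
    If you prefer authoring in JSON, a plain (non-parameterized) Blob Storage linked service looks roughly like the sketch below; the linked service name and the connection string values are placeholders for your own account:

    {
        "name": "SourceBlobLS",
        "properties": {
            "type": "AzureBlobStorage",
            "typeProperties": {
                "connectionString": "DefaultEndpointsProtocol=https;AccountName=<source-account>;AccountKey=<account-key>;EndpointSuffix=core.windows.net"
            }
        }
    }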

    Now, make an array with the source and target container names and give it to a pipeline parameter of Array type.

    [
        {
            "source_container": "sourcecontainer1",
            "target_container": "target1"
        },
        {
            "source_container": "sourcecontainer2",
            "target_container": "target2"
        }
    ]
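
    When authoring the pipeline JSON directly, this list can be set as the default value of the Array parameter. The parameter name containers_list below is only an illustrative choice:

    "parameters": {
        "containers_list": {
            "type": "array",
            "defaultValue": [
                { "source_container": "sourcecontainer1", "target_container": "target1" },
                { "source_container": "sourcecontainer2", "target_container": "target2" }
            ]
        }
    }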
    

    Create Binary datasets for the source and the target, leaving the file path empty during creation. After creation, add a dataset parameter container_name of string type and use it as the container name in the dataset's file path.

    [screenshot: source Binary dataset with the container_name parameter used as the container in the file path]

    Do the same for the target Binary dataset.
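
    In JSON form, the source Binary dataset would look roughly like the sketch below; the dataset and linked service names (SourceBinaryDS, SourceBlobLS) are illustrative. Note how the container_name parameter is referenced as an expression in the location:

    {
        "name": "SourceBinaryDS",
        "properties": {
            "type": "Binary",
            "linkedServiceName": {
                "referenceName": "SourceBlobLS",
                "type": "LinkedServiceReference"
            },
            "parameters": {
                "container_name": { "type": "string" }
            },
            "typeProperties": {
                "location": {
                    "type": "AzureBlobStorageLocation",
                    "container": {
                        "value": "@dataset().container_name",
                        "type": "Expression"
                    }
                }
            }
        }
    }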

    Take a For-Each activity and give the containers list to it (see the expression sketch after the layout below). Inside the for-loop, we need two copy activities: one for the folders of the container and another for the files at the container root level.

    sourcecontainer
        folder1
        folder2 // One copy activity to copy these folders
        file1.csv
        file2.csv // Another copy activity to copy these files to the target container
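
    In the For-Each activity's Items box, reference the pipeline array parameter with dynamic content (assuming the parameter is named containers_list as above):

    @pipeline().parameters.containers_list

    Inside the loop, each item then exposes @item().source_container and @item().target_container, which are the values to pass to the dataset parameters of the two copy activities.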
    

    In the first copy activity, give the source and sink dataset configurations like below.

    Source:

    [screenshot: first copy activity source settings]

    Sink:

    [screenshot: first copy activity sink settings]
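
    Assembled in JSON, the first copy activity could look roughly like the sketch below, reusing the illustrative dataset names from above. The activity name, wildcard values, and copy behavior are assumptions about what the screenshots show: a wildcard folder path of * matches the folders in the source container, and PreserveHierarchy keeps the folder structure intact in the target.

    {
        "name": "CopyFolders",
        "type": "Copy",
        "inputs": [{
            "referenceName": "SourceBinaryDS",
            "type": "DatasetReference",
            "parameters": {
                "container_name": {
                    "value": "@item().source_container",
                    "type": "Expression"
                }
            }
        }],
        "outputs": [{
            "referenceName": "TargetBinaryDS",
            "type": "DatasetReference",
            "parameters": {
                "container_name": {
                    "value": "@item().target_container",
                    "type": "Expression"
                }
            }
        }],
        "typeProperties": {
            "source": {
                "type": "BinarySource",
                "storeSettings": {
                    "type": "AzureBlobStorageReadSettings",
                    "recursive": true,
                    "wildcardFolderPath": "*",
                    "wildcardFileName": "*"
                }
            },
            "sink": {
                "type": "BinarySink",
                "storeSettings": {
                    "type": "AzureBlobStorageWriteSettings",
                    "copyBehavior": "PreserveHierarchy"
                }
            }
        }
    }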

    In the second copy activity, give the configurations like below.

    Source:

    [screenshot: second copy activity source settings]

    Sink:

    [screenshot: second copy activity sink settings]
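
    The second copy activity can reuse the same datasets; roughly, only its source store settings differ, with recursion turned off so that only the root-level files are matched (again an assumption about the screenshot settings):

    "source": {
        "type": "BinarySource",
        "storeSettings": {
            "type": "AzureBlobStorageReadSettings",
            "recursive": false,
            "wildcardFileName": "*"
        }
    }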

    Now debug the pipeline; it will give the desired results.

    [screenshot: pipeline debug run results]