azure, azure-data-factory, azure-python-sdk

Refer to a dataflow definition from a storage linked service in the Azure Python SDK for ADF


I am using the Azure Python SDK for ADF, and I am looking to create a dataflow dynamically.

I have dataflow definitions in JSON format available on a linked storage account. Is there a way I can refer to a definition from the storage account and create a dataflow? Something like below:

adf_client.data_flows.create_or_update(location = 'path_to_dataflow_json')

Solution

    • No, you cannot pass the path of the JSON file directly; the dataflow definition itself has to be supplied in order to create or update the dataflow.

    • After installing the required packages, you can use Python's built-in help() on adf_client.data_flows.create_or_update to see which arguments it accepts.

    from azure.identity import ClientSecretCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient

    credentials = ClientSecretCredential(client_id='<client_id>', client_secret='<client_secret>', tenant_id='<tenant_id>')

    adf_client = DataFactoryManagementClient(credentials, subscription_id)
    help(adf_client.data_flows.create_or_update)
    

    [Image: output of help(adf_client.data_flows.create_or_update) showing the method's parameters]

    • As the help output above shows, the dataflow definition has to be passed as an azure.mgmt.datafactory.models.DataFlowResource; there is no parameter that accepts a path to the definition file.
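
    • While create_or_update does not take a file path, you can get the same effect by downloading the JSON definition from the storage account yourself, parsing it, and wrapping it in a DataFlowResource. The snippet below is a minimal sketch, not a verified implementation: the storage account URL, container, blob name, resource group, factory and dataflow names are placeholders, it assumes the JSON file holds the dataflow properties body (the "type"/"typeProperties" object as exported from ADF), and it assumes the service principal has read access to the blob.

    import json

    from azure.identity import ClientSecretCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient
    from azure.mgmt.datafactory.models import DataFlowResource
    from azure.storage.blob import BlobClient

    # Placeholder values -- replace with your own.
    subscription_id = '<subscription_id>'
    resource_group = '<resource_group>'
    factory_name = '<factory_name>'
    dataflow_name = 'my_dataflow'

    credentials = ClientSecretCredential(
        client_id='<client_id>',
        client_secret='<client_secret>',
        tenant_id='<tenant_id>')
    adf_client = DataFactoryManagementClient(credentials, subscription_id)

    # Download the dataflow definition JSON from the linked storage account
    # (the service principal needs a role such as Storage Blob Data Reader).
    blob = BlobClient(
        account_url='https://<storage_account>.blob.core.windows.net',
        container_name='definitions',
        blob_name='dataflows/my_dataflow.json',
        credential=credentials)
    definition = json.loads(blob.download_blob().readall())

    # Wrap the parsed JSON in a DataFlowResource; from_dict relies on the
    # "type" discriminator in the JSON (e.g. "MappingDataFlow") to pick the model.
    dataflow_resource = DataFlowResource.from_dict({'properties': definition})

    adf_client.data_flows.create_or_update(
        resource_group_name=resource_group,
        factory_name=factory_name,
        data_flow_name=dataflow_name,
        data_flow=dataflow_resource)

    The key point stays the same: the SDK only accepts the definition as a model object (or equivalent dict), so reading the file from storage is a step you have to perform yourself before calling create_or_update.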