I am using the Azure Python SDK for ADF, and I am looking to create a data flow dynamically.
I have data flow definitions in JSON format available in a linked storage account. Is there a way I can refer to a definition in the storage account and create a data flow, like below?
adf_client.data_flows.create_or_update(location = 'path_to_dataflow_json')
No, you cannot refer to the path of the JSON file directly. The definition itself has to be supplied in order to create or update the data flow.
When I install the required packages and use Python's help() on adf_client.data_flows.create_or_update, you can see what arguments it accepts:
from azure.identity import ClientSecretCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

credentials = ClientSecretCredential(client_id='<client_id>', client_secret='<client_secret>', tenant_id='<tenant_id>')
adf_client = DataFactoryManagementClient(credentials, '<subscription_id>')
help(adf_client.data_flows.create_or_update)
The help output shows that the data_flow argument must be an azure.mgmt.datafactory.models.DataFlowResource object; there is no parameter that accepts a path to the definition file.