Tags: azure, continuous-integration, continuous-deployment, azure-data-factory

How to update ADF pipeline-level parameters during CI/CD


Being new to ADF CI/CD, I am currently exploring how to update pipeline-scoped parameters when deploying a pipeline from one environment to another. Here is the detailed scenario:
I have a simple ADF pipeline with a copy activity that moves files from one blob container to another. The pipeline has two parameters, each with a default value:

1. SourceBlobContainer
2. SinkBlobContainer
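In the pipeline's JSON definition, these parameters would look roughly like the sketch below (the pipeline name and container names are illustrative, not taken from the actual factory):

```json
{
  "name": "CopyBlobPipeline",
  "properties": {
    "parameters": {
      "SourceBlobContainer": { "type": "String", "defaultValue": "devsourceblob" },
      "SinkBlobContainer":   { "type": "String", "defaultValue": "devsinkblob" }
    }
  }
}
```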


Here is how the dataset is configured to consume these pipeline-scoped parameters.

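A typical setup passes the pipeline parameters down to dataset parameters through dynamic-content expressions in the copy activity. A sketch of what that wiring looks like in the activity JSON (the dataset name `SourceBlobDataset` and its parameter name `containerName` are assumptions):

```json
{
  "inputs": [
    {
      "referenceName": "SourceBlobDataset",
      "type": "DatasetReference",
      "parameters": {
        "containerName": "@pipeline().parameters.SourceBlobContainer"
      }
    }
  ]
}
```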

Since this is the development environment, the default values are fine. But the test environment will have containers with entirely different names (like "TestSourceBlob" and "TestSinkBlob").
That said, the CI/CD process should handle this by updating the default values of these parameters during deployment.
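For the standard ARM-template-based CI/CD flow, pipeline parameter defaults can be exposed as ARM template parameters by placing a custom `arm-template-parameters-definition.json` in the repository root. The snippet below is a sketch of that mechanism (the `"*"` wildcard parameterizes all pipeline parameters, and `"="` keeps the current value as the default); the exposed values can then be overridden per environment in the release pipeline:

```json
{
  "Microsoft.DataFactory/factories/pipelines": {
    "properties": {
      "parameters": {
        "*": { "defaultValue": "=" }
      }
    }
  }
}
```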

Reading through the documentation, I could not find anything that covers this use case.


Solution

  • There is another approach, as opposed to the ARM templates located in the adf_publish branch. Many companies use that workaround and it works great.
    I spent several days building a brand-new PowerShell module that publishes the whole Azure Data Factory code from your master branch, or directly from your local machine. The module addresses the pain points that existed in other solutions, including:

    • replacing any property in a JSON file (ADF object),
    • deploying objects in the appropriate order,
    • deploying only a subset of objects,
    • deleting objects that no longer exist in the source,
    • stopping/starting triggers, etc.
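    The first capability, replacing a property in an ADF object's JSON before deployment, can be sketched in a few lines of Python. This is only an illustration of the idea (the module itself is PowerShell, and the dot-notation property path used here is a hypothetical convention):

    ```python
    import json

    def set_property(obj: dict, path: str, value) -> dict:
        """Set a nested property in a parsed ADF JSON object.

        `path` uses dot notation, e.g.
        'properties.parameters.SourceBlobContainer.defaultValue'.
        """
        keys = path.split(".")
        node = obj
        for key in keys[:-1]:       # walk down to the parent of the target
            node = node[key]
        node[keys[-1]] = value      # overwrite the leaf value
        return obj

    # Example: override the default container names for the test stage
    pipeline = {
        "name": "CopyBlobPipeline",
        "properties": {
            "parameters": {
                "SourceBlobContainer": {"type": "String", "defaultValue": "devsourceblob"},
                "SinkBlobContainer": {"type": "String", "defaultValue": "devsinkblob"},
            }
        },
    }
    set_property(pipeline, "properties.parameters.SourceBlobContainer.defaultValue", "TestSourceBlob")
    set_property(pipeline, "properties.parameters.SinkBlobContainer.defaultValue", "TestSinkBlob")
    print(json.dumps(pipeline["properties"]["parameters"], indent=2))
    ```

    A deployment tool would apply such replacements to each JSON file in the repository before pushing the objects to the target factory.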

    The module is publicly available in the PowerShell Gallery: azure.datafactory.tools
    Source code and full documentation are on GitHub.
    Let me know if you have any questions or concerns.