Tags: azure, azure-blob-storage, azure-data-lake, azure-data-lake-gen2, microsoft-fabric

Ways to automate deployment to ADLS and its containers using ADO


I am trying to automate a release pipeline for simple files (just .csv for now) that should be fetched from Azure Repos and deployed to containers in Azure Data Lake Storage.

Can we do that? I have the script below, which I am not sure about.

stages:
- stage: DeployToADLS
  displayName: 'Deploy CSV to ADLS'
  jobs:
  - job: UploadFile
    displayName: 'Upload CSV to Azure Data Lake'
    pool:
      vmImage: 'ubuntu-latest'

    steps:
    - task: DownloadBuildArtifacts@0
      displayName: 'Download CSV Artifact'
      inputs:
        buildType: 'current'
        artifactName: 'csv-artifact'
        downloadPath: '$(System.ArtifactsDirectory)'

    - task: AzureCLI@2
      displayName: 'Upload CSV to ADLS'
      inputs:
        azureSubscription: 'Your-Service-Connection-Name'  # Replace with your Azure service connection
        scriptType: 'bash'
        scriptLocation: 'inlineScript'
        inlineScript: |
          # Variables
          STORAGE_ACCOUNT="yourstorageaccount"  # Replace with your ADLS storage account
          CONTAINER="your-container-name"  # Replace with your ADLS container
          LOCAL_CSV_FILE="$(System.ArtifactsDirectory)/csv-artifact/sample.csv"  # Path to downloaded CSV
          DEST_PATH="csv-files/sample.csv"  # Destination path in ADLS

          # Upload CSV file to ADLS
          az storage blob upload \
            --account-name $STORAGE_ACCOUNT \
            --container-name $CONTAINER \
            --name $DEST_PATH \
            --file $LOCAL_CSV_FILE \
            --auth-mode login \
            --overwrite

Solution

stages:
- stage: DeployToADLS
  displayName: 'Deploy CSV to ADLS'
  jobs:
  - job: UploadFile
    displayName: 'Upload CSV to Azure Data Lake'
    pool:
      vmImage: 'ubuntu-latest'

    steps:
    - task: PowerShell@2
      displayName: 'Generate CSV Artifact'
      inputs:
        targetType: 'inline'
        script: |
          $person1 = @{
              Name = 'John Smith'
              Number = 1
          }

          $person2 = @{
              Name = 'Jane Smith'
              Number = 2
          }

          $allPeople = $person1, $person2
          $allPeople | Export-Csv -Path $(Pipeline.Workspace)/sample.csv

    - task: PublishPipelineArtifact@1
      displayName: 'Publish CSV Artifact to tmp'
      inputs:
        targetPath: $(Pipeline.Workspace)/sample.csv
        artifactName: drop

    - task: DownloadPipelineArtifact@2
      displayName: 'Download CSV Artifact from tmp'
      inputs:
        artifactName: drop
        targetPath: $(System.DefaultWorkingDirectory)/csv-artifact

    - task: Bash@3
      displayName: 'Debug --- List file in download path'
      inputs:
        targetType: 'inline'
        script: |
          ls -l $(System.DefaultWorkingDirectory)/csv-artifact

    - task: AzureCLI@2
      displayName: 'Upload CSV to ADLS'
      inputs:
        azureSubscription: 'DevOpsSub1Connection-Test'
        scriptType: 'pscore'
        scriptLocation: 'inlineScript'
        inlineScript: |
          # Variables
          $STORAGE_ACCOUNT = "XXXX"
          $CONTAINER = "XXXX"
          $LOCAL_CSV_FILE = "$(System.DefaultWorkingDirectory)/csv-artifact/sample.csv"
          $NAME = "sample.csv"

          # Upload CSV file to ADLS
          az storage blob upload `
            --account-name $STORAGE_ACCOUNT `
            --container-name $CONTAINER `
            --name $NAME `
            --file $LOCAL_CSV_FILE `
            --auth-mode login `
            --overwrite

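A note on the upload command: if the storage account has the hierarchical namespace enabled (ADLS Gen2), the same copy can also go through the Data Lake endpoint with az storage fs file upload. A minimal sketch, reusing the placeholder variables from the inline script above:

# --file-system is the container (filesystem), --path the destination inside it
az storage fs file upload `
  --account-name $STORAGE_ACCOUNT `
  --file-system $CONTAINER `
  --path $NAME `
  --source $LOCAL_CSV_FILE `
  --auth-mode login `
  --overwrite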

Result: the sample.csv file shows up in the target container of the storage account (screenshot omitted).
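The PowerShell@2 step above only generates a throwaway CSV so the pipeline is self-contained. Since the question is about files that already live in Azure Repos, a minimal variation (assuming the file sits at data/sample.csv in the repository and reusing the placeholder names from the question) could skip the artifact round trip and upload straight from the checked-out sources:

steps:
- checkout: self  # clone the Azure Repos repository that contains the CSV

- task: AzureCLI@2
  displayName: 'Upload repo CSV to ADLS'
  inputs:
    azureSubscription: 'Your-Service-Connection-Name'  # replace with your service connection
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      # data/sample.csv is a hypothetical path inside the repository
      az storage blob upload \
        --account-name yourstorageaccount \
        --container-name your-container-name \
        --name csv-files/sample.csv \
        --file "$(Build.SourcesDirectory)/data/sample.csv" \
        --auth-mode login \
        --overwrite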


Please make sure the identity behind your Azure pipeline service connection has permission to upload files to the storage account.
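With --auth-mode login the upload runs as the service connection's identity, so that identity needs a data-plane role such as Storage Blob Data Contributor on the storage account. As a sketch, the role could be granted with the Azure CLI; the object ID, subscription, resource group, and account name below are placeholders:

# Grant the service principal data-plane access to the storage account
az role assignment create \
  --assignee "<service-principal-object-id>" \
  --role "Storage Blob Data Contributor" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"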