Tags: powershell, azure-devops, yaml, azure-databricks, azure-pipelines-yaml

Conditional task in a CI/CD pipeline job not getting executed


I have the main cicd-pipeline.yaml file below, which reads the latest files from our repo and prepares them for deployment to our Dev workspace. A PowerShell task in the job gets the files modified in the current commit from git and saves them to a temp location, and a CopyFiles@2 task then pushes them to Build.ArtifactStagingDirectory so that my DeployNotebooks stage can pick them up from there and push them to the Databricks workspace. I want the CopyFiles@2 task to run only when the PowerShell task succeeds and the flag set inside it is True.

trigger:
  branches:
    include:
      - 'main'

variables:
  - group: dbw-cicd-dev
  
  - name: vmImageName
    value: "windows-latest"
  - name: notebooksPath
    value: "notebooks"
  - name: notebooksFolder
    value: false

pool:
  vmImage: $(vmImageName)

stages:
  - stage: ChangedFolder
    displayName: "Checking the Notebooks folder changes"
    jobs:
      - job: preCheck
        displayName: "Copy_Notebooks"
        steps:
          - checkout: self
            fetchDepth: 2

          - powershell: |
              $files=$(git diff HEAD HEAD~ --name-only)
              $temp=$files -split ' '
              $count=$temp.Length
              echo "Total number of changed files: $count"
              For ($i=0; $i -lt $count; $i++)
              {
                $name=$temp[$i]
                echo "Changed file is: $name"
                if ($name -like 'notebooks/*')
                {
                  $flag = 'True'
                  echo "Is it a notebooks change: $flag"
                  Write-Host "##vso[task.setvariable variable=notebooksFolder;isOutput=true]True"
                }
              }
              ##echo ${{variables.notebooksPath}}
              if (($flag -eq 'True'))
              {
                ## Now create a temp folder to hold the changed files
                New-Item -ItemType directory -Path $(system.defaultworkingdirectory)\temp

                foreach($file in $temp)
                {
                  if(Test-Path -path $file)
                  {
                    Copy-Item -Path $file -Destination $(system.defaultworkingdirectory)\temp
                    echo "File Moved: $file"
                  }
                }
              }              
            name: taskVariable

          - task: CopyFiles@2
            inputs:
              SourceFolder: '$(system.defaultworkingdirectory)\temp'
              Contents: '*.py'
              TargetFolder: '$(Build.ArtifactStagingDirectory)/notebooks'
              CleanTargetFolder: true
            condition: and(succeeded(), eq(variables.notebooksFolder, True))

  - stage: DeployNotebooks
    displayName: "Deploy to DEV Environment"
    dependsOn: ChangedFolder
    condition: |
      and( 
        succeeded('ChangedFolder'),
        eq(dependencies.changedFolder.outputs['precheck.taskVariable.notebooksFolder'], 'True')
      )
    jobs:
      - job: Deploy_Notebooks
        steps:
          - template: templates/deploy-notebooks.yml
            parameters:              
              environmentName: $(dev-environment-name)
              resourceGroupName: $(dev-resource-group-name)
              serviceConnection: $(dev-service-connection-name)
              notebooksPath: $(notebooksPath)

But whenever the pipeline runs, this task is skipped when its condition is evaluated.

[Azure DevOps pipeline screenshot showing the CopyFiles@2 task being skipped]

Could someone look into the condition block of the CopyFiles@2 task and let me know what exactly I am missing here?

But when I change the condition to OR for the CopyFiles@2 task, it runs and errors out as below:

[screenshot of the CopyFiles@2 error]


Solution

  • Regarding the last point of my question, where the path was not being recognized as the SourceFolder for the CopyFiles task: I switched to PublishBuildArtifacts@1 in my first stage and then use DownloadBuildArtifacts@1 in the following stage. I also had to identify whether the agent is self-hosted or not and use the two tasks mentioned above accordingly (a sketch of one way to do that toggle is included after the templates below).


    Adding my final code changes as well, in case someone needs to refer to them for similar issues (stages, jobs, and tasks may vary according to the situation).

    cicd-pipeline.yml

    trigger:
      branches:
        include:
          - 'main'
    
    variables:
      - group: dbw-cicd-dev
      
      - name: vmImageName
        value: "windows-latest"
      - name: notebooksPath
        value: "notebooks"
      - name: notebooksFolder
        value: false
    
    pool:
      vmImage: $(vmImageName)
    
    stages:
      - stage: ChangedFolder
        displayName: "Checking the Notebooks folder changes"
        jobs:
          - job: preCheck
            displayName: "Copy Notebooks"
            steps:
              - checkout: self
                fetchDepth: 2
    
              - powershell: |
                  $files=$(git diff HEAD HEAD~ --name-only)
                  $temp=$files -split ' '
                  $count=$temp.Length
                  echo "Total number of changed files: $count"
                  For ($i=0; $i -lt $count; $i++)
                  {
                    $name=$temp[$i]
                    echo "Changed file is: $name"
                    if ($name -like 'notebooks/*')
                    {
                      $flag = 'True'
                      echo "Is it a notebooks change: $flag"
                      Write-Host "##vso[task.setvariable variable=notebooksFolder;isOutput=true]True"
                    }
                  }
                  ##echo ${{variables.notebooksPath}}
                  if (($flag -eq 'True'))
                  {
                    ## Now create a temp folder to hold the changed files
                    New-Item -ItemType directory -Path $(system.defaultworkingdirectory)\temp
    
                    foreach($file in $temp)
                    {
                      if(Test-Path -path $file)
                      {
                        Copy-Item -Path $file -Destination $(system.defaultworkingdirectory)\temp
                        echo "File Moved: $file"
                      }
                    }
                    ## zip the temp folder which only have the changed files
                    Compress-Archive -Path $(system.defaultworkingdirectory)\temp\* -DestinationPath $(Build.ArtifactStagingDirectory)\changedfiles.zip
                  }              
                name: taskVariable
    
              - task: PublishBuildArtifacts@1
                inputs:
                  PathtoPublish: '$(Build.ArtifactStagingDirectory)'
                  ArtifactName: 'MyNotebooks'
                  publishLocation: 'Container'
                condition: and(succeeded(), eq(variables['taskVariable.notebooksFolder'], True))
    
      - stage: DeployNotebooks
        displayName: "Deploy to DEV Environment"
        dependsOn: ChangedFolder
        condition: |
          and( 
            succeeded('ChangedFolder'),
            eq(dependencies.changedFolder.outputs['precheck.taskVariable.notebooksFolder'], 'True')
          )
        jobs:
          - job: DeployNotebooks
            displayName: "Deploy Notebooks to DEV"
            steps:
              - template: templates/deploy-notebooks.yml
                parameters:              
                  environmentName: $(dev-environment-name)
                  resourceGroupName: $(dev-resource-group-name)
                  serviceConnection: $(dev-service-connection-name)
                  notebooksPath: $(notebooksPath)
    

    Kindly note that fetchDepth in the above YAML has to be set to 2; by default it is 1, and then the git diff command returns no files for the current commit.
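
    Also note how the output variable is referenced in the two conditions above. The PowerShell step sets it with isOutput=true and the step is named taskVariable, so within the same job the condition has to read variables['taskVariable.notebooksFolder'], and the downstream stage has to go through the dependencies syntax. The plain variables.notebooksFolder reference in my original CopyFiles@2 condition only ever resolved to the pipeline-level default of false, which is why that task kept getting skipped. Pulled out of the working pipeline above for reference:

    # same job: reading the output of the step named 'taskVariable'
    condition: and(succeeded(), eq(variables['taskVariable.notebooksFolder'], True))

    # downstream stage: reading it through the stage/job dependencies
    condition: |
      and(
        succeeded('ChangedFolder'),
        eq(dependencies.changedFolder.outputs['precheck.taskVariable.notebooksFolder'], 'True')
      )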

    deploy-notebooks.yml

    parameters:
      - name: environmentName
        type: string
      - name: resourceGroupName
        type: string
      - name: serviceConnection
        type: string
      - name: notebooksPath
        type: string
    
    steps:
      - checkout: self
    
      - task: DownloadBuildArtifacts@1
        displayName: "Download Build Artifacts"
        inputs:
          artifactName: 'MyNotebooks'
          downloadPath: '$(System.DefaultWorkingDirectory)'
          cleanDestinationFolder: true  # DownloadBuildArtifacts uses cleanDestinationFolder; CleanTargetFolder is a CopyFiles input
    
      - task: PowerShell@2
        displayName: "Unzip Build Artifacts" 
        inputs:
          pwsh: true
          targetType: 'inline'
          script: |
            if(Test-Path -path $(System.DefaultWorkingDirectory)\notebooks)
            {
              Remove-Item -Path $(System.DefaultWorkingDirectory)\notebooks -Recurse -Force
            }
            New-Item -ItemType directory -Path $(System.DefaultWorkingDirectory)\notebooks
            Expand-Archive -Path $(System.DefaultWorkingDirectory)\MyNotebooks\changedfiles.zip -DestinationPath $(System.DefaultWorkingDirectory)\notebooks
        condition: succeeded()
    
      - task: AzureCLI@2
        displayName: "Deploy to Databricks Workspace"
        inputs:
          azureSubscription: ${{parameters.serviceConnection}}
          scriptType: "pscore"
          scriptLocation: "inlineScript"
          inlineScript: |
            az config set extension.use_dynamic_install=yes_without_prompt
    
            $databricksWorkspace = (az resource list --resource-group ${{parameters.resourceGroupName}} --query "[?type=='Microsoft.Databricks/workspaces']" | ConvertFrom-Json)[0]
            $databricksWorkspaceInfo = (az databricks workspace show --ids $databricksWorkspace.id | ConvertFrom-Json)
    
            $bearerToken = $(Build.Repository.LocalPath)/CICD/scripts/DatabricksToken.ps1 -databricksWorkspaceResourceId $databricksWorkspaceInfo.id -databricksWorkspaceUrl $databricksWorkspaceInfo.workspaceUrl -databricksOrgId $databricksWorkspaceInfo.workspaceId
    
            Install-Module -Name azure.databricks.cicd.tools -Force -Scope CurrentUser
            Import-Module -Name azure.databricks.cicd.tools
            Import-DatabricksFolder -BearerToken $bearerToken -Region $databricksWorkspaceInfo.location -LocalPath $(System.DefaultWorkingDirectory)\${{parameters.notebooksPath}} -DatabricksPath '/live'
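
    The "self-hosted or not" check mentioned above is not part of the YAML I've posted. A minimal sketch of one way to do that toggle, assuming a manually maintained boolean parameter (isSelfHosted is a hypothetical name, Azure DevOps does not expose such a flag automatically) and assuming the same self-hosted agent runs both stages:

    parameters:
      - name: isSelfHosted   # hypothetical parameter, set per pool by hand
        type: boolean
        default: false

    steps:
      - checkout: self

      # Microsoft-hosted agents start clean, so the published artifact has to be downloaded
      - ${{ if eq(parameters.isSelfHosted, false) }}:
          - task: DownloadBuildArtifacts@1
            displayName: "Download Build Artifacts"
            inputs:
              artifactName: 'MyNotebooks'
              downloadPath: '$(System.DefaultWorkingDirectory)'

      # if the same self-hosted agent ran the build stage, the staged zip may still be on disk,
      # so it can simply be copied into place instead of downloaded
      - ${{ if eq(parameters.isSelfHosted, true) }}:
          - powershell: |
              New-Item -ItemType Directory -Force -Path '$(System.DefaultWorkingDirectory)\MyNotebooks' | Out-Null
              Copy-Item -Path '$(Build.ArtifactStagingDirectory)\changedfiles.zip' -Destination '$(System.DefaultWorkingDirectory)\MyNotebooks'
            displayName: "Reuse locally staged artifact"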