Tags: azure-pipelines, azure-devops-self-hosted-agent

Using two self-hosted Azure pipeline agents for artifact publishing in parallel mode


I’ve got a build pipeline in Azure Pipelines containing several stages; the languages are C# on .NET Core 3.1 and .NET Framework 4.7. In the first stage, the whole solution is built and unit and integration tests are run. In the next stages, different microservices and separate parts of the API are published and uploaded to Azure as separate artifacts. All of this runs on a self-hosted build agent.

I tried to parallelize the stages that follow testing. The reasoning was that these stages do no building, only simple file copying and archiving. To do this, I ran two different build agents from the same pool on one computer, with both agents using the same local folder as their work folder. But when I ran the build pipeline, the agents started to compete for resources. I solved the conflict over the build artifacts folder, but it was not the only one. For example, both agents were trying to create/delete the same temporary files in the _temp folder, in which case one task failed with the following error:

[error]Unhandled: ENOENT: no such file or directory, open 'e:\_build\_temp\.taskkey'

Strange errors of another kind also started to occur; their text looked like the screenshot below. [Screenshot: a strange error text]

I guess that conflicts between the two agents caused these errors.

Using two different work folders for the agents does not seem like the best solution, because then one agent will not have access to the files built in the first stage.

Has anybody successfully run two Azure Pipelines agents on the same machine to process one pipeline?


Solution

  • As Daniel mentioned above, you should not use the same local folder as the work folder for both of your local agents. Each agent should be configured with its own work folder, as sketched below.
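
    For context, here is a minimal sketch of registering a second agent with its own work folder at configuration time (run from the second agent's install folder; the organization URL, PAT environment variable, agent name, and work path are illustrative placeholders, not values from the original post):

    # Register a second agent in the same pool, but with a dedicated work folder,
    # so the two agents do not compete for _work/_temp files.
    .\config.cmd --unattended `
      --url https://dev.azure.com/yourorg `
      --auth pat --token $env:AZP_TOKEN `
      --pool Default `
      --agent agent-2 `
      --work e:\_build2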

    There are workarounds to share files between different stages on local machine agents.

    For example, you can use the Publish Build Artifacts task in Stage A to publish the files that are needed in other stages to the Azure DevOps server. Then you can use the Download Build Artifacts task in Stage B to download the files into its working directory, as in the sketch below.
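
    A minimal sketch of that approach (the artifact name and paths are illustrative assumptions, not taken from the original pipeline):

    stages:
    - stage: A
      jobs:
      - job: a
        pool: Default
        steps:
        # Publish the files that later stages need as a build artifact.
        - task: PublishBuildArtifacts@1
          inputs:
            PathtoPublish: '$(System.DefaultWorkingDirectory)'
            ArtifactName: 'shared-files'
            publishLocation: 'Container'

    - stage: B
      dependsOn: A
      jobs:
      - job: b
        pool: Default
        steps:
        # Download the artifact published by Stage A into this job's directory.
        - task: DownloadBuildArtifacts@0
          inputs:
            buildType: 'current'
            downloadType: 'single'
            artifactName: 'shared-files'
            downloadPath: '$(System.ArtifactsDirectory)'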

    Update:

    Another workaround is to add a script task in Stage A that defines an output variable and sets its value to Stage A's working directory, and to make Stage B depend on Stage A.

    Then you can use the expression dependencies.<previous stage name>.outputs['<name of the job that executes task.setvariable>.<task name>.<variable name>'] to get the value of the variable defined in Stage A, which is Stage A's working directory.

    Please check the example below: Stage B can access the working directory of Stage A and copy its files out to c:\test\copyfromoriginalfolder. Note that this only works because both agents run on the same machine, so Stage B's agent can reach Stage A's work folder directly.

    stages:
    - stage: A
      jobs:
      - job: a
        pool: Default
        steps:
        - task: PowerShell@2
          inputs:
            targetType: inline
            script: |
              # isOutput=true is required so the variable is visible to other stages.
              echo "##vso[task.setvariable variable=PathA;isOutput=true]$(System.DefaultWorkingDirectory)"
          name: power1

    - stage: B
      dependsOn: A
      variables:
        # Format: dependencies.<stage>.outputs['<job>.<task name>.<variable name>']
        PathInA: $[dependencies.A.outputs['a.power1.PathA']]
      jobs:
      - job: b
        pool: Default
        steps:
        # List the files in Stage A's working directory to verify access.
        - powershell: |
            cd $(PathInA)
            ls
        - task: CopyFiles@2
          inputs:
            SourceFolder: $(PathInA)
            Contents: '**'
            TargetFolder: 'c:\test\copyfromoriginalfolder'
            CleanTargetFolder: true

    Hope the above helps!