How do I refresh environment variables for later steps in the same build when an earlier step indirectly changes those variables?
Here is a section of the test YAML file that reproduces the described behavior:
jobs:
- job: welldone
  pool:
    name: noodle
  steps:
  - script: |
      echo select TestStand 2016
      start /wait "" "C:\Program Files (x86)\National Instruments\Shared\TestStand Version Selector\TSVerSelect.exe" /version 16.0 /installing /noprompt
    displayName: 'select TestStand version 16'
  - script: |
      echo Check TestStand version
      echo %TestStand%
      call RefreshEnv.cmd
      echo %TestStand%
    displayName: 'print TestStand version'
  - script: |
      call checkTSversion.bat
      call RefreshEnv.cmd
      call checkTSversion.bat
    displayName: 'call bat file to print TestStand version'
The first script calls the TestStand Version Selector, which, among other things, changes environment variables.
The second script prints the environment variable %TestStand%, then calls RefreshEnv.cmd and prints it again. The first echo prints the old value, the second the updated one. This is consistent with the expected behavior of cmd, I suppose.
The third script does the same, but with the echo %TestStand% moved into a separate batch file. It behaves exactly like the second script.
What can I do in the first script to make sure that subsequent scripts read the updated environment variables?
I'm not entirely sure that I understand your sample pipeline, but it seems to me like you want to set variables and/or change variable values in a job step and then use the new value in a later job step. Correct? If so, you're looking for something like this:
steps:
# Create a variable.
# Note that this does _not_ update the environment of the current script.
- bash: |
    echo "##vso[task.setvariable variable=sauce]crushed tomatoes"
# An environment variable called `SAUCE` has been added to all downstream steps.
- bash: |
    echo "my environment variable is $SAUCE"
- pwsh: |
    Write-Host "my environment variable is $env:SAUCE"
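In your case the new value comes from TSVerSelect rather than from the script itself, so one way to apply this (a sketch, assuming RefreshEnv.cmd is reachable on the agent as it is in your later steps, and that TSVerSelect updates the TestStand variable in the registry) is to refresh the environment inside the first step and then re-emit the updated value with the same `task.setvariable` logging command, so every downstream step sees it:

```yaml
steps:
- script: |
    echo select TestStand 2016
    start /wait "" "C:\Program Files (x86)\National Instruments\Shared\TestStand Version Selector\TSVerSelect.exe" /version 16.0 /installing /noprompt
    rem Pull the registry changes made by TSVerSelect into this cmd session...
    call RefreshEnv.cmd
    rem ...then publish the refreshed value as a pipeline variable for all later steps.
    echo ##vso[task.setvariable variable=TestStand]%TestStand%
  displayName: 'select TestStand version 16 and propagate the new value'
```

Later `script` steps can then read %TestStand% directly, without calling RefreshEnv.cmd themselves, because the agent injects pipeline variables into each step's environment.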