I am interested in the following question: is it possible to skip a job when the later job that depends on it will not be executed?
For example, a normal pipeline prepares dependencies and then runs tests with them:
A (prepare) -> B (test)
But sometimes job B does not start because of its rules, so the pipeline looks like this:
A (prepare)
which is completely useless, and I would like to prevent this situation.
Real example (shortened):
```yaml
stages:
  - prepare
  - test

default:
  image: node:20-alpine
  tags: [ devops ]

install:
  stage: prepare
  script:
    - npm install

yamllint:
  stage: test
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"
      changes:
        paths:
          - ./*/**/*.{yml,yaml}
          - .yaml-lint.json
          - package.json
  script:
    - npm run lint
```
As you can see, if there are no changes to any yml/yaml files or to the configuration files, the yamllint job will not be executed.
So how can I make it so that, in this situation, the install job does not run either?
I can't fully accept @Ragnu's suggestion, because the repository settings require "Pipelines must succeed". This leads to a situation where, if there are no jobs to run, no pipeline is created at all and GitLab "checks the pipeline status" forever.
Therefore, I additionally implemented a stub job that runs when the main jobs are not started.
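The stub relies on how GitLab evaluates `rules`: they are checked top to bottom and the first matching rule wins, so a matching `when: never` rule excludes the job, while a later fallback rule includes it in every other merge request pipeline. A minimal sketch of this "inverse rules" pattern (the job name and glob here are illustrative, not from the project):

```yaml
fallback-job:
  rules:
    # 1st rule: if the interesting files changed, the real jobs will run,
    # so explicitly exclude this job.
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"
      changes:
        - "**/*.yml"
      when: never
    # 2nd rule: any other merge request pipeline — include this job
    # (implicit when: on_success).
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"
  script:
    - echo "Nothing else to do, succeeding so a pipeline exists."
```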
The resulting full .gitlab-ci.yml configuration:
```yaml
stages:
  - prepare
  - test

default:
  image: node:20-alpine
  tags: [ devops ]

stub:
  stage: prepare
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"
      changes:
        paths:
          - ./*/**/*.{yml,yaml}
          - .yaml-lint.json
          - package.json
      # Attention to the next line: a match here EXCLUDES the stub
      when: never
    # Fallback rule: without it, no rule would ever match and the stub
    # would never be created; this adds it to all other MR pipelines
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"
  script:
    - echo "There are no tasks to run."

install:
  stage: prepare
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"
      changes:
        paths:
          - ./*/**/*.{yml,yaml}
          - .yaml-lint.json
          - package.json
  script:
    - npm install --save-dev

yamllint:
  stage: test
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"
      changes:
        paths:
          - ./*/**/*.{yml,yaml}
          - .yaml-lint.json
          - package.json
  script:
    - npm run lint
```
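Since the same `rules` block is now repeated in several jobs, it can be factored out with GitLab's `!reference` tag (a hidden job holds the shared rules; the hidden-job name below is my own choice). A sketch for one job, assuming the rest of the file stays as above:

```yaml
# Hidden job (leading dot) that only carries the shared rules.
.yaml-change-rules:
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"
      changes:
        paths:
          - ./*/**/*.{yml,yaml}
          - .yaml-lint.json
          - package.json

install:
  stage: prepare
  # Pull the rules array in from the hidden job.
  rules: !reference [.yaml-change-rules, rules]
  script:
    - npm install --save-dev
```

The stub job still needs its own copy, because its first rule additionally carries `when: never` and is followed by the fallback rule.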