Our team is creating a CI/CD component project for running common deployment tasks. In this scenario, we have a component called terraform
that inits and applies terraform code.
Here's what our main pipeline code looks like.
# .gitlab-ci.yml
stages:
  - deploy

deploy-environment-alpha:
  stage: deploy
  include:
    - local: path/to/deploy.yml

# Trigger downstream pipeline with component(s)
trigger-downstream-pipeline:
  stage: triggers
  trigger:
    include:
      - local: path/to/deploy.yml
    strategy: depend

#### Additional CD triggers ...
Given the contents below for path/to/deploy.yml, the pipeline works as expected: it triggers the child pipeline and imports the component.
# path/to/deploy.yml
include:
  - component: /path/to/terraform@1.0.0
    inputs:
      job-name: deploy-alpha
      job-stage: deploy
      foo: alpha
Now, I want to update the deploy.yml to include two components.
# path/to/deploy.yml
include:
  - component: /path/to/terraform@1.0.0
    inputs:
      job-name: deploy-alpha
      job-stage: deploy
      foo: alpha

# path/to/deploy.yml
include:
  - component: /path/to/terraform@1.0.0
    inputs:
      job-name: deploy-beta
      job-stage: deploy
      foo: beta
For some reason, GitLab only shows the deploy-alpha job. Any idea why it is ignoring deploy-beta?
The include keyword is global and as such does not belong to any job. To be honest, I'm surprised that your actual setup works at all, as the include keyword is not supposed to be used within a job, as the docs specify:

Keyword type: Global keyword.
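As for why deploy-beta disappears: your updated deploy.yml repeats the top-level include: key in the same YAML document. Duplicate mapping keys are not merged in YAML, so only one of the two include lists is honored and the other is silently dropped. The direct fix is a single include list with one entry per component reference — a sketch reusing the inputs from your question (this assumes your component's spec actually declares job-name, job-stage, and foo, and uses job-name to name the generated job; if both includes produce a job with the same name, the later definition overrides the earlier one):

```yaml
# path/to/deploy.yml
include:
  - component: /path/to/terraform@1.0.0
    inputs:
      job-name: deploy-alpha
      job-stage: deploy
      foo: alpha
  - component: /path/to/terraform@1.0.0
    inputs:
      job-name: deploy-beta
      job-stage: deploy
      foo: beta
```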
That said, I think that what you want to achieve can be done in a different way. Given a sample component:
spec:
  inputs:
    foo:
      default: alpha
---
# Interpolate the input into the job name so that including the
# component twice yields two distinct jobs instead of the second
# definition overriding the first.
deploy-environment-$[[ inputs.foo ]]:
  stage: $[[ inputs.foo ]]
  script: echo "Deploy $[[ inputs.foo ]]"
Then you can reference it multiple times like this in your YAML file:
stages: [alpha, beta]

include:
  - component: /path/to/terraform@1.0.0
    inputs:
      foo: alpha
  - component: /path/to/terraform@1.0.0
    inputs:
      foo: beta
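Assuming the component interpolates foo into the job name as well (e.g. a job key like deploy-environment-$[[ inputs.foo ]]), the two includes expand to two distinct jobs, roughly:

```yaml
# Effective configuration after input interpolation (sketch)
stages: [alpha, beta]

deploy-environment-alpha:
  stage: alpha
  script: echo "Deploy alpha"

deploy-environment-beta:
  stage: beta
  script: echo "Deploy beta"
```

If the component instead hardcodes its job names, the second include's job definitions override the first, which is worth checking with GitLab's CI Lint tool before merging.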