I’m looking for the proper, or at least an available, way to handle artifact storage and retrieval between pipelines.
I’m using GitLab CI/CD as my Terraform planning and deployment method. I have two stages: plan and apply.
In the plan stage, it creates a plan file per project directory, and at the end it stores the artifacts from each project directory: the plan files that were created, plus a file containing a list of the directories that were run.
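For reference, the plan job looks roughly like this (a simplified sketch, not my exact config; the directory discovery, file names, and artifact expiry are illustrative):

```yaml
terraform_plan:
  stage: plan
  script:
    - |
      # Illustrative discovery: plan every directory that contains Terraform config.
      for dir in $(find . -name '*.tf' -not -path '*/.terraform/*' -printf '%h\n' | sort -u); do
        terraform -chdir="$dir" init -input=false
        terraform -chdir="$dir" plan -out=plan.tfplan
        echo "$dir" >> planned_dirs.txt
      done
  artifacts:
    paths:
      - "**/plan.tfplan"   # the per-directory plan files
      - planned_dirs.txt   # the list of directories that were run
    expire_in: 1 week
```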
In order for users to apply the changes, they have to submit a merge request, and once approved, it gets merged into the main branch.
This runs the apply stage, where ideally I want it to pull down the plan artifact and apply it. Except that, because it’s now a pipeline running on the main branch, there’s no artifact for it to pull down. What I’ve been trying is pulling the artifact down through the API:
curl -k --location --header "PRIVATE-TOKEN:super_secret_token" "https://gitlab.my_domain.tld/api/v4/projects/2/jobs/artifacts/my_source_branch_name/download?job=terraform_plan" -o artifact.zip
I’m replacing my_source_branch_name with $CI_MERGE_REQUEST_SOURCE_BRANCH_NAME from the predefined variables. The problem is that if the source branch gets deleted, which is an option in the merge request, the artifact can no longer be pulled down.
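In other words, the apply job ends up doing something along these lines (a sketch; the project ID, domain, and job name are the same placeholders as in the curl example above, and API_TOKEN stands for a CI/CD variable holding the token):

```yaml
terraform_apply:
  stage: apply
  script:
    # Fetch the plan artifacts produced on the merge request's source branch.
    # This is the part that breaks once that branch has been deleted.
    - >
      curl -k --location --header "PRIVATE-TOKEN:${API_TOKEN}"
      "https://gitlab.my_domain.tld/api/v4/projects/2/jobs/artifacts/${CI_MERGE_REQUEST_SOURCE_BRANCH_NAME}/download?job=terraform_plan"
      -o artifact.zip
    - unzip artifact.zip
```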
So is my only bet something like external artifact storage, where I can use the branch name as part of the folder structure? Or is there something else I can do with GitLab?
Or I might just be off my rocker with these ideas.
Just to confirm, this was my answer, prompted by this comment:
Also, is generic_packages an option to use for you ? – Nicolas Pepinster
When using generic packages, I used a naming scheme that included the branch name, with the pipeline ID as the version number (to keep it unique). Then in the apply stage, the pipeline can curl the package registry, sort the results by version number, and pull down the “latest” package (without it actually being tagged latest).
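Roughly, the upload and download ended up looking like this (a sketch under a few assumptions: API_TOKEN is a CI/CD variable holding the token, tfplan-<branch> is an illustrative package naming scheme, jq is available in the job image, and BRANCH_SLUG stands for however the apply pipeline determines the merged branch’s name):

```yaml
# Plan stage: bundle the plans and publish them as a generic package.
# The package name carries the branch; the version is the pipeline ID (unique, numerically sortable).
terraform_plan:
  stage: plan
  script:
    - tar czf plans.tar.gz planned_dirs.txt $(find . -name plan.tfplan)
    - >
      curl --header "PRIVATE-TOKEN: ${API_TOKEN}" --upload-file plans.tar.gz
      "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/packages/generic/tfplan-${CI_COMMIT_REF_SLUG}/${CI_PIPELINE_ID}/plans.tar.gz"

# Apply stage: list the package versions, sort numerically, take the highest pipeline ID
# ("latest" without it actually being tagged latest), and download it.
terraform_apply:
  stage: apply
  script:
    - >
      VERSION=$(curl --header "PRIVATE-TOKEN: ${API_TOKEN}"
      "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/packages?package_type=generic&package_name=tfplan-${BRANCH_SLUG}"
      | jq -r '.[].version' | sort -n | tail -1)
    - >
      curl --header "PRIVATE-TOKEN: ${API_TOKEN}"
      "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/packages/generic/tfplan-${BRANCH_SLUG}/${VERSION}/plans.tar.gz"
      -o plans.tar.gz
    - tar xzf plans.tar.gz
```

Since the packages live in the project’s package registry rather than as job artifacts, they survive the source branch being deleted, and using the pipeline ID as the version means a numeric sort always finds the most recent plan bundle.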