Tags: git, gitlab, gitlab-ci, gitlab-ci-runner, git-lfs

Storing large build files for GitLab CI


I'm working on a project that requires custom builds of large libraries.

Ordinarily, these library dependencies would be uploaded to, say, Artifactory and named something like QT6.static.tar.gz. When a GitLab CI/CD pipeline starts, it pulls down these dependencies and starts the build.

I'm now in a situation where my only tool is GitLab. Artifacts in GitLab appear to be inextricably linked to the pipelines that created them.

I would like my CI workflow to be as follows:

  1. Git commit triggers pipeline with a variable that denotes which artifacts to use.
  2. If the specified artifacts don't exist, start a job that creates them, then export the artifacts.
  3. Compile the committed code changes against the artifacts as normal.

I can somewhat replicate this by using the container registry and Docker layer caching, but it seems a bit contrived. Alternatively, these built files could be stored in Git LFS, but I don't like that solution either.


Solution

  • If you want to use GitLab as a drop-in replacement for Artifactory here, you can use the Generic Package Registry instead of GitLab CI artifacts, which, as you noted, are mostly meant to be relevant only within the pipeline that creates them (or child pipelines that depend on them).

    So, by using the generic package registry instead, you can upload your dependencies to a package registry and be sure they'll always be available, irrespective of the pipeline (or even the project) that is running -- essentially the same workflow you described with Artifactory.
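
    Here's a minimal .gitlab-ci.yml sketch of that approach. The package name qt6-static, the DEP_VERSION and REBUILD_DEPS variables, and the build commands are placeholders for your own setup; the upload/download URLs and the JOB-TOKEN header follow GitLab's generic packages API:

        stages:
          - deps
          - build

        variables:
          # DEP_VERSION is the pipeline variable from step 1 that selects which artifacts to use.
          DEP_PKG_URL: "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/packages/generic/qt6-static/${DEP_VERSION}/QT6.static.tar.gz"

        build-deps:
          stage: deps
          rules:
            # Only rebuild the dependency archive when explicitly requested.
            - if: '$REBUILD_DEPS == "true"'
          script:
            # Placeholder for the real library build.
            - ./scripts/build_qt_static.sh
            - tar czf QT6.static.tar.gz qt6-install/
            # Publish the archive to the project's generic package registry.
            - 'curl --fail --header "JOB-TOKEN: ${CI_JOB_TOKEN}" --upload-file QT6.static.tar.gz "${DEP_PKG_URL}"'

        compile:
          stage: build
          script:
            # Pull the prebuilt dependencies, then build the commit as normal.
            - 'curl --fail --header "JOB-TOKEN: ${CI_JOB_TOKEN}" --output QT6.static.tar.gz "${DEP_PKG_URL}"'
            - tar xzf QT6.static.tar.gz
            - make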

    Keep in mind that there is a default 5 GB file size limit. This limit can be configured on self-hosted GitLab instances, but not on gitlab.com -- if your archives are larger than that, you'll either need to raise the limit (if self-hosted) or split your archive into multiple parts.
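
    If you do need to split, here's a rough sketch of the upload side (assuming GNU split is available in the job image, and reusing the same hypothetical package coordinates as above):

        upload-deps-split:
          stage: deps
          script:
            # Cut the archive into parts that stay below the 5 GB per-file limit.
            - split --bytes=4G --numeric-suffixes QT6.static.tar.gz QT6.static.tar.gz.part
            # Upload each part as its own file under the same package version.
            - |
              for part in QT6.static.tar.gz.part*; do
                curl --fail --header "JOB-TOKEN: ${CI_JOB_TOKEN}" --upload-file "${part}" \
                  "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/packages/generic/qt6-static/${DEP_VERSION}/${part}"
              done

    The consuming job then downloads every part and reassembles the archive with something like cat QT6.static.tar.gz.part* > QT6.static.tar.gz before extracting it.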