Tags: jenkins, shared-libraries, jenkins-pipeline, pipeline, gitlab-ci

Share gitlab-ci.yml between projects


We are thinking of moving our CI from Jenkins to GitLab. We have several projects that share the same build workflow. Right now we use a shared library where the pipelines are defined, and the Jenkinsfile inside each project only calls a method from the shared library that defines the actual pipeline. So changes only have to be made in a single place and affect several projects.

I am wondering if the same is possible with GitLab CI? As far as I have found out, it is not possible to define the .gitlab-ci.yml outside the repository. Is there another way to define a pipeline and share this config with several projects to simplify maintenance?


Solution

  • First let me start by saying: thank you for asking this question! It prompted me to search for a solution (again), after having often wondered myself whether this was even possible. We also have about 20-30 projects that are nearly identical, each with a .gitlab-ci.yml of roughly 400-500 lines that has to be changed whenever a single thing changes.

    So I found a working solution:

    Inspired by the Auto DevOps .gitlab-ci.yml template that GitLab itself created, in which one template job defines all the shared functions and every job's before_script loads them, I came up with the following setup.
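
    That pattern comes down to something like the rough sketch below (simplified, with the anchor name and function taken from the script further down rather than from the actual Auto DevOps template): a hidden job holds all shell functions in one multi-line string behind a YAML anchor, and every job's before_script dereferences that anchor to load them.

    .shared_functions: &shared_functions |
      # All shared shell functions live in this one multi-line string
      function list_files {
        ls -lah
      }
    
    before_script:
      # Dereferencing the anchor defines the functions in the job's shell
      - *shared_functions
    
    job1:
      script:
        - list_files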

    Files

    So, using a shared CI functions script:

    #!/bin/bash
    
    # List all files in the current working directory
    function list_files {
      ls -lah
    }
    
    # Print basic information about the current job, runner and pipeline
    function current_job_info {
      echo "Running job $CI_JOB_ID on runner $CI_RUNNER_ID ($CI_RUNNER_DESCRIPTION) for pipeline $CI_PIPELINE_ID"
    }
    

    A common and generic .gitlab-ci.yml:

    image: ubuntu:latest
    
    before_script:
      # Install curl
      - apt-get update -qqq && apt-get install -qqqy curl
      # Get shared functions script
      - curl -s -o functions.sh https://gitlab.com/giix/demo-shared-ci-functions/raw/master/functions.sh
      # Set permissions
      - chmod +x functions.sh
      # Source the script to load the functions into the current shell
      - . ./functions.sh
    
    job1:
      script:
        - current_job_info
        - list_files
    

    You could copy-paste your file from project-1 to project-2 and it would use the same shared GitLab CI functions.

    These examples are deliberately verbose for demonstration purposes; optimize them any way you like.

    Lessons learned

    So after applying the setup above at a larger scale (40+ projects), I want to share some lessons learned so you don't have to find them out the hard way:

    • Version (tag/release) your shared CI functions script, because a single change can now break every pipeline that uses it (see the sketch after this list).
    • Using different Docker images can cause issues, because bash is required to load the functions (e.g. I use some Alpine-based images for CLI-tool jobs, and those only ship with sh by default).
    • Use project-level CI/CD secret variables to personalize build jobs per project, e.g. environment URLs (also shown in the sketch below).
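
    To illustrate the first and last points, the before_script can pin the functions script to a tagged release instead of master, and shared jobs can read project-level CI/CD variables. A rough sketch (the v1.0.0 tag and the ENVIRONMENT_URL variable are made-up placeholders, not part of the demo project):

    image: ubuntu:latest
    
    before_script:
      - apt-get update -qqq && apt-get install -qqqy curl
      # Pin the shared functions to a tag so later changes to the script
      # cannot silently break this pipeline
      - curl -s -o functions.sh https://gitlab.com/giix/demo-shared-ci-functions/raw/v1.0.0/functions.sh
      - chmod +x functions.sh
      - . ./functions.sh
    
    smoke_test:
      script:
        - current_job_info
        # ENVIRONMENT_URL is a project-level CI/CD variable, so this same job
        # definition can point at a different URL in every project
        - curl -sfI "$ENVIRONMENT_URL"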