Tags: google-cloud-platform, gitlab, continuous-integration, google-cloud-build, cd

Cloud Build with GitLab at module level


I was working with GitHub and GCP (Cloud Build for deployments), and it was working well. Below are the steps:

  • Created multiple Cloud Functions, all in the same GitHub repository.
  • Created a separate Cloud Build trigger for each Cloud Function, with a separate cloudbuild.yml in each Cloud Function's folder in the repository (see the example after this list).
  • Each trigger runs only when there are changes to the respective Cloud Function's scripts.
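For context, this kind of per-function trigger with a path filter can be created from the command line. A minimal sketch, assuming placeholder repository and function names:

```bash
# Create a trigger that only fires when files under functions/function-a/ change.
# "my-org", "my-repo", and "function-a" are placeholders for illustration.
gcloud builds triggers create github \
  --name="deploy-function-a" \
  --repo-owner="my-org" \
  --repo-name="my-repo" \
  --branch-pattern="^main$" \
  --build-config="functions/function-a/cloudbuild.yml" \
  --included-files="functions/function-a/**"
```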

Now I need to integrate Cloud Build with GitLab. I have gone through the documentation and found that a webhook is the only option, and that the trigger fires on changes anywhere in the repository. That would require a separate repository for each Cloud Function or Cloud Run service; there is no option to select the repository itself, only the webhook.

Can experts guide me on how I can do this integration? We plan to keep one repository with multiple services/applications in it, and we want CI to run in the GCP environment itself.


Solution

  • Personally, I found GitLab to be the worst of GitHub, GitLab, and Bitbucket in terms of integration with GCP Cloud Build (for running deployments within GCP).

    I don't know of an ideal solution, but I have two ideas. Neither of them is good from my point of view.

    1/ Mirror the GitLab repository into a GCP repository as described in Mirroring GitLab repositories to Cloud Source Repositories. One of the biggest drawbacks from my point of view: the integration is based on personal credentials, and a person has to keep it working -

    Mirroring stops working if the Google Account is closed or loses access rights to the Git repository in Cloud Source Repositories

    Once mirroring is in place, you can probably work with the GCP-based repository in the ordinary way and trigger Cloud Build jobs as usual (see the sketch below). A separate question is how to provide deployment logs to those who initiated the deployment...
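    A minimal sketch of what that could look like, assuming the mirroring itself is configured from the GitLab side per the linked guide, and using placeholder names:

    ```bash
    # Create the Cloud Source Repository that GitLab will push-mirror into.
    # "my-mirrored-repo" is a placeholder name.
    gcloud source repos create my-mirrored-repo

    # Once the mirror is populated, per-function triggers with path filters
    # work much as they do for GitHub repositories.
    gcloud builds triggers create cloud-source-repositories \
      --name="deploy-function-a" \
      --repo="my-mirrored-repo" \
      --branch-pattern="^main$" \
      --build-config="functions/function-a/cloudbuild.yml" \
      --included-files="functions/function-a/**"
    ```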

    2/ Use webhooks. That does not depend on any personal accounts, but it is not very granular - as you mentioned, the push fires at the whole-repository level. To overcome that limitation, there might be a very tricky (inline) yaml file executed by a Cloud Build trigger. In that yaml file, we should not only fetch the code, but also parse all changes (all commits) in that push to find out which subdirectories (and thus which separate components - Cloud Functions) were potentially modified. Then, for each affected (modified) subdirectory, we can trigger (asynchronously) some other Cloud Build job (with a yaml file for it located inside that subdirectory). A sketch of that dispatcher logic follows below.
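    A minimal sketch of the shell logic such an inline build step might run. The repository URL is a placeholder, authentication is omitted, and the commit SHAs are assumed to be passed in as substitutions parsed from the GitLab push payload (the exact wiring depends on how the webhook trigger is configured):

    ```bash
    #!/usr/bin/env bash
    set -euo pipefail

    BEFORE_SHA="$1"   # commit range start, from the push payload (assumption)
    AFTER_SHA="$2"    # commit range end, from the push payload (assumption)

    # Fetch the repository state at the pushed commit (auth omitted for brevity).
    git clone "https://gitlab.com/my-group/my-repo.git" repo
    cd repo
    git checkout "$AFTER_SHA"

    # Collect the top-level directories touched by any commit in the push.
    changed_dirs=$(git diff --name-only "$BEFORE_SHA" "$AFTER_SHA" | cut -d/ -f1 | sort -u)

    # Asynchronously start one Cloud Build job per affected component,
    # using the cloudbuild.yml stored inside that component's directory.
    for dir in $changed_dirs; do
      if [ -f "$dir/cloudbuild.yml" ]; then
        gcloud builds submit "$dir" --config="$dir/cloudbuild.yml" --async
      fi
    done
    ```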

    An obvious drawback - it is not clear who should get the logs from all those deployments (and how), especially if something goes wrong, and the development (and management) of such a deployment process might be time/effort consuming and not easy.