google-cloud-platform, google-cloud-functions, google-cloud-build, build-triggers, google-cloud-repository

How to deploy only the Cloud Functions that were newly pushed, using Google Cloud Build and Cloud Source Repositories?


I have a project folder with different Cloud Function folders, e.g.:

Project_Folder
    -Cloud-Function-Folder1
         -main.py
         -requirements.txt
         -cloudbuild.yaml
    -Cloud-Function-Folder2
         -main.py
         -requirements.txt
         -cloudbuild.yaml
    -Cloud-Function-Folder3
         -main.py
         -requirements.txt
         -cloudbuild.yaml
            --------- and so on!

Right now, I push the code from each Cloud Function folder to Source Repositories one by one (a separate repo for each function folder). Each repo has a trigger enabled that starts Cloud Build, which then deploys the function. The cloudbuild.yaml file I have looks like this:

 steps:

 - name: 'python:3.7'
   entrypoint: 'bash'
   args:
     - '-c'
     - |
         pip3 install -r requirements.txt
         pytest

 - name: 'gcr.io/cloud-builders/gcloud'
   args:
     - functions
     - deploy
     - Function
     - --runtime=python37
     - --source=.
     - --entry-point=function_main
     - --trigger-topic=Function
     - --region=europe-west3
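
For reference, a trigger like the one described above can also be created from the command line. A minimal sketch, assuming a Cloud Source Repositories repo named Cloud-Function-Repo1 and builds on the master branch (the repo name and branch pattern are placeholders, not values from the question):

 # Hypothetical example: create a trigger that runs this cloudbuild.yaml on each push to master
 gcloud beta builds triggers create cloud-source-repositories \
   --repo=Cloud-Function-Repo1 \
   --branch-pattern="^master$" \
   --build-config=cloudbuild.yaml

The same kind of trigger applies later if you switch to a single repo; you would then point --build-config at the root cloudbuild.yaml.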

Now, what I would like to do is have a single source repo, and whenever I change the code of one Cloud Function and push it, only that function gets deployed while the rest stay as they are.


Update

I have also tried the layout below, but it deploys all the functions at the same time even though I am only working on a single function.

Project_Folder
    -Cloud-Function-Folder1
         -main.py
         -requirements.txt
    -Cloud-Function-Folder2
         -main.py
         -requirements.txt
    -Cloud-Function-Folder3
         -main.py
         -requirements.txt
    -cloudbuild.yaml
    -requirements.txt

The cloudbuild.yaml file then looks like this:

 steps:

 - name: 'python:3.7'
   entrypoint: 'bash'
   args:
     - '-c'
     - |
         pip3 install -r requirements.txt
         pytest

 - name: 'gcr.io/cloud-builders/gcloud'
   args:
     - functions
     - deploy
     - Function1
     - --runtime=python37
     - --source=./Cloud-Function-Folder1
     - --entry-point=function1_main
     - --trigger-topic=Function1
     - --region=europe-west3

 - name: 'gcr.io/cloud-builders/gcloud'
   args:
     - functions
     - deploy
     - Function2
     - --runtime=python37
     - --source=./Cloud-Function-Folder2
     - --entry-point=function2_main
     - --trigger-topic=Function2
     - --region=europe-west3

Solution

  • It's more complex, and you have to work within the limits and constraints of Cloud Build.

    I do this:

    • get the directories updated since the previous commit
    • loop over these directories and do what I want

    Hypothesis 1: all the subfolders are deployed by using the same commands

    So, for this I put a cloudbuild.yaml at the root of my directory, and not in the subfolders

    steps:
    - name: 'gcr.io/cloud-builders/git'
      entrypoint: /bin/bash
      args:
        - -c
        - |
            # Cloud Build doesn't check out the .git directory, so clone the repo to get it
            git clone --branch $BRANCH_NAME https://github.com/guillaumeblaquiere/cloudbuildtest.git /tmp/repo ;
            # Keep only the .git directory (move it into the workspace)
            mv /tmp/repo/.git .
            # Diff this commit against the previous one and store the changed top-level directories in a file
            git diff --name-only --diff-filter=AMDR @~..@ | grep "/" | cut -d"/" -f1 | uniq > /workspace/diff
    
    # Do what you want, by looping over the changed directories
    - name: 'python:3.7'
      entrypoint: /bin/bash
      args:
        - -c
        - |
           for i in $$(cat /workspace/diff); do
           cd $$i
               # No strong isolation between each function, take care of conflicts!!
               pip3 install -r requirements.txt
               pytest
           cd ..
           done
    
    - name: 'gcr.io/cloud-builders/gcloud'
      entrypoint: /bin/bash
      args:
        - -c
        - |
           for i in $$(cat /workspace/diff); do
           cd $$i
               gcloud functions deploy .........           
           cd ..
           done
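
    The deploy command is left as a placeholder above. For illustration only, the body of that loop could look like the snippet below, assuming the convention that each changed folder name doubles as the function name and Pub/Sub topic, and that each main.py exposes an entry point named main (all assumptions about your setup, not values from the question). Note that $$ is Cloud Build's escaping for a literal $ inside the config:

        # Hypothetical fill-in for the placeholder deploy command
        gcloud functions deploy "$$i" \
          --runtime=python37 \
          --source=. \
          --entry-point=main \
          --trigger-topic="$$i" \
          --region=europe-west3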
    

    Hypothesis 2: the deployment is specific to each subfolder

    So, for this I put a cloudbuild.yaml at the root of my directory, and another one in each subfolder

    steps:
    - name: 'gcr.io/cloud-builders/git'
      entrypoint: /bin/bash
      args:
        - -c
        - |
            # Cloud Build doesn't check out the .git directory, so clone the repo to get it
            git clone --branch $BRANCH_NAME https://github.com/guillaumeblaquiere/cloudbuildtest.git /tmp/repo ;
            # Keep only the .git directory (move it into the workspace)
            mv /tmp/repo/.git .
            # Diff this commit against the previous one and store the changed top-level directories in a file
            git diff --name-only --diff-filter=AMDR @~..@ | grep "/" | cut -d"/" -f1 | uniq > /workspace/diff
    
    # Do what you want, by looping over the changed directories. Here, launch a Cloud Build in each one
    - name: 'gcr.io/cloud-builders/gcloud'
      entrypoint: /bin/bash
      args:
        - -c
        - |
           for i in $$(cat /workspace/diff); do
           cd $$i
               gcloud builds submit
           cd ..
           done
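
    The per-subfolder cloudbuild.yaml is not shown here. A minimal sketch of what one could contain, reusing the values from the first folder in the question (the function name, entry point and topic are assumptions about your functions):

    steps:

    - name: 'python:3.7'
      entrypoint: 'bash'
      args:
        - '-c'
        - |
            pip3 install -r requirements.txt
            pytest

    - name: 'gcr.io/cloud-builders/gcloud'
      args:
        - functions
        - deploy
        - Function1
        - --runtime=python37
        - --source=.
        - --entry-point=function1_main
        - --trigger-topic=Function1
        - --region=europe-west3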
    

    Be careful with the timeout here, because you can trigger a lot of Cloud Build jobs and that takes time.
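
    If a build approaches the default 10-minute limit, you can raise it with the top-level timeout field of the cloudbuild.yaml (the value below is only an example):

    # Top-level field in cloudbuild.yaml: allow up to 30 minutes for the whole build
    timeout: 1800s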


    If you want to run your build manually, don't forget to add $BRANCH_NAME as a substitution variable:

    gcloud builds submit --substitutions=BRANCH_NAME=master