
Caching yarn and python pip in Google Cloud Build


Currently, building with Cloud Build takes around 10 minutes (5 minutes frontend, 5 minutes backend). Is there any way I can cache the yarn install and pip install so the build time is cut significantly? This is my current config for Cloud Build:

steps:
  # Frontend: install JS dependencies and build the app
  - name: "node:18.17.1"
    entrypoint: bash
    args:
      - "-c"
      - |
        yarn install
        yarn run create-app-yaml
        yarn build
    env:
      - redacted

  # Backend: install Python dependencies and collect static files
  - name: "python:3.10.11"
    entrypoint: bash
    args:
      - "-c"
      - |
        python -m pip install -r requirements.txt
        python ./manage.py collectstatic --noinput

  # Deploy to App Engine
  - name: "gcr.io/cloud-builders/gcloud"
    args: ["app", "deploy"]
    timeout: "1600s"


Solution

  • Between different builds, you have two solutions:

    • Either use Cloud Storage to store the dependency data: at the beginning of the build you download the libraries, and at the end you upload them (or their updates). It's also a good idea to tar.gz your data before transferring it to Cloud Storage; a single large object transfers faster than many small ones. You can use the Artifacts feature of Cloud Build for the upload, as in the first sketch below.
    • Or create a container image with all the dependencies pre-installed and use it as the step image in Cloud Build. This time, you need a side pipeline to prepare and build this step image, as in the second sketch below.
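Here is a minimal sketch of the first approach for the yarn step. The bucket gs://my-build-cache and the archive name node_modules.tar.gz are placeholders you would replace with your own; the first step tolerates a cache miss on the very first build, and the artifacts block uploads the refreshed archive once all steps succeed:

steps:
  # Restore the dependency cache from Cloud Storage; on a cache miss the
  # copy fails and "|| true" lets the build continue with a cold cache.
  - name: "gcr.io/cloud-builders/gsutil"
    entrypoint: bash
    args:
      - "-c"
      - |
        gsutil cp gs://my-build-cache/node_modules.tar.gz . && tar -xzf node_modules.tar.gz || true

  - name: "node:18.17.1"
    entrypoint: bash
    args:
      - "-c"
      - |
        yarn install
        yarn build
        # Re-archive the cache: one large object transfers faster than
        # thousands of small files
        tar -czf node_modules.tar.gz node_modules

# The Artifacts feature uploads the archive after all steps succeed,
# so the next build starts from a warm cache.
artifacts:
  objects:
    location: "gs://my-build-cache/"
    paths: ["node_modules.tar.gz"]

The same pattern works for the Python step, for example by archiving a virtualenv directory or pip's download cache instead of node_modules.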
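And a minimal sketch of the second approach. The side pipeline below is its own cloudbuild.yaml, run separately (for example on a trigger when requirements.txt changes). It assumes a Dockerfile next to it along the lines of FROM python:3.10.11, COPY requirements.txt ., RUN pip install -r requirements.txt; the image name python-deps is a placeholder:

steps:
  # Side pipeline: bake the Python dependencies into a reusable step image.
  - name: "gcr.io/cloud-builders/docker"
    args: ["build", "-t", "gcr.io/$PROJECT_ID/python-deps", "."]

# Push the resulting image so the main pipeline can pull it as a step image.
images:
  - "gcr.io/$PROJECT_ID/python-deps"

The main pipeline then uses this pre-baked image in place of the stock python:3.10.11 one, so pip install disappears from the critical path:

  - name: "gcr.io/$PROJECT_ID/python-deps"
    entrypoint: bash
    args:
      - "-c"
      - |
        python ./manage.py collectstatic --noinput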