I have the following file structure:
my_repo
└── dataflow
    ├── pipelines
    │   ├── pipeline1
    │   │   ├── Dockerfile
    │   │   ├── pipeline1_module.py
    │   │   └── main.py
    │   └── pipeline2
    │       ├── Dockerfile
    │       ├── pipeline2_module.py
    │       └── main.py
    └── common
        ├── common_module1.py
        └── common_module2.py
Every Dockerfile looks something like this:
FROM python:3.11-slim
WORKDIR /my_repo
# do some stuff
# ...
COPY . /my_repo
To build a new Docker image using Google Cloud Build, I'm running:
cd my_repo/dataflow/pipelines/pipeline1
gcloud builds submit . --tag=$IMAGE
My problem is that only the contents of the pipeline1 folder are copied by the COPY . /my_repo instruction in the Dockerfile.
What modifications do I need to make to copy either the entire dataflow folder, or even better, the pipeline1 directory along with the common directory (everything but pipeline2)?
With the Docker CLI this could be achieved with the docker build -f flag, but I'm not sure how to do the same in Google Cloud Build.
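For reference, with plain Docker I would run something like this from the repo root (the image tag here is just a placeholder):
cd my_repo
docker build -f dataflow/pipelines/pipeline1/Dockerfile -t pipeline1-image .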
Thanks in advance
If you want to do something similar to Docker's -f flag, there is a similar alternative in Google Cloud Build as well.
You can create a cloudbuild.yaml file in the same directory as the Dockerfile. If you want to run the build from the my_repo root, you can use a config like this:
steps:
# Docker Build
# $_REGION is a user-defined substitution, supplied via --substitutions at submit time
- name: 'gcr.io/cloud-builders/docker'
  args: ['build', '-t',
         '$_REGION-docker.pkg.dev/$PROJECT_ID/my-docker-repo/myimage',
         '-f', 'dataflow/pipelines/pipeline1/Dockerfile',
         '.']
# Docker Push
- name: 'gcr.io/cloud-builders/docker'
  args: ['push',
         '$_REGION-docker.pkg.dev/$PROJECT_ID/my-docker-repo/myimage']
Then you can invoke it from the my_repo root using:
gcloud builds submit . --config dataflow/pipelines/pipeline1/cloudbuild.yaml --substitutions=_REGION=<your-region>
Because the build context (the . in gcloud builds submit) is now the whole my_repo directory, the COPY . /my_repo instruction picks up dataflow/pipelines and dataflow/common as well.
I've also tried it locally and it worked. :)
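If you also want to keep pipeline2 out of the image entirely (your "everything but pipeline2" case), one option is to replace the blanket COPY . /my_repo with narrower COPY instructions. A minimal sketch, assuming the build context is the my_repo root as above:
FROM python:3.11-slim
WORKDIR /my_repo
# Paths are relative to the build context (my_repo), so pipeline2 is simply never copied
COPY dataflow/common /my_repo/dataflow/common
COPY dataflow/pipelines/pipeline1 /my_repo/dataflow/pipelines/pipeline1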