Tags: docker, bitbucket, bitbucket-pipelines

Copying files into built Docker image using bitbucket-pipelines.yml?


I have a bitbucket-pipelines.yml that looks like this:

definitions:
  steps:
    - step: &docker_build
        name: "Docker Build"
        services:
          - docker
        caches:
          - pip
        script:
          - cd .
          - export IMAGE_NAME_EXTENSION="latest"
          - VERSION=$(<version.txt)
          - docker build -t $IMAGE_NAME .
          - docker save --output tmp-image.docker $IMAGE_NAME
        artifacts:
          - tmp-image.docker


pipelines:
  branches:
    master:
      - step: *docker_build

My Dockerfile looks like this:

FROM python:3.8-slim

RUN apt-get -y update && apt-get install -y --no-install-recommends \
     libgl1-mesa-glx \
     libglib2.0-0 \
     wget \
     git \
     python3 \
     ca-certificates \
     gcc \
     libc6-dev \
&& rm -rf /var/lib/apt/lists/*

COPY requirements.txt .
RUN pip install --upgrade pip
RUN pip install -r requirements.txt
COPY src/some_code/ .
COPY config/ .
CMD ["python3","./abc.py"]

Inside my code I use the Paho MQTT client. My problem is that the certificates for the MQTT client are currently checked in with the code, at the root level of my repo inside a "config" folder, and the Paho MQTT client automatically reads them from there. I would like to move those certificates out of my repo. I copied them into AWS Secrets Manager and I am able to fetch them when my bitbucket-pipelines.yml runs. However, I now need to copy these certificates into a "config" folder in my already-built Docker image.

I have tried a couple of things here:

  1. I tried running the Docker image after it is built in bitbucket-pipelines.yml and then copying the files in with Python's shutil. Problem: the container goes away as soon as the step in bitbucket-pipelines.yml finishes.
  2. I tried driving the copy from Python instead. Problem: I don't have a container name in bitbucket-pipelines.

Can someone point me toward the right documentation, please? I tried looking but couldn't find a post that matches my use case.


Solution

  • I was able to achieve this like so:

    1. Store secrets in AWS Secrets Manager.

    2. Add this line to the Dockerfile (make sure your repo doesn't have a config folder at the root level, or account for it and adjust the code accordingly):

      COPY config/ .
      
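      One thing to watch with `COPY config/ .` is that it copies the *contents* of `config/` into the image's working directory, not the folder itself. If your client code expects the certificates inside a `config/` subfolder (matching the original repo layout), you may instead want something like the following sketch; adjust the destination to wherever your code actually looks:

      ```dockerfile
      # Copy the certificates fetched during the pipeline into a config/
      # subfolder of the working directory, preserving the repo layout.
      COPY config/ ./config/
      ```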
    3. Fetch them using the AWS CLI in the pipeline and build the Docker image:

      - SECRET_ROOT_CA=$(aws secretsmanager get-secret-value --secret-id root-CA --region us-west-1)
      - echo "$SECRET_ROOT_CA" | jq -r '.SecretString' > config/root-CA.pem
      - SECRET_KEYFILE=$(aws secretsmanager get-secret-value --secret-id keyfile --region us-west-1)
      - echo "$SECRET_KEYFILE" | jq -r '.SecretString' > config/results.pem.key
      - SECRET_CERTFILE=$(aws secretsmanager get-secret-value --secret-id certfile --region us-west-1)
      - echo "$SECRET_CERTFILE" | jq -r '.SecretString'> config/bt2-results.pem.crt
      
      # Build the Docker image
      - cd .
      - export IMAGE_NAME_EXTENSION="latest"
      - VERSION=$(<version.txt)
      - docker build -t $IMAGE_NAME .
      - docker save --output tmp-image.docker $IMAGE_NAME

      artifacts:
        - config/**
        - tmp-image.docker
      
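    Since the three fetch commands differ only in the secret id and the output file, they can be collapsed into a small helper. A sketch (the helper name `fetch_secret` is mine; the secret ids, region, and filenames are the ones from the snippet above; it assumes the AWS CLI is configured and jq is installed in the pipeline image, both of which the snippet already relies on):

    ```shell
    # Fetch a plain-text secret from AWS Secrets Manager and write its
    # SecretString to a file.
    #   $1 = secret id, $2 = destination file
    fetch_secret() {
      aws secretsmanager get-secret-value --secret-id "$1" --region us-west-1 \
        | jq -r '.SecretString' > "$2"
    }

    # In the pipeline script this replaces the six fetch lines above:
    #   mkdir -p config
    #   fetch_secret root-CA  config/root-CA.pem
    #   fetch_secret keyfile  config/results.pem.key
    #   fetch_secret certfile config/bt2-results.pem.crt
    ```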

    Doing this fetches the certificates and all the secrets needed by MQTT and places them at the level where the paho_mqtt client needs them.
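
    Because `tmp-image.docker` is exported as an artifact, a later step can reload the built image instead of rebuilding it. A sketch of such a follow-up step (the step name is a placeholder, and the tag/push commands are left as comments since they depend on your registry):

    ```yaml
    - step:
        name: "Load and publish image"   # hypothetical follow-up step
        services:
          - docker
        script:
          - docker load --input tmp-image.docker
          # then tag and push to your registry, e.g.:
          # - docker tag $IMAGE_NAME <registry>/$IMAGE_NAME:latest
          # - docker push <registry>/$IMAGE_NAME:latest
    ```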