Tags: aws-lambda, gitlab-ci, aws-sam-cli

Build and deploy an AWS Lambda of type Image using SAM in a GitLab runner


I'm trying to set up CI/CD for an AWS Lambda function using the SAM CLI inside a GitLab runner.

My Lambda function is a Go app shipped as a container image.
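
For context, the function in my SAM template is declared with PackageType Image; a minimal sketch of the relevant part (the Dockerfile, DockerContext and DockerTag values match the sam build output further down) looks like this:

AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31

Resources:
  HelloWorldFunction:
    Type: AWS::Serverless::Function
    Properties:
      PackageType: Image        # container image instead of a zip archive
      Architectures:
        - x86_64
    Metadata:
      # sam build uses these to build the image from the Go app's Dockerfile
      Dockerfile: Dockerfile
      DockerContext: ./hello-world
      DockerTag: go1.x-v1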

I followed this article to set it up properly: https://aws.amazon.com/blogs/apn/using-gitlab-ci-cd-pipeline-to-deploy-aws-sam-applications/

Unfortunately, the .gitlab-ci.yml used there only seems to apply to functions with PackageType Zip, i.e. it packages the application code by uploading it to an S3 bucket:

image: python:3.8

stages:
  - deploy

production:
  stage: deploy
  before_script:
    - pip3 install awscli --upgrade
    - pip3 install aws-sam-cli --upgrade
  script:
    - sam build
    - sam package --output-template-file packaged.yaml --s3-bucket #S3Bucket#
    - sam deploy --template-file packaged.yaml --stack-name gitlab-example --s3-bucket #S3Bucket# --capabilities CAPABILITY_IAM --region us-east-1
  environment: production

I adjusted the script to these lines:

script:
    - sam build
    - sam deploy  

sam build fails at this stage:

Building codeuri: /builds/user/app runtime: None metadata: {'DockerTag': 'go1.x-v1', 'DockerContext': '/builds/user/app/hello-world', 'Dockerfile': 'Dockerfile'} architecture: x86_64 functions: ['HelloWorldFunction']
Building image for HelloWorldFunction function
Build Failed
Error: Building image for HelloWorldFunction requires Docker. is Docker running?

This guide from GitLab suggests using image: docker with the docker:dind service enabled, so I tried this config:

image: docker

services:
  - docker:dind

stages:
  - deploy

production:
  stage: deploy
  before_script:
    - pip3 install awscli --upgrade
    - pip3 install aws-sam-cli --upgrade
  script:
    - sam build
    - sam deploy
  environment: production

This in turn fails because the base docker image does not ship Python. How can I combine those two approaches to have Docker and SAM available at the same time?


Solution

  • You can:

    1. use the docker image and install Python in your job, OR
    2. use the python image and install Docker in your job, OR
    3. build your own image containing all your dependencies (Docker, Python, awscli, aws-sam-cli, etc.) and use that as your job's image.

    Installing python in the docker image:

    production:
      image: docker
      services:
        - docker:dind   # keep the dind service from the question so a Docker daemon is available to sam build
      stage: deploy
      before_script:
        - apk add --update python3 py3-pip
        - pip3 install awscli --upgrade
        - pip3 install aws-sam-cli --upgrade
      script:
        - sam build
        - sam deploy
      environment: production
    

    Installing docker in the Python image:

    
    production:
      image: python:3.9-slim
      services:
        - docker:dind   # docker.io provides the client; the daemon still has to come from the dind service
      stage: deploy
      before_script:
        - apt update && apt install -y --no-install-recommends docker.io
        - pip3 install awscli --upgrade
        - pip3 install aws-sam-cli --upgrade
      script:
        - sam build
        - sam deploy
      environment: production
    

    Or build your own image:

    FROM python:3.9-slim
    RUN apt update && apt install -y --no-install-recommends docker.io
    RUN pip3 install --upgrade awscli aws-sam-cli
    
    docker build -t myregistry.example.com/myrepo/myimage:latest .
    docker push myregistry.example.com/myrepo/myimage:latest
    
    production:
      image: myregistry.example.com/myrepo/myimage:latest
      # ...
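
    Whichever option you pick, a complete job for an Image-type function could then look roughly like the sketch below. The registry and image names reuse the placeholders from above, the ECR repository URI is a made-up example you have to replace with your own, and depending on your runner setup you may also need the DOCKER_HOST / DOCKER_TLS_CERTDIR variables from GitLab's Docker-in-Docker documentation:

    image: myregistry.example.com/myrepo/myimage:latest

    services:
      - docker:dind

    stages:
      - deploy

    production:
      stage: deploy
      script:
        - sam build
        # PackageType Image deploys push the built image to ECR instead of
        # uploading a zip to S3, so an image repository must be specified
        # (here via --image-repository; a samconfig.toml works as well).
        - >
          sam deploy
          --stack-name gitlab-example
          --image-repository 123456789012.dkr.ecr.us-east-1.amazonaws.com/gitlab-example
          --capabilities CAPABILITY_IAM
          --region us-east-1
      environment: production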