node.js, docker, single-sign-on, aws-sdk-js, aws-sdk-nodejs

How to SSO login to AWS in Docker container (using aws-sdk v3)


When developing locally, I need to have access to an S3 bucket.

The access is provided via SSO.

I'm using aws-sdk v3 and node.js.

When running the same Node.js app without Docker, I get access and everything works fine. Here's what I do:

aws configure sso
aws sso login --profile **profile name**

And here's what my code looks like:

const { S3Client } = require('@aws-sdk/client-s3');
const { fromSSO } = require('@aws-sdk/credential-provider-sso');

// Resolve credentials from the SSO profile set up via `aws configure sso`
const credentials = fromSSO({
  profile: process.env.AWS_PROFILE,
  ssoStartUrl: process.env.AWS_SSO_START_URL,
  ssoAccountId: process.env.AWS_ACCOUNT_ID,
  ssoRegion: process.env.AWS_REGION,
  ssoRoleName: process.env.AWS_SSO_ROLE_NAME,
});

const client = new S3Client({ credentials });
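
For what it's worth, a minimal call like this is how I verify the credentials resolve (the bucket name is a placeholder):

const { ListObjectsV2Command } = require('@aws-sdk/client-s3');

// Quick smoke test: any call forces credential resolution.
// 'my-bucket' is a placeholder.
client
  .send(new ListObjectsV2Command({ Bucket: 'my-bucket' }))
  .then((res) => console.log('objects:', res.KeyCount))
  .catch((err) => console.error('S3 call failed:', err.name, err.message));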

However, when running the same app in Docker (via Docker Compose), I keep getting the error

The SSO session associated with this profile is invalid. To refresh this SSO session run aws sso login with the corresponding profile.

I'm using the node:18-alpine image, and to add the AWS CLI to the container, I do

docker compose run api sh

apk update && apk add --no-cache curl gcompat zip && \
    curl -s https://awscli.amazonaws.com/awscli-exe-linux-x86_64-2.1.39.zip -o awscliv2.zip && \
    unzip awscliv2.zip && ./aws/install

/usr/local/bin/aws configure sso
/usr/local/bin/aws sso login --profile **my profile**

I've checked the env variables and they're OK. However, the app keeps crashing with the error above.
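
To rule out an expired token, I also peeked at the SSO token cache from Node (a sketch; it assumes the default cache location ~/.aws/sso/cache, where the CLI stores token JSON files with an expiresAt field):

const fs = require('fs');
const os = require('os');
const path = require('path');

// List cached SSO tokens and their expiry times.
// Assumes the default cache dir used by AWS CLI v2 and the SDK.
const cacheDir = path.join(os.homedir(), '.aws', 'sso', 'cache');
for (const file of fs.readdirSync(cacheDir)) {
  const token = JSON.parse(fs.readFileSync(path.join(cacheDir, file), 'utf8'));
  console.log(file, '->', token.expiresAt ?? 'no expiresAt field');
}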

Also, here are the contents of my docker-compose.yml, just in case.


What am I missing or doing wrong?

I feel this is a completely incorrect way to do this; is there a better way?

SSO is my only option. The flow works fine without Docker, but I really need to make it work with Docker as well.


I'm seeing at least two problems:

  • add the aws-cli installation to docker-compose.yml
  • figure out why the SSO session keeps being invalidated

Solution

  • The problem was that the SSO session information was not being properly persisted within the container.

    To fix that, I had to mount my ~/.aws directory into the container and add AWS_CONFIG_FILE=/root/.aws/config and AWS_SSO_SESSION=my_sso_session to the environment:

    api:
      image: node:18-alpine
      env_file:
        - .env
      environment:
        # These could live in .env instead; shown here for clarity.
        - AWS_CONFIG_FILE=/root/.aws/config
        - AWS_SSO_SESSION=my_sso_session
      volumes:
        - ./api:/usr/src/app
        - ~/.aws:/root/.aws
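
    A quick sanity check from inside the app confirms the mount and variables are visible (a sketch):

    const fs = require('fs');

    // Sanity check: the mounted config should exist at AWS_CONFIG_FILE.
    console.log('AWS_CONFIG_FILE =', process.env.AWS_CONFIG_FILE);
    console.log('config exists:', fs.existsSync(process.env.AWS_CONFIG_FILE ?? ''));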
    

    My credentials object in the app code now looks like this:

    const credentials = fromSSO({
      profile: process.env.AWS_PROFILE,
      ssoStartUrl: process.env.AWS_SSO_START_URL,
      ssoAccountId: process.env.AWS_ACCOUNT_ID,
      ssoRegion: process.env.AWS_REGION,
      ssoRoleName: process.env.AWS_SSO_ROLE_NAME,
      ssoSession: process.env.AWS_SSO_SESSION,
    });
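
    Since the config file is now mounted, the provider can, in principle, resolve all of the SSO settings from the profile alone, so this shorter form should work too (untested sketch):

    // With ~/.aws/config mounted, fromSSO can read sso_session and the
    // related settings from the shared config for the given profile.
    const credentialsFromConfig = fromSSO({ profile: process.env.AWS_PROFILE });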
    

    Also, newer versions of the AWS CLI v2 recommend setting a session name when running aws configure sso (my version didn't prompt for one). Hence, I modified my ~/.aws/config file as follows:

    [profile AwsDev]
    sso_account_id = my_account_id
    sso_role_name = my_role_name
    sso_session = my_sso_session
    output = text
    
    [sso-session my_sso_session]
    sso_start_url = my_sso_start_url
    sso_region = my_region
    sso_registration_scopes = sso:account:access
    

    With this setup everything works as it should.

    When the SSO session expires, I can re-run aws sso login --profile AwsDev and restart my Docker container.
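
    To make that expiry more obvious at runtime, I wrap calls with a check like this (a sketch; it matches on the error message quoted above, which may vary across SDK versions):

    const { ListBucketsCommand } = require('@aws-sdk/client-s3');

    // Surface a friendlier hint when the SSO session has expired.
    async function sendWithSsoHint(command) {
      try {
        return await client.send(command);
      } catch (err) {
        if (/SSO session/i.test(err?.message ?? '')) {
          console.error('SSO session expired. Run: aws sso login --profile AwsDev');
        }
        throw err;
      }
    }

    // Usage: sendWithSsoHint(new ListBucketsCommand({}));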