
AWS credentials not found for celery-k8s deployment


I'm trying to run Dagster using celery-k8s, with examples/celery-k8s as a starting point. Upon running the pipeline from the playground I get

Initialization of resources [s3, io_manager] failed.
botocore.exceptions.NoCredentialsError: Unable to locate credentials

I have configured AWS credentials in environment variables as mentioned in the documentation:

deployments:
    - name: "user-code-deployment-test"
      image:
        repository: "somasays/dagster-usercode-example"
        tag: "0.5"
        pullPolicy: Always
      dagsterApiGrpcArgs:
        - "-f"
        - "/workspace/repo.py"
      port: 3030
      env:
        AWS_ACCESS_KEY_ID: AAAAAAAAAAAAAAAAAAAAAAAAA
        AWS_SECRET_ACCESS_KEY: qqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqq
        AWS_DEFAULT_REGION: eu-central-1

I can also see these values set in the environment variables of the pod, and after pip install awscli I can access the S3 location with aws s3 ls (see the screenshot below). The job pod, however, throws Unable to locate credentials.

[screenshot]

Please help.


Solution

  • The deployment configuration applies only to the user code servers. The celery-k8s executor, meanwhile, runs your pipeline code in separate Kubernetes jobs, which do not inherit those environment variables. To provide your credentials there, configure the env_secrets field of the celery-k8s executor in your pipeline run config, as sketched below.

    See https://github.com/dagster-io/dagster/blob/master/python_modules/libraries/dagster-k8s/dagster_k8s/job.py#L321-L327 for details on the config.
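
A minimal sketch of what that can look like, assuming you store the credentials in a Kubernetes Secret (the name dagster-aws-creds below is just an example) and reference it from the executor config:

    kubectl create secret generic dagster-aws-creds \
      --from-literal=AWS_ACCESS_KEY_ID=AAAAAAAAAAAAAAAAAAAAAAAAA \
      --from-literal=AWS_SECRET_ACCESS_KEY=qqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqq \
      --from-literal=AWS_DEFAULT_REGION=eu-central-1

and then, in the pipeline run config (this assumes the pipeline-era celery-k8s executor layout):

    execution:
      celery-k8s:
        config:
          env_secrets:
            - dagster-aws-creds

Each key/value pair in the secret should then be injected as an environment variable into every step job, so boto3 can locate the credentials the same way it does on the user code pod.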