
Amazon MWAA Local Runner : Where to add the Airflow variables and connections in docker-compose-local.yml


I am using the Amazon MWAA local runner repository for developing and testing my DAGs locally before I submit a PR to the main/dev branch. I have forked it from here. I would like to export an Airflow variable and an Airflow connection as soon as I start the container with ./mwaa-local-env start:

The Airflow variable: Key = deploy_environment, Value = qa
The Airflow connection: conn id = slack_conn; conn type = HTTP; password = *****

Something like this: [screenshot: Airflow Connection form]

I was only able to change the docker/docker-compose-local.yml to include the Airflow variable in the file.

version: '3.7'
services:
    postgres:
        image: postgres:10-alpine
        environment:
            - POSTGRES_USER=airflow
            - POSTGRES_PASSWORD=airflow
            - POSTGRES_DB=airflow
        logging:
            options:
                max-size: 10m
                max-file: "3"
        volumes:
            - "${PWD}/db-data:/var/lib/postgresql/data"

    local-runner:
        image: amazon/mwaa-local:2.0.2
        restart: always
        depends_on:
            - postgres
        environment:
            - LOAD_EX=n
            - EXECUTOR=Local
            - AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://airflow:airflow@postgres:5432/airflow
            - AIRFLOW_VAR_DEPLOY_ENVIRONMENT=qa
        logging:
            options:
                max-size: 10m
                max-file: "3"
        volumes:
            - ${PWD}/dags:/usr/local/airflow/dags
            - ${PWD}/plugins:/usr/local/airflow/plugins
            - $HOME/.aws/credentials:/usr/local/airflow/.aws/credentials:ro
        ports:
            - "8080:8080"
        command: local-runner
        healthcheck:
            test: ["CMD-SHELL", "[ -f /usr/local/airflow/airflow-webserver.pid ]"]
            interval: 30s
            timeout: 30s
            retries: 3

I thought AIRFLOW_VAR_DEPLOY_ENVIRONMENT=qa would do the job. However, after I start the Airflow environment, the variable's value shows as Invalid.

As for the Airflow connection, I have not been able to figure out how to export it in docker-compose-local.yml at all.

Any help in exporting the above two is appreciated!


Solution

  • The connection information can be stored as a JSON string in the environment variable (note that the JSON form is only supported from Airflow 2.3 onward). Not all of the keys are required; only provide what the Slack provider module needs.

    export AIRFLOW_CONN_SLACK_CONN='{
        "conn_type": "my-conn-type",
        "login": "my-login",
        "password": "my-password",
        "host": "my-host",
        "port": 1234,
        "schema": "my-schema",
        "extra": {
            "param1": "val1",
            "param2": "val2"
        }
    }'
    

    You can also store the connection information in URI format in the environment variable.

    export AIRFLOW_CONN_SLACK_CONN='my-conn-type://login:password@host:port/schema?param1=val1&param2=val2'
    

    The environment variable name must be prefixed with AIRFLOW_CONN_.
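    Applied to the docker-compose-local.yml from the question, both values can be exported under the local-runner service's environment section. Since the image is amazon/mwaa-local:2.0.2 (Airflow 2.0.x, which predates JSON-serialized connections), the URI form is the safer choice. The token and host below are placeholders, not real Slack values:

    ```yaml
    local-runner:
        image: amazon/mwaa-local:2.0.2
        environment:
            - LOAD_EX=n
            - EXECUTOR=Local
            - AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://airflow:airflow@postgres:5432/airflow
            # Airflow variable: readable as Variable.get("deploy_environment")
            - AIRFLOW_VAR_DEPLOY_ENVIRONMENT=qa
            # Airflow connection in URI form: conn id slack_conn, conn type http
            # (replace token and host with your real Slack values)
            - AIRFLOW_CONN_SLACK_CONN=http://:my-slack-token@hooks.slack.com
    ```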

    Reference: Storing connections in environment variables (Airflow)
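As a sanity check outside of Airflow, the URI form decomposes into the same fields as the JSON form. A minimal standard-library sketch (using the placeholder values from the example above, not a real connection) shows how the URI pieces map to connection attributes:

```python
import os
from urllib.parse import urlsplit, parse_qsl

# The connection URI exactly as it would be exported for Airflow
os.environ["AIRFLOW_CONN_SLACK_CONN"] = (
    "my-conn-type://login:password@host:1234/schema?param1=val1&param2=val2"
)

parts = urlsplit(os.environ["AIRFLOW_CONN_SLACK_CONN"])
conn = {
    "conn_type": parts.scheme,              # URI scheme -> conn type
    "login": parts.username,                # user part of netloc
    "password": parts.password,             # password part of netloc
    "host": parts.hostname,                 # host part of netloc
    "port": parts.port,                     # port part of netloc
    "schema": parts.path.lstrip("/"),       # path -> schema
    "extra": dict(parse_qsl(parts.query)),  # query string -> extra dict
}
print(conn)
```

This is why the query-string parameters in the URI example end up in the connection's extra field.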