Tags: docker, airflow, google-container-registry

Airflow: pull a Docker image from a private Google Container Registry


I am using the https://github.com/puckel/docker-airflow image to run Airflow. I had to add pip install docker to it for it to support the DockerOperator.

Everything seems OK, but I can't figure out how to pull an image from a private Google Container Registry.

I tried adding a connection of type Google Cloud in the Admin section and running the DockerOperator as:

    t2 = DockerOperator(
            task_id='docker_command',
            image='eu.gcr.io/project/image',
            api_version='2.3',
            auto_remove=True,
            command="/bin/sleep 30",
            docker_url="unix://var/run/docker.sock",
            network_mode="bridge",
            docker_conn_id="google_con"
    )

But I always get an error:

[2019-11-05 14:12:51,162] {{taskinstance.py:1047}} ERROR - No Docker registry URL provided

I also tried the dockercfg_path option:

    t2 = DockerOperator(
            task_id='docker_command',
            image='eu.gcr.io/project/image',
            api_version='2.3',
            auto_remove=True,
            command="/bin/sleep 30",
            docker_url="unix://var/run/docker.sock",
            network_mode="bridge",
            dockercfg_path="/usr/local/airflow/config.json",
    )

I get the following error:

[2019-11-06 13:59:40,522] {{docker_operator.py:194}} INFO - Starting docker container from image eu.gcr.io/project/image
[2019-11-06 13:59:40,524] {{taskinstance.py:1047}} ERROR - ('Connection aborted.', FileNotFoundError(2, 'No such file or directory'))

I also tried using only dockercfg_path="config.json" and got the same error.
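As an aside, dockercfg_path is expected to point at a Docker client config file (the ~/.docker/config.json format with an auths entry), not at the raw GCP service-account key itself, and the file must exist inside the worker container. A sketch of generating such a config from a key file, mirroring the credential that docker login -u _json_key would store (the write_docker_config helper and the paths are my own illustration, not part of Airflow):

```python
import base64
import json

def write_docker_config(key_path, out_path, registry="eu.gcr.io"):
    """Write a Docker client config.json whose auth entry encodes
    '_json_key:<key file contents>' in base64 -- the same credential
    that `docker login -u _json_key` stores for GCR."""
    with open(key_path) as f:
        key_contents = f.read()
    # Docker stores registry credentials as base64("user:password").
    token = base64.b64encode(("_json_key:" + key_contents).encode()).decode()
    config = {"auths": {registry: {"auth": token}}}
    with open(out_path, "w") as f:
        json.dump(config, f)
```

The resulting file is what dockercfg_path should reference.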

I can't really use a BashOperator to run docker login either, since the docker command isn't available in the Airflow container:

    t3 = BashOperator(
            task_id='print_hello',
            bash_command='docker login -u _json_key -p /usr/local/airflow/config.json eu.gcr.io'
    )

line 1: docker: command not found

What am I missing?

Solution

  • airflow.hooks.docker_hook.DockerHook falls back to the docker_default connection when none is configured.

    In your first attempt you set docker_conn_id to google_con, and the error thrown means that the host (i.e. the registry URL) isn't configured on that connection.

    Here are a couple of changes to make:

    • The image argument passed to DockerOperator should be the image name without the registry prefix (the registry comes from the connection).

    DockerOperator(
        api_version='1.21',
        # docker_url='tcp://localhost:2375',  # set your docker URL
        command='/bin/ls',
        image='image',
        network_mode='bridge',
        task_id='docker_op_tester',
        docker_conn_id='google_con',
        dag=dag,
        # added to map to a host path on macOS
        host_tmp_dir='/tmp',
        tmp_dir='/tmp',
    )
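    To make that split concrete: in a full reference such as eu.gcr.io/project/image, everything up to the last slash belongs in the connection's Host field, and only the final segment goes in the image argument. A tiny illustrative helper (my own, not part of Airflow):

    ```python
    def split_image_ref(ref):
        """Split a full image reference into the registry path (for the
        connection's Host field) and the bare image name (for the
        DockerOperator image argument)."""
        registry, _, image = ref.rpartition("/")
        return registry, image

    # split_image_ref("eu.gcr.io/project/image")
    # -> ("eu.gcr.io/project", "image")
    ```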
    
    • Provide the registry name, username and password in your google_con connection so that the underlying DockerHook can authenticate to the registry.

    You can obtain long-lived credentials from a service-account key: for the username, use _json_key, and paste the entire contents of the JSON key file into the password field.

    [Screenshot: the google_con connection for Docker in the Airflow connections UI]
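    For reference, here is a sketch of the field values that connection needs, assembled from a key file (build_docker_conn_fields is a hypothetical helper for illustration; the connection itself is created through the Airflow UI or CLI):

    ```python
    import json

    def build_docker_conn_fields(key_path, registry="eu.gcr.io/project"):
        """Assemble the field values for a Docker-type connection that
        authenticates to a private GCR registry with a service-account
        key. Hypothetical helper: it only shows which value goes where."""
        with open(key_path) as f:
            key_contents = f.read()
        # Sanity check: the file should be a GCP service-account key.
        if json.loads(key_contents).get("type") != "service_account":
            raise ValueError("not a service-account key file")
        return {
            "conn_id": "google_con",
            "conn_type": "docker",
            "host": registry,          # registry URL, without the image name
            "login": "_json_key",      # literal username for key-based auth
            "password": key_contents,  # the entire JSON key file contents
        }
    ```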

    Here are logs from running my task:

    [2019-11-16 20:20:46,874] {base_task_runner.py:110} INFO - Job 443: Subtask docker_op_tester [2019-11-16 20:20:46,874] {dagbag.py:88} INFO - Filling up the DagBag from /Users/r7/OSS/airflow/airflow/example_dags/example_docker_operator.py
    [2019-11-16 20:20:47,054] {base_task_runner.py:110} INFO - Job 443: Subtask docker_op_tester [2019-11-16 20:20:47,054] {cli.py:592} INFO - Running <TaskInstance: docker_sample.docker_op_tester 2019-11-14T00:00:00+00:00 [running]> on host 1.0.0.127.in-addr.arpa
    [2019-11-16 20:20:47,074] {logging_mixin.py:89} INFO - [2019-11-16 20:20:47,074] {local_task_job.py:120} WARNING - Time since last heartbeat(0.01 s) < heartrate(5.0 s), sleeping for 4.989537 s
    [2019-11-16 20:20:47,088] {logging_mixin.py:89} INFO - [2019-11-16 20:20:47,088] {base_hook.py:89} INFO - Using connection to: id: google_con. Host: gcr.io/<redacted-project-id>, Port: None, Schema: , Login: _json_key, Password: XXXXXXXX, extra: {}
    [2019-11-16 20:20:48,404] {docker_operator.py:209} INFO - Starting docker container from image alpine
    [2019-11-16 20:20:52,066] {logging_mixin.py:89} INFO - [2019-11-16 20:20:52,066] {local_task_job.py:99} INFO - Task exited with return code 0