
How to execute an Airflow task on another server?


I have already set up the following infrastructure:

  • host (main): airflow (with all the services)

  • slave server: airflow (redis, postgres, worker)

I can't find any tutorial or any clue about how to do this, and I don't know how to connect a task, via the parameter queue=node1, so that it executes on the slave server. Thank you!


Solution

  • You can route the task to a specific worker by setting the queue argument on the operator, as in the following example:

    from airflow.operators.python import PythonOperator  # Airflow 2.x import path

    train_model = PythonOperator(
        task_id='train_model',
        python_callable=train_model_flights,  # your existing callable
        queue='<QUEUE_NAME>',  # route this task to the named Celery queue
    )


    Also, if you created this infrastructure with docker-compose, look at the airflow-worker service, which runs the "celery worker" command. For your new worker with a different queue name, copy that service definition and change the command to "celery worker -q <QUEUE_NAME>", as in the sketch below.
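
    As a rough sketch (the service name airflow-worker-node1 is a placeholder, and <<: *airflow-common assumes the shared anchor used in the official Airflow docker-compose.yaml), the duplicated service could look like this:

      airflow-worker-node1:
        <<: *airflow-common                       # shared image/env anchor from the stock compose file
        command: celery worker -q <QUEUE_NAME>    # subscribe this worker only to the custom queue
        restart: always

    If the worker on the slave server is not started through docker-compose, the equivalent is to run airflow celery worker -q <QUEUE_NAME> there (airflow worker -q <QUEUE_NAME> on Airflow 1.x). Either way, the worker must point at the same broker (Redis) and metadata database as the main host, so that both machines form one Celery cluster.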