Tags: python, airflow, directed-acyclic-graphs

Airflow 1.10.15 dynamic task creation


I'm trying to create a DAG that spawns N tasks depending on the result of a previous task. The problem is that I cannot use the value returned by the previous task (via XCom) outside of an operator.

Is there a way to make this work?

from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator
from airflow.operators.subdag_operator import SubDagOperator

# count_tasks_function, some_func, default_args and the upstream `generate`
# task are defined elsewhere and omitted for brevity.

with DAG(
        "spawn_dag",
        start_date=datetime(2022, 1, 1)
    ) as dag:
    
    # Calculates the number of tasks based on some previous task run
    count_number_of_tasks = PythonOperator(
        task_id='count_number_of_tasks',
        python_callable=count_tasks_function,
        dag=dag,
        xcom_push=True,
        provide_context=True
    )

    # Generates tasks and chains them
    def dynamic_spawn_func(parent_dag_name, child_dag_name, start_date, args, **kwargs):
        subdag = DAG(
            dag_id=f"{parent_dag_name}.{child_dag_name}",
            default_args=args,
            start_date=start_date,
            schedule_interval=None
        )

        # Here is the problem: the following value cannot be used in a loop to spawn tasks
        number_of_tasks = kwargs['ti'].xcom_pull(dag_id='spawn_dag', task_ids='count_number_of_tasks')

        # This is where that variable is used
        for j in range(number_of_tasks):
            task = PythonOperator(
                task_id='processor_' + str(j),
                python_callable=some_func,
                op_kwargs={"val": j},
                dag=subdag,
                provide_context=True)

            task_2 = PythonOperator(
                task_id='wait_for_processor_' + str(j),
                python_callable=some_func,
                op_kwargs={"val": j},
                dag=subdag,
                provide_context=True)

            task >> task_2
        return subdag

    dynamic_spawn_op = SubDagOperator(
        task_id='dynamic_spawn',
        subdag=dynamic_spawn_func("spawn_dag", "dynamic_spawn", dag.start_date, args=default_args),
        dag=dag,
        provide_context=True
    )

    generate >> count_number_of_tasks >> dynamic_spawn_op


Solution

  • No. Migrate to Airflow 2.3+. Airflow 1.10 has been End of Life for two years now, and you are shooting yourself in the foot by not upgrading. Not only do you lack new features (like Dynamic Task Mapping, sketched below), you also leave yourself wide open to potential security problems (dozens of CVEs have been fixed since 1.10), and you put yourself in this position:

    https://xkcd.com/979/

    because you are one of the last people in the world still running Airflow 1.10.

    Not upgrading at this stage is simply the wrong decision, because staying on 1.10 costs you a LOT more than the migration would. Multiple times more.
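
    For reference, this is roughly what the same pattern looks like with Dynamic Task Mapping once you are on Airflow 2.3+. It is a minimal sketch: count_number_of_tasks returns a hard-coded count and processor / wait_for_processor are placeholders for your real callables, but the key difference is that .expand() resolves the number of mapped task instances at run time from the upstream task's XCom, which is exactly what the SubDagOperator approach cannot do at parse time.

    from datetime import datetime

    from airflow.decorators import dag, task


    @dag(start_date=datetime(2022, 1, 1), schedule_interval=None, catchup=False)
    def spawn_dag():

        @task
        def count_number_of_tasks():
            # Placeholder for the real counting logic; return one element per
            # task to spawn -- .expand() maps over this list at run time.
            number_of_tasks = 3
            return list(range(number_of_tasks))

        @task
        def processor(val):
            print(f"processing {val}")
            return val

        @task
        def wait_for_processor(val):
            print(f"waiting for processor {val}")

        # One mapped 'processor' instance per element of the upstream XCom,
        # and one 'wait_for_processor' instance per processor result.
        processed = processor.expand(val=count_number_of_tasks())
        wait_for_processor.expand(val=processed)


    spawn_dag()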