I have been getting the following error (repeatedly) in Airflow 2.9.0 after upgrading from 2.2.0:
[2024-05-07T11:10:43.522+0000] {local_executor.py:139} ERROR - Failed to execute task ' ti'.
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.11/site-packages/airflow/executors/local_executor.py", line 135, in _execute_work_in_fork
args.func(args)
File "/home/airflow/.local/lib/python3.11/site-packages/airflow/cli/cli_config.py", line 49, in command
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/airflow/.local/lib/python3.11/site-packages/airflow/utils/cli.py", line 114, in wrapper
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home/airflow/.local/lib/python3.11/site-packages/airflow/cli/commands/task_command.py", line 422, in task_run
ti.init_run_context(raw=args.raw)
File "/home/airflow/.local/lib/python3.11/site-packages/airflow/models/taskinstance.py", line 3307, in init_run_context
self._set_context(self)
File "/home/airflow/.local/lib/python3.11/site-packages/airflow/utils/log/logging_mixin.py", line 127, in _set_context
set_context(self.log, context)
File "/home/airflow/.local/lib/python3.11/site-packages/airflow/utils/log/logging_mixin.py", line 274, in set_context
flag = cast(FileTaskHandler, handler).set_context(value)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/airflow/.local/lib/python3.11/site-packages/airflow/providers/microsoft/azure/log/wasb_task_handler.py", line 89, in set_context
super().set_context(ti, identifier=identifier)
File "/home/airflow/.local/lib/python3.11/site-packages/airflow/utils/log/file_task_handler.py", line 219, in set_context
local_loc = self._init_file(ti, identifier=identifier)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/airflow/.local/lib/python3.11/site-packages/airflow/utils/log/file_task_handler.py", line 500, in _init_file
local_relative_path = self._render_filename(ti, ti.try_number)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/airflow/.local/lib/python3.11/site-packages/airflow/utils/log/file_task_handler.py", line 290, in _render_filename
return str_tpl.format(
^^^^^^^^^^^^^^^
KeyError: ' ti'
I used Airflow 2.2.0 before, and everything was working fine without any errors or warnings. After updating to 2.9.0 and upgrading the database (using airflow db migrate --to-version="2.9.0"), I started receiving this error. The error also seemingly causes the scheduler to crash; the webserver is still accessible, though.
Airflow is being run inside a Docker container with the base image apache/airflow:2.9.0-python3.11.
So far, I have already tried setting provide_context=True in the default args and directly on specific PythonOperators.
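For reference, that attempt looked roughly like the sketch below (illustrative only: the DAG and task names are placeholders, not my actual code):

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Attempted workaround: provide_context is deprecated in Airflow 2.x and has no effect here.
default_args = {
    "provide_context": True,
}

def _print_ti(**context):
    # 'ti' (no leading space) is how the task instance normally appears in the context.
    print(context["ti"])

with DAG(
    dag_id="placeholder_dag",
    start_date=datetime(2024, 1, 1),
    schedule=None,
    default_args=default_args,
) as dag:
    PythonOperator(
        task_id="placeholder_task",
        python_callable=_print_ti,
        provide_context=True,  # also tried directly on the operator
    )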
Also note that the key in the KeyError has a leading space (' ti'), which seems odd to me.
Does anyone know how to fix this error?
Edit 1: Even when Airflow does not have any DAGs loaded, it still throws the same error, which makes me believe it has something to do with the database. I use a PostgreSQL database for the metadata.
Edit 2: It seems Airflow runs fine as long as no DAGs are running. However, as soon as I execute any DAG, it crashes immediately with the error above.
I had the same issue, and it turned out to be a problem with the log filename template.
Take a look at your log_filename_template setting. The new format should be something like this:
log_filename_template = dag_id={{ ti.dag_id }}/run_id={{ ti.run_id }}/task_id={{ ti.task_id }}/{%% if ti.map_index >= 0 %%}map_index={{ ti.map_index }}/{%% endif %%}attempt={{ try_number }}.log
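For reference, this setting lives under the [logging] section of airflow.cfg, or under the AIRFLOW__LOGGING__LOG_FILENAME_TEMPLATE environment variable if you configure the official Docker image that way (the doubled %% is ConfigParser escaping for the cfg file; I'm not certain it is needed in the env var form). A quick sketch of how to check what your installation actually resolved:

# print the template Airflow is actually using
airflow config get-value logging log_filename_template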
You may need to clear any active task instances first, as they may be stuck with the old filename and won't be processed.
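If you prefer the CLI over clearing from the UI, something along these lines should work (a sketch; your_dag_id is a placeholder, and --yes skips the confirmation prompt):

# clear stuck task instances for the affected DAG
airflow tasks clear your_dag_id --yes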