I am new to Apache Airflow. I wanted to create a simple DAG with a single task that launches a Python script in a virtual environment:
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    "Inference_DAG",
    start_date=datetime(2022, 1, 1),
    schedule_interval=timedelta(hours=12),
    catchup=False,
    max_active_runs=1,
) as dag:
    task_a = BashOperator(
        task_id="Inference_task_a",
        bash_command="/home/xfd/folder/env/bin/python3 compute.py",
        retries=1,
    )
When I trigger the DAG, the task fails.
How can I find out what caused the failure? I have seen that Airflow integrates with Sentry (https://airflow.apache.org/docs/apache-airflow/stable/logging-monitoring/errors.html?highlight=error), but that seems like overkill for a simple DAG like this one.
Ideally, I would like to see the console output of /home/xfd/folder/env/bin/python3 compute.py so I can easily debug what went wrong.
How can I track errors in Apache Airflow for a simple DAG like this one?
In the Airflow UI, open the DAG, click on the failed task instance (shown in red), and then click "Log". The task log contains the full output of the run, including the stdout/stderr of your bash command and the traceback, which should tell you why it failed.
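You can also reproduce the failure from the command line. A quick sketch, assuming the Airflow 2.x CLI: `airflow tasks test` runs a single task in isolation (no scheduler, no dependency checks, nothing recorded to the metadata database) and prints its log, including the script's output, straight to your terminal:

```shell
# Run only Inference_task_a for an arbitrary logical date and
# stream its log (including the bash command's output) to the console.
airflow tasks test Inference_DAG Inference_task_a 2022-01-01
```

This is usually the fastest way to iterate on a broken task, since you see the same log the UI would show without waiting for a scheduled run.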