I'm using Airflow for a project, and I'm a beginner. I want to write a file locally.
Here is what I tried (this is my DAG):
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def write_simple_file(**kwargs):
    file_path = '/Users/paul/airflow/output/hello_airflow.txt'
    with open(file_path, 'w') as file:
        file.write("Hello from Airflow DAG!\n")
    print(f"File written to {file_path}")

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
}

dag = DAG(
    'write_file_dag',
    default_args=default_args,
    description='DAG to write a simple file',
    schedule_interval=None,
    start_date=datetime(2024, 5, 7),
    catchup=False,
)

write_file_task = PythonOperator(
    task_id='write_file_task',
    python_callable=write_simple_file,
    dag=dag,
)

write_file_task
but I got a "No such file or directory" error in the logs.
If you are running Airflow in Docker, then by default the container running your DAG sees its own filesystem rooted at /opt/airflow; it cannot access your laptop/Mac home folder, so /Users/paul/airflow/output does not exist inside the container and open() fails. You could instead write to a path inside the container, or scp/sftp the file to your home folder from your Python logic.
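One way to sidestep the error, sketched under the assumption that the task runs inside the official Airflow Docker image (where /opt/airflow is the container's home directory), is to write to a container-visible path and create the output directory first. The base_dir default and the file name here are illustrative choices, not Airflow requirements:

```python
import os

def write_simple_file(base_dir='/opt/airflow/output', **kwargs):
    # /opt/airflow is the default home inside the official Airflow image;
    # adjust base_dir to match your own deployment.
    os.makedirs(base_dir, exist_ok=True)  # avoids "No such file or directory"
    file_path = os.path.join(base_dir, 'hello_airflow.txt')
    with open(file_path, 'w') as file:
        file.write("Hello from Airflow DAG!\n")
    print(f"File written to {file_path}")
    return file_path
```

If you want the file to appear on your Mac as well, mount a host folder into the container at that path so writes inside the container land in your home directory.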