python · airflow · systemd

FileNotFoundError: [Errno 2] No such file or directory: 'hadoop'


I am currently working with Airflow and its scheduler. I am trying to use systemd to properly manage the two processes, the webserver and the scheduler. However, when I start the scheduler through systemd (the systemctl command), I get this error in my DAG logs: "FileNotFoundError: [Errno 2] No such file or directory: 'hadoop'". If I start the scheduler from the command line instead (by typing airflow scheduler in a terminal), everything works fine. What I am trying to do is use subprocess.Popen to run a Hadoop command. I am just wondering what the problem is.
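
For reference, the failing call looks roughly like the sketch below. This is a hypothetical reconstruction - the actual DAG code is not shown here, and the hadoop subcommand and arguments are placeholders - but it illustrates the relevant behaviour: Popen resolves a bare command name through the PATH of the process that runs it.

import subprocess

# Hypothetical reconstruction of the failing task code: the DAG shells
# out to the hadoop CLI. Popen looks the command up on the current
# process's PATH, so if systemd started the scheduler without Hadoop's
# bin directory on PATH, this raises:
#   FileNotFoundError: [Errno 2] No such file or directory: 'hadoop'
proc = subprocess.Popen(
    ["hadoop", "fs", "-ls", "/"],  # placeholder subcommand and arguments
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
)
stdout, stderr = proc.communicate()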

Here is my .service file

[Unit]
Description=Airflow scheduler daemon
After=network.target postgresql.service
Wants=postgresql.service

[Service]
EnvironmentFile=/root/.bashrc
User=root
Group=root
Type=simple
ExecStart=/bin/bash -c 'airflow scheduler'
Restart=always
RestartSec=5s

[Install]
WantedBy=multi-user.target

Solution

  • The most likely problem is that the two environments do not have the same variables set. In particular, your PATH variable is probably not set properly when the scheduler runs under systemd.

    When you open an interactive bash session, a few more files are usually sourced - most likely your PATH is set in /etc/profile (but it could be in a few other places). A sketch of a corrected [Service] section follows below.

    See https://www.gnu.org/software/bash/manual/html_node/Bash-Startup-Files.html
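
One way to make the unit's environment match your interactive shell is to set PATH explicitly in the unit file, or to run bash as a login shell so that it sources /etc/profile. Here is a minimal sketch of a corrected [Service] section, assuming Hadoop's binaries live in /usr/local/hadoop/bin (a placeholder - adjust it to wherever hadoop is actually installed on your machine):

[Service]
User=root
Group=root
Type=simple
# Option 1: set PATH explicitly. Note that EnvironmentFile= reads plain
# NAME=VALUE lines and does not execute shell syntax, so pointing it at
# /root/.bashrc will generally not import your PATH.
Environment="PATH=/usr/local/hadoop/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
ExecStart=/bin/bash -c 'airflow scheduler'
# Option 2: run bash as a login shell (-l) so it sources /etc/profile
# and picks up PATH from there:
# ExecStart=/bin/bash -lc 'airflow scheduler'
Restart=always
RestartSec=5s

A quick way to confirm the diagnosis is to compare the output of echo $PATH in your terminal with the PATH the service actually sees, for example by temporarily changing ExecStart to /bin/bash -c 'env' and checking the journal.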