I have tried to follow this article: https://medium.com/@andrewhharmon/apache-airflow-using-pycharm-and-docker-for-remote-debugging-b2d1edf83d9d.
The problematic part is the following output from the airflow-worker container:
data-pipeline-airflow-worker-1 | /home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py:360: FutureWarning: The auth_backends setting in [api] has had airflow.api.auth.backend.session added in the running config, which is needed by the UI. Please update your config before Apache Airflow 3.0.
data-pipeline-airflow-worker-1 | FutureWarning,
data-pipeline-airflow-worker-1 |
data-pipeline-airflow-worker-1 | airflow command error: the following arguments are required: GROUP_OR_COMMAND, see help above.
data-pipeline-airflow-worker-1 | usage: airflow [-h] GROUP_OR_COMMAND ...
data-pipeline-airflow-worker-1 |
data-pipeline-airflow-worker-1 | positional arguments:
data-pipeline-airflow-worker-1 | GROUP_OR_COMMAND
data-pipeline-airflow-worker-1 |
data-pipeline-airflow-worker-1 | Groups:
data-pipeline-airflow-worker-1 | celery Celery components
data-pipeline-airflow-worker-1 | config View configuration
data-pipeline-airflow-worker-1 | connections Manage connections
data-pipeline-airflow-worker-1 | dags Manage DAGs
data-pipeline-airflow-worker-1 | db Database operations
data-pipeline-airflow-worker-1 | jobs Manage jobs
data-pipeline-airflow-worker-1 | kubernetes Tools to help run the KubernetesExecutor
data-pipeline-airflow-worker-1 | pools Manage pools
data-pipeline-airflow-worker-1 | providers Display providers
data-pipeline-airflow-worker-1 | roles Manage roles
data-pipeline-airflow-worker-1 | tasks Manage tasks
data-pipeline-airflow-worker-1 | users Manage users
data-pipeline-airflow-worker-1 | variables Manage variables
OK, I found the answer: you need to set up the Python interpreter against the airflow-worker container. The script path should be /home/airflow/.local/bin/airflow, and the parameters should be tasks test [dag_id] [task_id] [start_date]. I am using Airflow 2.3.2.
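For illustration, here is a minimal sketch of a DAG that such a run configuration could target. The dag_id (example_debug_dag), task_id (print_context), and the date are hypothetical placeholders I chose, not values from the article; substitute your own:

# Minimal sketch of a DAG to debug; dag_id, task_id, and the start date
# are hypothetical placeholders. With this file in the dags/ folder, the
# PyCharm run-configuration parameters would be:
#   tasks test example_debug_dag print_context 2022-06-01
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def print_context(**context):
    # Set a breakpoint here; "airflow tasks test" runs the task
    # in-process, so the remote debugger can stop on it.
    print(context["ds"])


with DAG(
    dag_id="example_debug_dag",
    start_date=datetime(2022, 6, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    PythonOperator(task_id="print_context", python_callable=print_context)

The key point is that "airflow tasks test" executes the task directly in the current process without a scheduler or database record of the run, which is why the PyCharm debugger attached to the container can hit breakpoints in the task code.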