I believe there should be a log for this, which stores all the DAGs and their respective data.
I need a list of DAGs with their scheduled times. The code can be in any form, a Python script or a shell command.
Thanks.
To manually check the status of DAGs, you can use any of the following approaches:
To get the status of DAGs in a Cloud Composer environment, you can use the gcloud composer environments run command.
Example:
gcloud composer environments run environment_name --location location_name dags state -- dag_id execution_date
Everything after the -- separator is passed through as arguments to the Airflow dags state subcommand.
To get the status of DAGs from the Airflow CLI directly, you can use the dags state or dags list-runs command.
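If you want a scriptable check, you can ask the CLI for machine-readable output and parse it. A minimal sketch in Python, assuming you ran something like `airflow dags list-runs -d my_dag -o json` beforehand; the `sample` payload and the `my_dag` DAG ID below are made up for illustration, and the field names follow what Airflow 2's JSON output is generally expected to contain:

```python
import json

# Hypothetical captured output of:
#   airflow dags list-runs -d my_dag -o json
# Field names (dag_id, run_id, state, execution_date) are assumptions
# based on Airflow 2's CLI JSON output format.
sample = '''[
  {"dag_id": "my_dag", "run_id": "scheduled__2023-01-01",
   "state": "success", "execution_date": "2023-01-01T00:00:00+00:00"},
  {"dag_id": "my_dag", "run_id": "scheduled__2023-01-02",
   "state": "failed", "execution_date": "2023-01-02T00:00:00+00:00"}
]'''

runs = json.loads(sample)

# Collect the state of each run, keyed by execution date.
states = {run["execution_date"]: run["state"] for run in runs}

for date, state in states.items():
    print(f"{date} -> {state}")
```

This keeps the status check in code rather than the UI, so it can be dropped into a monitoring script or cron job.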
You can also check the current status from the Airflow UI:
Go to Browse -> Task Instances and apply the "state" filter to see the status of the DAGs' task instances; you can add further filters, such as DAG ID, to narrow the results.