When an alternate AIRFLOW_HOME is defined via an environment variable and a DAG is triggered, either on a schedule or manually, the tasks fail to execute. This happens because the alternate AIRFLOW_HOME is ignored at execution time and a default airflow.cfg is generated in ~/airflow, which contains the wrong settings and causes the DAG to fail.
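A quick way to confirm this (a minimal sketch; it assumes the airflow.configuration module exposes AIRFLOW_HOME and AIRFLOW_CONFIG, as it does in the 1.x/2.x series) is to run the following on the worker host and compare the resolved paths against the environment variable:

```python
# Minimal sketch: run on the dask worker host to see which config Airflow resolves.
# Assumes airflow.configuration exposes AIRFLOW_HOME / AIRFLOW_CONFIG (1.x/2.x style).
import os

print("AIRFLOW_HOME env var :", os.environ.get("AIRFLOW_HOME", "<not set>"))

# Importing the configuration module triggers config resolution; if the env var
# is not visible to this process, Airflow falls back to ~/airflow and writes a
# default airflow.cfg there on first import.
from airflow import configuration as conf

print("Resolved AIRFLOW_HOME:", conf.AIRFLOW_HOME)
print("Resolved airflow.cfg :", conf.AIRFLOW_CONFIG)
```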
My setup:
Postgres metadata DB
S3 remote logging
What I see in my dask worker log:
SequentialExecutor being used
Failed to write logs to SQLite
My workaround: copy the alternate $AIRFLOW_HOME/airflow.cfg to ~/airflow/airflow.cfg.
This ensures a default airflow.cfg with incorrect settings is not generated in ~/airflow.
My DAG then runs successfully.
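For reference, a minimal sketch of that copy step (the paths and the point at which it runs are illustrative; it needs to execute on the worker host before anything imports Airflow):

```python
# Illustrative sketch of the workaround: copy the real airflow.cfg into the
# default location so Airflow never generates one with SequentialExecutor/SQLite
# settings in ~/airflow. Adjust paths to your deployment.
import os
import shutil

alt_home = os.environ.get("AIRFLOW_HOME")        # the alternate AIRFLOW_HOME
default_home = os.path.expanduser("~/airflow")   # Airflow's fallback location

if alt_home and os.path.abspath(alt_home) != default_home:
    os.makedirs(default_home, exist_ok=True)
    shutil.copy2(os.path.join(alt_home, "airflow.cfg"),
                 os.path.join(default_home, "airflow.cfg"))
```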