-
Hi. I was trying to play around and try out the Celery worker. I'm working with Airflow 2.0.0 and a MySQL backend. The Celery worker is running on an AWS EC2 instance with Red Hat Enterprise Linux release 8.3 (Ootpa). The dag directory on the worker was set to a different absolute path than the one on my laptop. On trying this out I got an error; checking the info log right above it:

```
[2021-06-13 14:38:02,237: INFO/MainProcess] Received task: airflow.executors.celery_executor.execute_command[f3746582-18ee-4a84-9955-5e9b5ff31d93]
```

The dag location is not relative to AIRFLOW_HOME; the worker is using the absolute location that was published by the scheduler. Is there a reason why dag locations aren't stored relative to $AIRFLOW_HOME in the database as well?
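For illustration, a hypothetical setup like this reproduces it (the paths are made up; the relevant setting is `dags_folder` in `airflow.cfg`):

```ini
# airflow.cfg on the laptop running the scheduler (hypothetical path)
[core]
dags_folder = /Users/me/airflow/dags

# airflow.cfg on the EC2 Celery worker (hypothetical path)
[core]
dags_folder = /home/ec2-user/airflow/dags
```

Since the scheduler publishes the DAG file's absolute path along with the task, the worker looks for the file under `/Users/me/airflow/dags/` and fails, regardless of its own `dags_folder` setting.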
-
This is a "hard" requirement to have the same DAG absolute path for all machines. This was a deliberate choice that has been made a long time ago to make more "opinionated" approach and avoid some common mistakes. For example someone can write Dags that read configuration from files placed in the DAG folder using absolute path, which will work locally but start failing in case you change the location of the DAG folder on a different machine. As far as I know it's been always like that. |