Importing local module (python script) in Airflow DAG
This usually has to do with how Airflow is configured.
In `airflow.cfg`, make sure the path in `airflow_home` is correctly set to the path your Airflow directory structure is in.
Airflow then scans the subfolders under it and adds them to the module search path so that your modules can be found.
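As a sketch, the relevant entries in `airflow.cfg` look like this (hypothetical paths; in newer Airflow versions the home directory is usually set via the `AIRFLOW_HOME` environment variable instead):

```ini
[core]
# Hypothetical paths -- adjust to wherever your Airflow directory actually lives
airflow_home = /root/airflow
dags_folder = /root/airflow/dags
```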
Otherwise, just make sure the folder you are trying to import is in the Python path: How to use PYTHONPATH
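For instance, setting `PYTHONPATH` from the shell could look like this (hypothetical path; put the folder that *contains* your module on the path, not the module file itself):

```shell
# Prepend the folder containing your modules to PYTHONPATH so that
# any Python process started from this shell can import from it.
export PYTHONPATH="/root/airflow/dags:${PYTHONPATH}"
```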
The way I do it is as follows:
- Create a Python script in your sub-folder with a main() function.
- In your DAG file, add a path declaration for the sub-folder, then import the file.
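For illustration, the sub-folder script itself can be as simple as this (hypothetical file `subfolder/script_name.py`; `main` is the function the DAG will call):

```python
# subfolder/script_name.py (hypothetical path)
def main():
    # The actual task logic goes here; Airflow calls this via python_callable.
    print("script_name.main() was called")
    return "done"

if __name__ == "__main__":
    main()
```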
Now you can use this script in your PythonOperator:

```python
import sys
sys.path.insert(0, "/root/airflow/dags/subfolder")

import subfolder.script_name as script

...

t1 = PythonOperator(
    task_id='python_script',
    python_callable=script.main,
    dag=dag)
```
If you run Airflow in Docker, you need to do it as follows:
- Create a folder for your modules inside the dags folder, for example `programs`.
- Use it as follows (this is the correct path inside the Airflow Docker image):
```python
import sys
# Append the folder that contains the module (not the module itself) to sys.path
sys.path.append('/opt/airflow/dags/programs')

import my_module

task1 = PythonOperator(
    task_id='my_task_name',
    python_callable=my_module.my_func,
    dag=dag,
)
```