How to run Apache Airflow DAG as Unix user

You can use the run_as_user parameter to impersonate a Unix user for any task:

t1 = BashOperator(task_id='create_dir', bash_command='mkdir /tmp/airflow_dir_test', dag=dag, run_as_user='user1')
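Impersonation relies on sudo: the Unix account that runs the Airflow worker must be able to switch to the target user without a password, or the task will fail at launch. A minimal sudoers sketch, assuming the service runs as a user named airflow (adjust both account names to your deployment):

```shell
# /etc/sudoers.d/airflow -- illustrative sketch, not a drop-in file.
# Lets the 'airflow' service account become 'user1' without a password,
# which run_as_user needs in order to relaunch the task as that user.
airflow ALL=(user1) NOPASSWD: ALL
```

Edit this with visudo so a syntax error cannot lock you out of sudo.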

You can use default_args if you want to apply it to every task in the DAG:

from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

dag = DAG('create_directory',
          description='simple create directory workflow',
          start_date=datetime(2017, 6, 1),
          default_args={'run_as_user': 'user1'})

t1 = BashOperator(task_id='create_dir', bash_command='mkdir /tmp/airflow_dir_test', dag=dag)
t2 = BashOperator(task_id='create_file', bash_command='echo airflow_works > /tmp/airflow_dir_test/airflow.txt', dag=dag)

Note that the owner parameter is for something else entirely: multi-tenancy. It labels tasks with an owner name for filtering in the UI and does not change which Unix user executes the command.
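Under the hood, when run_as_user is set the worker re-runs the task command through sudo, roughly like the following (a sketch of the idea; the exact flags and wrapping vary between Airflow versions):

```shell
# Approximately what the worker executes for the create_dir task above
# when run_as_user='user1' is set (illustrative, not the exact invocation):
sudo -H -u user1 bash -c 'mkdir /tmp/airflow_dir_test'
```

This is why the sudoers permission matters and why environment variables visible to the task can differ from those of the worker process.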