Running shell script through oozie



sed is running on the local distributed-cache copy of the file, so you'll need to pipe the output of sed back via the hadoop fs shell (remembering to delete the file before uploading), something like:

hadoop fs -rm /user/ambari_qa/wfe/coordinator/$file
sed "s|$oldStartIndexStr|$newStartIndexStr|g" $file | \
    hadoop fs -put - /user/ambari_qa/wfe/coordinator/$file

There are probably ways you can find the coordinator path in hdfs rather than hard coding it into the script.
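One simple way to avoid the hard coding is to pass the path into the script as an argument from the workflow action. A minimal sketch, where the directory and file name defaults are hypothetical placeholders:

```shell
#!/bin/bash
# Hypothetical sketch: receive the coordinator directory and file name as
# arguments (wired up via <argument> elements in the workflow action)
# instead of hard coding the path in the script.
coord_dir=${1:-/user/ambari_qa/wfe/coordinator}   # default shown for illustration
file=${2:-myfile.txt}                             # hypothetical file name

target="${coord_dir}/${file}"
echo "will replace: ${target}"
```

The same `$target` can then be used in the `hadoop fs -rm` and `hadoop fs -put` calls above, so only the workflow definition knows the actual path.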

Update

The permission problem arises because the Oozie job is running as the mapred user, yet the file only has rwx permissions for the user ambari_qa and the group hdfs:

user=mapred, access=EXECUTE, inode="ambari_qa":ambari_qa:hdfs:rwxrwx---

I would either amend the permissions on the file and its parent folder so that the mapred user can delete/replace the file, or look into masquerading as a user that does have the correct permissions.
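For the first option, the fix is a chmod (or chown) on the coordinator directory, run as a user who owns it. A sketch that just builds the command, since it can only actually run against a cluster:

```shell
#!/bin/bash
# Hypothetical sketch: open up the directory so a user outside the ambari_qa
# owner and hdfs group (here, mapred) can delete and replace the file.
# 777 is deliberately permissive; adding mapred to the hdfs group (keeping 770),
# or chown-ing the path to mapred, are tighter alternatives.
coord_dir=/user/ambari_qa/wfe/coordinator
chmod_cmd="hadoop fs -chmod -R 777 $coord_dir"
echo "$chmod_cmd"   # run as ambari_qa (or hdfs), who may change these permissions
```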


I had a similar problem.

I could solve this by adding

hadoop_user=${2}
export HADOOP_USER_NAME=${hadoop_user}

hadoop_user is passed as ${wf:user} in the workflow as an argument to the action.
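Put together, the script end of that wiring looks something like the sketch below; the `set --` line simulates the arguments the workflow action would pass, and the path and argument order are hypothetical:

```shell
#!/bin/bash
# Hypothetical sketch: workflow.xml passes the submitting user as the second
# argument to the script, e.g.
#   <argument>${somePath}</argument>
#   <argument>${wf:user()}</argument>
# Simulated here so the snippet is self-contained:
set -- "/user/ambari_qa/wfe/coordinator" "ambari_qa"

hadoop_user=${2}
# The hadoop CLI honours HADOOP_USER_NAME on non-secure (simple auth) clusters,
# so later hadoop fs calls act as this user instead of the container's mapred user.
export HADOOP_USER_NAME=${hadoop_user}
echo "acting as: ${HADOOP_USER_NAME}"
```

Note this only works without Kerberos; on a secured cluster you would need proper proxy-user (impersonation) configuration instead.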