Copy all files in hadoop directory except 1


I see in the Apache Hadoop docs under #put:

Usage: hadoop fs -put <localsrc> ... <dst>

Copy single src, or multiple srcs from local file system to the destination file system. Also reads input from stdin and writes to destination file system.

And then a useful example:

hadoop fs -put - hdfs://nn.example.com/hadoop/hadoopfile
This reads the input from stdin.
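
For illustration, a minimal sketch of that stdin form, assuming the same example namenode and a made-up destination file:

# "-" tells -put to read the file contents from stdin and write them to the HDFS path
echo "hello hdfs" | hadoop fs -put - hdfs://nn.example.com/hadoop/hello.txt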

So you can use a find expression to filter this file out and pipe the remaining paths to hadoop. Note that -put - reads file contents from stdin, not a list of paths, so feed each path to its own -put, for example with xargs:

find /opt/nikoo28/resources/conf -type f ! -name "doNotCopy.txt" | xargs -I {} hadoop fs -put {} ./


Add these lines in your shell script:

mkdir /opt/copy
mv /opt/nikoo28/doNotCopy.txt /opt/copy/doNotCopy.txt
hadoop dfs -put /opt/nikoo28/resources/conf ./ && mv /opt/copy/doNotCopy.txt /opt/nikoo28/doNotCopy.txt

Just move the file you don't want to copy to some other folder, perform the hadoop fs -put command, and then move the file back to its original position.

If you want to preserve file permissions, then do this:

mkdir /opt/copy
cp -p /opt/nikoo28/doNotCopy.txt /opt/copy/doNotCopy.txt
rm /opt/nikoo28/doNotCopy.txt
hadoop dfs -put /opt/nikoo28/resources/conf ./ && cp -p /opt/copy/doNotCopy.txt /opt/nikoo28/doNotCopy.txt

NOTE: Add sudo if you get permission errors while creating the directory, or while moving or copying the file.
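
Either way, here is a slightly more defensive sketch of the same move/put/restore idea (paths as in the question; adjust as needed). It uses a trap so the file is moved back even if the put fails:

#!/bin/bash
# set the excluded file aside in a temporary directory
stash=$(mktemp -d)
mv /opt/nikoo28/doNotCopy.txt "$stash/"

# restore it no matter how the script exits, then drop the temp directory
trap 'mv "$stash/doNotCopy.txt" /opt/nikoo28/doNotCopy.txt; rmdir "$stash"' EXIT

hadoop dfs -put /opt/nikoo28/resources/conf ./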


This is kinda peculiar, but should work:

file=./conf/doNotCopy.txt
[[ -f $file ]] && mv $file $file.old
hadoop dfs -put /opt/nikoo28/resources/conf ./
[[ -f $file.old ]] && mv $file.old $file
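
One caveat with this trick: the renamed doNotCopy.txt.old still gets uploaded along with the rest of the directory. If it shouldn't stay in HDFS, remove it after the put (a sketch; the destination path assumes the put created ./conf in your HDFS home directory):

# delete the renamed copy that the put uploaded
hadoop dfs -rm ./conf/doNotCopy.txt.old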