Mapreduce Hadoop job exception: Output directory already exists


Correct me if my understanding is wrong: in the above code, the path "/Users/msadri/Documents/....." refers to the local file system, doesn't it? It looks like fs.defaultFS in core-site.xml is pointing to file:/// instead of the HDFS address of your cluster.

1) If you need to point to the local file system as per your requirement, then try this:

FileSystem.getLocal(conf).delete(outputDir, true);

2) If it is expected to point to HDFS, then check core-site.xml and make sure fs.defaultFS points to hdfs://<nameNode>:<port>/, then try again. (The error message shows that you are pointing to the local file system; if it were pointing to HDFS, it would say "Output directory hdfs://<nameNode>:<port>/Users/msadri/... already exists".)
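For reference, a minimal core-site.xml entry that points fs.defaultFS at HDFS might look like the sketch below. The host name and port here are placeholders, not values from the original post; adjust them to your cluster:

    <configuration>
      <property>
        <name>fs.defaultFS</name>
        <value>hdfs://namenode.example.com:8020</value>
      </property>
    </configuration>

After changing this file, restart the affected Hadoop daemons so the new default file system takes effect.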

Rule this out if it's not necessary. Please let me know your response.


Can you try this:

 outputDir.getFileSystem(jobConf).delete(outputDir, true);

or, equivalently:

 FileSystem fs = FileSystem.get(jobConf);
 fs.delete(outputDir, true);


You can try this too: delete the output folder before submitting the job if it already exists.
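A minimal sketch of that idea in a job driver, assuming an output path passed as the first argument (the class and variable names here are illustrative, not from the original post):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class DeleteOutputIfExists {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Path outputPath = new Path(args[0]);

        // Resolve the file system that owns this path (local or HDFS,
        // depending on the path's scheme and fs.defaultFS).
        FileSystem fs = outputPath.getFileSystem(conf);

        // Recursively delete the output directory if it already exists,
        // so the job does not fail with "Output directory ... already exists".
        if (fs.exists(outputPath)) {
            fs.delete(outputPath, true);
        }
    }
}
```

Calling outputPath.getFileSystem(conf) rather than FileSystem.get(conf) has the advantage that a fully qualified path (e.g. file:///... or hdfs://...) is resolved against its own scheme, not against whatever fs.defaultFS happens to be.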