Hadoop: java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected



Hadoop went through a huge code refactoring from Hadoop 1.0 to Hadoop 2.0. One side effect is that code compiled against Hadoop 1.0 is not binary-compatible with Hadoop 2.0 and vice versa. However, the source code is mostly compatible, so you usually just need to recompile your code against the target Hadoop distribution.

The exception "Found interface X, but class was expected" is very common when you run code compiled for Hadoop 1.0 on Hadoop 2.0, or vice versa. (`JobContext` was a class in Hadoop 1.x and became an interface in Hadoop 2.x, which is exactly the change this error reports.)

Find the exact Hadoop version used in the cluster, specify that version in your pom.xml, then build your project against it and deploy.
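For example, you can check the cluster's version with `hadoop version` and then pin it in your pom.xml. A minimal sketch, assuming a Maven build; `2.7.3` is a placeholder to replace with whatever your cluster reports:

```xml
<!-- Pin the Hadoop client artifact to the version the cluster runs.
     Replace 2.7.3 with the output of `hadoop version` on the cluster. -->
<properties>
  <hadoop.version>2.7.3</hadoop.version>
</properties>

<dependencies>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>${hadoop.version}</version>
    <!-- "provided" keeps the cluster's own Hadoop jars from clashing
         with copies bundled in your job jar. -->
    <scope>provided</scope>
  </dependency>
</dependencies>
```

After updating the version, rebuild (`mvn clean package`) and redeploy the resulting jar.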


You need to recompile "hcatalog-core" to support Hadoop 2.0.0. Currently "hcatalog-core" only supports Hadoop 1.0.


You have a version incompatibility between your Hadoop and Hive installations. You need to upgrade (or downgrade) your Hadoop or Hive version so that the two match.

This is due to the binary incompatibility between Hadoop 1 and Hadoop 2.