
InvalidProtocolBufferException when trying to write to HDFS


I got the same exception and found out that a few dependencies were missing: I only had hadoop-core but needed a few more. Once I added these, it worked:

    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-core</artifactId>
        <version>2.0.0-mr1-cdh4.3.0</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-hdfs</artifactId>
        <version>2.0.0-cdh4.3.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>2.0.0-cdh4.3.0</version>
    </dependency>
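
A quick way to confirm that these artifacts (and a single protobuf-java version) actually end up on your classpath is Maven's dependency report, for example:

    mvn dependency:tree -Dincludes=com.google.protobuf

The -Dincludes filter is optional; it just narrows the output to the protobuf artifacts so version conflicts stand out.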


You are trying to write to the path "/usr/msknapp/insurance", but the permissions shown in the error are for "/user/msknapp"! Write under /user/msknapp (note: user, not usr) instead.
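
As a minimal sketch of writing under the directory the user actually owns (the NameNode URI, file name, and content here are assumptions; adjust them for your cluster):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsWriteSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Assumed NameNode address; replace with your cluster's fs.defaultFS.
            conf.set("fs.defaultFS", "hdfs://localhost:8020");
            FileSystem fs = FileSystem.get(conf);
            // Write under /user/msknapp (the directory the user has
            // permission on), not /usr/msknapp.
            Path out = new Path("/user/msknapp/insurance/sample.txt");
            try (FSDataOutputStream stream = fs.create(out)) {
                stream.writeUTF("hello hdfs");
            } finally {
                fs.close();
            }
        }
    }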


This seems to be caused by an incompatible version of protobuf-java-X.jar. Please check which protobuf jar version your application is built against and which one is used in CDH 4.8.
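
One way to see which protobuf jar is actually being loaded at runtime (a hedged sketch, assuming com.google.protobuf.Message is on the classpath; the class name ProtobufJarCheck is just illustrative) is to print the class's code source:

    public class ProtobufJarCheck {
        public static void main(String[] args) {
            // Prints the jar the protobuf Message class was loaded from;
            // compare it with the protobuf-java version you compiled against.
            System.out.println(com.google.protobuf.Message.class
                    .getProtectionDomain()
                    .getCodeSource()
                    .getLocation());
        }
    }

If the printed jar differs from the version in your build, align the two (for example by pinning protobuf-java to the version CDH ships) and the InvalidProtocolBufferException should go away.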