
Connecting to remote MapR Hive via JDBC


I had the same problem and was able to get around it by adding the correct dependency to my pom.xml file. I was getting the latest Apache release of Hive from Maven Central, and switched to using the CDH4 release from the Cloudera repo. So, what you are seeing may be a symptom of having the wrong hive-jdbc dependency. Here's the Maven snippet I added to my POM file:

<repository>
  <id>cloudera</id>
  <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
</repository>
...
<dependency>
  <groupId>org.apache.hive</groupId>
  <artifactId>hive-jdbc</artifactId>
  <version>0.10.0-cdh4.3.2</version>
</dependency>

Here's a link about the Cloudera repo.

Also, adding ";auth=noSasl" to the URL made my application hang, so I removed it.
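For what it's worth, here is roughly what the two URL variants look like side by side (a minimal sketch assuming the HiveServer2 driver from the hive-jdbc artifact; the host name and user are placeholders):

import java.sql.Connection;
import java.sql.DriverManager;

public class UrlVariants {
    public static void main(String[] args) throws Exception {
        // HiveServer2 driver class shipped in the hive-jdbc artifact
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        // This variant hung for me:
        // String url = "jdbc:hive2://myserver.example.com:10000/default;auth=noSasl";

        // Dropping the auth parameter worked:
        String url = "jdbc:hive2://myserver.example.com:10000/default";

        Connection con = DriverManager.getConnection(url, "<user>", "");
        System.out.println("Connected without auth=noSasl");
        con.close();
    }
}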


I think you need to specify the username. Also, it should be hive2, not hive, since you are using HiveServer2. Try modifying your connection URL:

Connection con = DriverManager.getConnection("jdbc:hive2://myserver.example.com:10000/default", "<user>", "");

It's described in the HiveServer2 documentation.
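Putting it together, a minimal client for HiveServer2 would look roughly like this (a sketch; the server name, user, and query are placeholders):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class Hive2Client {
    public static void main(String[] args) throws Exception {
        // HiveServer2 uses a different driver class than the old HiveServer
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        // Note the "hive2" scheme and the explicit user name
        Connection con = DriverManager.getConnection(
                "jdbc:hive2://myserver.example.com:10000/default", "<user>", "");

        Statement stmt = con.createStatement();
        ResultSet res = stmt.executeQuery("SHOW TABLES");
        while (res.next()) {
            System.out.println(res.getString(1));
        }
        con.close();
    }
}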

Hope this helps!


I also had the same problem. Please check that the server is reachable on port 10000 from the client (the server and port are enabled and no firewall is blocking them), and also check that HiveServer is up and running. If so, it should work. The following code works for me with MapR Hive.

If you have any MapR-related questions, please refer to answers.mapr.com; it contains most of the information you might need.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class HiveJdbcClient {
    // Driver class for the original HiveServer (jdbc:hive:// URLs)
    private static String driverName = "org.apache.hadoop.hive.jdbc.HiveDriver";

    public static void main(String[] args) throws SQLException {
        HiveJdbcClient myJob = new HiveJdbcClient();
        myJob.execute();
    }

    public void execute() throws SQLException {
        System.out.println("Start HiveJob");
        try {
            Class.forName(driverName);
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
            System.exit(1);
        }

        Connection con = DriverManager.getConnection("jdbc:hive://myserver:10000/default", "", "");
        Statement stmt = con.createStatement();
        String sql = "SHOW TABLES";
        System.out.println("Running: " + sql);
        ResultSet res = stmt.executeQuery(sql);
        while (res.next()) {
            System.out.println(res.getString(1));
        }
        System.out.println("HiveJob executed!");
    }
}
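To rule out the connectivity part quickly, a plain TCP probe of port 10000 from the client machine can tell you whether a firewall is in the way before JDBC is even involved (a minimal sketch; the host and port are placeholders):

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {
    public static void main(String[] args) {
        String host = "myserver"; // placeholder: your Hive server host
        int port = 10000;         // default HiveServer port

        try (Socket socket = new Socket()) {
            // Fail within 5 seconds if the port is blocked or the service is down
            socket.connect(new InetSocketAddress(host, port), 5000);
            System.out.println(host + ":" + port + " is reachable");
        } catch (IOException e) {
            System.out.println("Cannot reach " + host + ":" + port + ": " + e.getMessage());
        }
    }
}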