
ClassNotFoundException even though all the jars are added to the Maven repository

Stack Overflow user
Asked on 2016-12-30 06:26:01
2 answers · 1.2K views · 0 followers · 1 vote

I have added all the jars required for this project, but I cannot resolve this exception. Can anyone advise on this? Could you also tell me how to grant access to the Hive database? Thanks in advance.

java.lang.ClassNotFoundException: org.apache.hadoop.hive.jdbc.HiveDriver
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:190)
    at org.ezytruk.com.CreateHiveExternalTable.createHiveExternalTable(CreateHiveExternalTable.java:20)
    at org.ezytruk.com.CreateHiveExternalTable.main(CreateHiveExternalTable.java:53)
Exception in thread "main" java.sql.SQLException: No suitable driver found for jdbc:hive://localhost/EZYTRUK
    at java.sql.DriverManager.getConnection(DriverManager.java:596)
    at java.sql.DriverManager.getConnection(DriverManager.java:215)
    at org.ezytruk.com.CreateHiveExternalTable.createHiveExternalTable(CreateHiveExternalTable.java:39)
    at org.ezytruk.com.CreateHiveExternalTable.main(CreateHiveExternalTable.java:53)

pom.xml

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>BigData</groupId>
  <artifactId>BigData</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <properties>
  <slf4j.version>1.6.1</slf4j.version>
  <hadoop-version>2.6.0</hadoop-version>
  <mysql-connector-version>5.1.40</mysql-connector-version>
  <sqoop-core-version>1.99.3</sqoop-core-version>
  <zookeeper-version>3.4.9</zookeeper-version>
  <hive-jdbc-version>1.2.1</hive-jdbc-version>
  <commons-io-version>2.2</commons-io-version>
  <commons-logging.version>1.2</commons-logging.version>
  </properties>
  <dependencies>
  <dependency>
    <groupId>commons-io</groupId>
    <artifactId>commons-io</artifactId>
    <version>${commons-io-version}</version>
</dependency>
 <dependency>
        <groupId>commons-logging</groupId>
        <artifactId>commons-logging</artifactId>
        <version>${commons-logging.version}</version>
   </dependency>        
   <dependency>
    <groupId>mysql</groupId>
    <artifactId>mysql-connector-java</artifactId>
    <version>${mysql-connector-version}</version>
   </dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>${hadoop-version}</version>
</dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>${hadoop-version}</version>
</dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>${hadoop-version}</version>
</dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-core</artifactId>
    <version>${hadoop-version}</version>
</dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-yarn-common</artifactId>
    <version>${hadoop-version}</version>
</dependency>
 <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>1.2.1</version>
</dependency>
 <dependency> 
    <groupId>org.apache.sqoop</groupId>
    <artifactId>sqoop-core</artifactId>
    <version>${sqoop-core-version}</version>
</dependency>
<dependency>
    <groupId>org.apache.sqoop</groupId>
    <artifactId>sqoop-client</artifactId>
    <version>${sqoop-core-version}</version>
</dependency>
<dependency>
    <groupId>org.apache.sqoop</groupId>
    <artifactId>sqoop-common</artifactId>
    <version>${sqoop-core-version}</version>
</dependency>
<dependency>
    <groupId>org.apache.sqoop.connector</groupId>
    <artifactId>sqoop-connector-generic-jdbc</artifactId>
    <version>${sqoop-core-version}</version>
</dependency>
<dependency>
    <groupId>org.apache.sqoop</groupId>
    <artifactId>sqoop</artifactId>
    <version>1.4.1-incubating</version>
</dependency>
<dependency>
    <groupId>org.apache.zookeeper</groupId>
    <artifactId>zookeeper</artifactId>
    <version>${zookeeper-version}</version>
</dependency>

<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-jdbc</artifactId>
    <version>${hive-jdbc-version}</version>
</dependency>
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-exec</artifactId>
    <version>${hive-jdbc-version}</version>
</dependency>

<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-metastore</artifactId>
    <version>${hive-jdbc-version}</version>
</dependency>
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-common</artifactId>
    <version>${hive-jdbc-version}</version>
</dependency>
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-service</artifactId>
    <version>${hive-jdbc-version}</version>
</dependency>
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-shims</artifactId>
    <version>${hive-jdbc-version}</version>
</dependency>
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-serde</artifactId>
    <version>${hive-jdbc-version}</version>
</dependency>

</dependencies>
  <packaging>war</packaging>
  <build>
    <sourceDirectory>src</sourceDirectory>
    <plugins>
      <plugin>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.3</version>
        <configuration>
          <source>1.7</source>
          <target>1.7</target>
        </configuration>
      </plugin>
      <plugin>
        <artifactId>maven-war-plugin</artifactId>
        <version>2.6</version>
        <configuration>
          <warSourceDirectory>WebContent</warSourceDirectory>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>

Program:

package org.hive.com;

import java.io.FileNotFoundException;
import java.io.IOException;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;

import com.mysql.jdbc.Statement;

public class CreateHiveExternalTable {

    public static String driverName = "org.apache.hadoop.hive.jdbc.HiveDriver";

    public static void createHiveExternalTable() throws FileNotFoundException, IOException, SQLException {
        try {
            Class.forName(driverName);
        } catch (ClassNotFoundException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }

        Configuration config = new Configuration();
        config.addResource(new Path("/usr/local/hadoop/etc/hadoop/conf/core-site.xml"));
        config.addResource(new Path("/usr/local/hadoop/etc/hadoop/conf/hdfs-site.xml"));

        Connection connect = DriverManager.getConnection("jdbc:hive://localhost/hivedb", "hive", "");
        Statement stmt = (Statement) connect.createStatement();
        //String tableName = properties.getProperty("hive_table_name");
        stmt.executeQuery("CREATE EXTERNAL TABLE IF NOT EXISTS"
            + "SHIPPER(S_ID INT,S_NAME VARCHAR(100),S_ADDR VARCHAR(100),S_CITY VARCHAR(100)"
            + "ROW FORMAT DELIMITED FIELDS TERMINATED BY ','"
            + "LOCATION 'hdfs://localhost://hive'");

        System.out.println("Table created.");
        connect.close();
    }

    public static void main(String[] args) throws FileNotFoundException, IOException, SQLException {
        CreateHiveExternalTable hiveTable = new CreateHiveExternalTable();
        hiveTable.createHiveExternalTable();
    }
}

2 Answers

Stack Overflow user

Accepted answer

Posted on 2016-12-30 12:27:47

hive.server2.thrift.port is the property where you can check the port.

In the Hive shell, issue the command "set hive.server2.thrift.port;", and it will print the port number Hive is listening on.

By default the Hive port is set to 10000, but you can verify it for your Hive installation with the command above.

1 vote

Stack Overflow user

Posted on 2016-12-30 07:14:56

From this post: Connect from Java to Hive using JDBC

Try

private static String driverName = "org.apache.hive.jdbc.HiveDriver";

instead of

private static String driverName = "org.apache.hadoop.hive.jdbc.HiveDriver";

I hope you have already added the Class.forName(driverName) statement in your code.
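The two stack traces in the question are connected: when Class.forName fails, no JDBC driver registers itself with DriverManager, so the subsequent getConnection call has nothing to match the URL against and fails with "No suitable driver". A minimal sketch using only the JDK reproduces both failures (the DriverCheck class and its driverPresent helper are illustrative names, not from the original post):

```java
import java.sql.DriverManager;
import java.sql.SQLException;

public class DriverCheck {
    // Returns true when the named class can be loaded from the classpath.
    static boolean driverPresent(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // The old package name from the question is not on a plain JVM
        // classpath, so Class.forName throws ClassNotFoundException.
        System.out.println(driverPresent("org.apache.hadoop.hive.jdbc.HiveDriver"));

        // Because no driver registered itself, DriverManager cannot match the
        // jdbc:hive:// URL -- the second exception in the question's trace.
        try {
            DriverManager.getConnection("jdbc:hive://localhost/EZYTRUK");
        } catch (SQLException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

So fixing the driver class name (and making sure the matching hive-jdbc jar is actually on the runtime classpath, not just in the pom) resolves both errors at once.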

Also, use:

    Connection connect = DriverManager.getConnection("jdbc:hive2://localhost:HIVEPORT/hivedb","hive","");

instead of

Connection connect = DriverManager.getConnection("jdbc:hive://localhost/hivedb","hive","");

I don't know which port you are running Hive on, but remember to change this part of the line:

localhost:HIVEPORT
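Putting the two answers together: the hive2 scheme plus an explicit port. The sketch below uses 10000, the default HiveServer2 port mentioned in the accepted answer; the hive2Url helper is an illustrative name, not from the original post. Substitute the port reported by "set hive.server2.thrift.port;" on your installation:

```java
public class Hive2Url {
    // Builds a HiveServer2 JDBC URL from its parts.
    static String hive2Url(String host, int port, String db) {
        return "jdbc:hive2://" + host + ":" + port + "/" + db;
    }

    public static void main(String[] args) {
        System.out.println(hive2Url("localhost", 10000, "hivedb"));
        // prints: jdbc:hive2://localhost:10000/hivedb
    }
}
```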
1 vote
Original content provided by Stack Overflow.
Original link:

https://stackoverflow.com/questions/41391630
