
Using HiveContext in Spark throws an exception

Stack Overflow user
Asked on 2017-03-07 12:30:12
1 answer · 1.3K views · 0 followers · 2 votes

I have to use HiveContext instead of SQLContext because I need some window functions that are only available through HiveContext. I added the following lines to my pom.xml:

<dependency>
   <groupId>org.apache.spark</groupId>
   <artifactId>spark-hive_2.10</artifactId>
   <version>1.6.0</version>
</dependency>
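A side note (editorial, not part of the original question): on clusters that ship their own Spark and Hive jars, such as CDH, the Spark artifacts are often declared with `provided` scope in the pom so that the cluster's versions are used at runtime instead of copies bundled into the application jar, which is a common source of exactly this kind of version clash. A sketch of what that would look like:

```xml
<dependency>
   <groupId>org.apache.spark</groupId>
   <artifactId>spark-hive_2.10</artifactId>
   <version>1.6.0</version>
   <!-- rely on the cluster's own Spark/Hive jars at runtime -->
   <scope>provided</scope>
</dependency>
```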

The Spark version on the machine where I run the code is also 1.6.0. However, when I submit my code with spark-submit, I get the following exception:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Iface.get_all_functions()Lorg/apache/hadoop/hive/metastore/api/GetAllFunctionsResponse;

Here is the stack trace:

    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getAllFunctions(HiveMetaStoreClient.java:2060)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:105)
    at com.sun.proxy.$Proxy27.getAllFunctions(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient$SynchronizedHandler.invoke(HiveMetaStoreClient.java:1998)
    at com.sun.proxy.$Proxy27.getAllFunctions(Unknown Source)
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3179)
    at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:210)
    at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:197)
    at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:307)
    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:268)
    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:243)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:512)
    at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:194)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:249)
    at org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:329)
    at org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:239)
    at org.apache.spark.sql.hive.HiveContext$$anon$2.<init>(HiveContext.scala:459)
    at org.apache.spark.sql.hive.HiveContext.catalog$lzycompute(HiveContext.scala:459)
    at org.apache.spark.sql.hive.HiveContext.catalog(HiveContext.scala:458)
    at org.apache.spark.sql.hive.HiveContext$$anon$3.<init>(HiveContext.scala:475)
    at org.apache.spark.sql.hive.HiveContext.analyzer$lzycompute(HiveContext.scala:475)
    at org.apache.spark.sql.hive.HiveContext.analyzer(HiveContext.scala:474)
    at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:34)
    at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:133)
    at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:52)
    at org.apache.spark.sql.SQLContext.baseRelationToDataFrame(SQLContext.scala:442)
    at org.apache.spark.sql.DataFrameReader.jdbc(DataFrameReader.scala:223)
    at org.apache.spark.sql.DataFrameReader.jdbc(DataFrameReader.scala:146)
    at com.cloudera.sparkwordcount.FindServers.main(FindServers.java:74)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Does anyone have an idea what is going on?
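For anyone debugging the same trace, an editorial note: a `NoSuchMethodError` at runtime means the class that actually got loaded is a different version than the one the caller (here, Spark's Hive client) was compiled against. A quick way to check which jar a class was loaded from, and whether it exposes the expected method, is reflection. This is a generic sketch, not from the original post; JDK classes stand in for the Hive ones so it runs standalone:

```java
import java.lang.reflect.Method;

public class ClasspathCheck {

    // True if the class loaded at runtime exposes a public method with this name.
    static boolean hasMethod(Class<?> clazz, String name) {
        for (Method m : clazz.getMethods()) {
            if (m.getName().equals(name)) {
                return true;
            }
        }
        return false;
    }

    // The jar (or directory) a class was actually loaded from; null for JDK bootstrap classes.
    static String loadedFrom(Class<?> clazz) {
        java.security.CodeSource src = clazz.getProtectionDomain().getCodeSource();
        return src == null ? null : src.getLocation().toString();
    }

    public static void main(String[] args) {
        // In the real case you would inspect the metastore interface, e.g.:
        //   Class<?> iface = Class.forName(
        //       "org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Iface");
        //   hasMethod(iface, "get_all_functions");
        //   loadedFrom(iface);
        // JDK classes are used below as stand-ins so this sketch is self-contained.
        System.out.println(hasMethod(String.class, "isEmpty"));           // true
        System.out.println(hasMethod(String.class, "get_all_functions")); // false
        System.out.println(loadedFrom(String.class));                     // null (bootstrap)
    }
}
```

If `loadedFrom` points at an unexpected jar, or `hasMethod` returns false for a method the stack trace expects, the classpath is pulling in the wrong version.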


1 Answer

Stack Overflow user

Accepted answer

Posted on 2019-07-03 09:27:16

I solved it by replacing the jar and explicitly passing hive-metastore.jar in my spark-submit command:

--conf /path_to/hive-metastore.jar

The jar I had been using before was the wrong one: it belonged to a different environment.

The working jar is located in the Cloudera directory (this depends on the distribution), for example: /cloudera/opt/cloudera/parcels/SPARK2-2.2.0.cloudera2-1.cdh5.12.0.p0.232957/lib/spark2/jars/hive-metastore-1.1.0-cdh5.12.0.jar
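For reference (editorial note, not part of the original answer): spark-submit's usual mechanism for putting an extra jar on the driver and executor classpath is the `--jars` flag rather than a bare `--conf`, so the fix above would typically be written along these lines. The application class is taken from the question's stack trace; `my-app.jar` is a hypothetical application jar, and the metastore path should be adjusted to your distribution:

```shell
spark-submit \
  --class com.cloudera.sparkwordcount.FindServers \
  --jars /cloudera/opt/cloudera/parcels/SPARK2-2.2.0.cloudera2-1.cdh5.12.0.p0.232957/lib/spark2/jars/hive-metastore-1.1.0-cdh5.12.0.jar \
  my-app.jar
```

In stubborn cases of metastore client conflicts, `--conf spark.driver.extraClassPath=...` can additionally be used to prepend the correct jar ahead of whatever the default classpath provides.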

Cheers, Michał

Votes: 0
Original page content provided by Stack Overflow; translation supported by Tencent Cloud's IT-domain translation engine.
Original link:

https://stackoverflow.com/questions/42648370
