I want to use Spark to work with Hive tables, but when I run my program I get the following error:
Exception in thread "main" java.lang.IllegalArgumentException: Unable to instantiate SparkSession with Hive support because Hive classes are not found.
My application code:
import org.apache.spark.sql.SparkSession

object spark_on_hive_table extends App {
val spark = SparkSession
.builder()
.appName("Spark Hive Example")
.config("spark.sql.warehouse.dir", "hdfs://localhost:54310/user/hive/warehouse")
.enableHiveSupport()
.getOrCreate()
import spark.implicits._
spark.sql("select * from pbSales").show()
}

build.sbt
version := "0.1"
scalaVersion := "2.11.12"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "2.3.2",
"org.apache.spark" %% "spark-sql" % "2.3.2",
"org.apache.spark" %% "spark-streaming" % "2.3.2",
"org.apache.spark" %% "spark-hive" % "2.3.2" % "provided"
)

Posted on 2020-06-18 18:09:23
Remove the "provided" scope from your spark-hive dependency. With "provided", the Hive classes are compiled against but left off the runtime classpath when you run the application directly (for example from sbt or an IDE), so enableHiveSupport() cannot find them. Change

"org.apache.spark" %% "spark-hive" % "2.3.2" % "provided"

to

"org.apache.spark" %% "spark-hive" % "2.3.2"

https://stackoverflow.com/questions/62456086
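For reference, a minimal sketch of the corrected build.sbt, keeping the Spark 2.3.2 and Scala 2.11.12 versions from the question:

```scala
// build.sbt -- sketch of the corrected dependency list
version := "0.1"
scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"      % "2.3.2",
  "org.apache.spark" %% "spark-sql"       % "2.3.2",
  "org.apache.spark" %% "spark-streaming" % "2.3.2",
  // No "provided" scope here: spark-hive must be on the runtime
  // classpath when the app is launched directly from sbt or an IDE.
  "org.apache.spark" %% "spark-hive"      % "2.3.2"
)
```

Note that if you later submit the application with spark-submit to a cluster whose Spark distribution already includes Hive support, marking the Spark dependencies as "provided" is appropriate again, since the cluster supplies those jars at runtime.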