I'm trying to use Spark 2.0 with Play! 2.5, but I haven't managed to make it work properly (and there seem to be no examples of this on GitHub).
I don't get any compilation errors, but I do get some strange execution errors.
For example, almost every operation on a Dataset or DataFrame leads to a NullPointerException:
val ds: Dataset[Event] = df.as[Event]
println(ds.count()) // Works fine and prints the right result
ds.collect() // --> NullPointerException
ds.show() also leads to a NullPointerException.
So there is a big problem somewhere that I'm missing, and I suspect it comes from incompatible versions. Here is the relevant part of my build.sbt:
object Version {
  val scala = "2.11.8"
  val spark = "2.0.0"
  val postgreSQL = "9.4.1211.jre7"
}

object Library {
  val sparkSQL = "org.apache.spark" %% "spark-sql" % Version.spark
  val sparkMLLib = "org.apache.spark" %% "spark-mllib" % Version.spark
  val sparkCore = "org.apache.spark" %% "spark-core" % Version.spark
  val postgreSQL = "org.postgresql" % "postgresql" % Version.postgreSQL
}

object Dependencies {
  import Library._
  val dependencies = Seq(
    sparkSQL,
    sparkMLLib,
    sparkCore,
    postgreSQL)
}

lazy val root = (project in file("."))
  .settings(scalaVersion := Version.scala)
  .enablePlugins(PlayScala)

libraryDependencies ++= Dependencies.dependencies

dependencyOverrides ++= Set(
  "com.fasterxml.jackson.core" % "jackson-databind" % "2.7.4",
  "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.7.4"
)

Posted on 2017-03-07 15:37:20
I ran into the same problem with Spark 2.0.0 while using Play 2.5.12 (Java). By default, Activator seems to include com.fasterxml.jackson.core jackson-databind 2.7.8, and that version does not work with jackson-module-scala.
I cleaned my sbt cache:

rm -r ~/.ivy2/cache

My new build.sbt still produces a loud warning at compile time, because Spark 2.0.0 was compiled against jackson-module-scala_2.11:2.6.5, but it still seems to work with jackson-module-scala 2.8.7:
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
  "com.fasterxml.jackson.core" % "jackson-core" % "2.8.7",
  "com.fasterxml.jackson.core" % "jackson-databind" % "2.8.7",
  "com.fasterxml.jackson.core" % "jackson-annotations" % "2.8.7",
  "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.8.7",
  "org.apache.spark" % "spark-core_2.11" % "2.0.0",
  "org.apache.spark" % "spark-mllib_2.11" % "2.0.0"
)

The NullPointerException derives from a jackson.databind.JsonMappingException: "Incompatible Jackson version: 2.x.x". Please read https://github.com/FasterXML/jackson-module-scala/issues/233
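The underlying rule, as that issue describes it, is that jackson-module-scala refuses to run against a jackson-databind from a different 2.x minor line. A small standalone sketch (the `JacksonVersionCheck` helper is hypothetical, not part of Jackson or of the build) illustrates the kind of check that fails at runtime:

```scala
// Sketch: check whether two Jackson artifact versions share the same 2.x
// minor line, which is roughly what jackson-module-scala enforces before
// throwing "Incompatible Jackson version".
object JacksonVersionCheck {
  // Extract (major, minor) from a version string like "2.8.7"
  def majorMinor(v: String): (Int, Int) = {
    val parts = v.split("\\.")
    (parts(0).toInt, parts(1).toInt)
  }

  def compatible(databind: String, moduleScala: String): Boolean =
    majorMinor(databind) == majorMinor(moduleScala)

  def main(args: Array[String]): Unit = {
    // Aligned versions, as in the working build.sbt above
    println(JacksonVersionCheck.compatible("2.8.7", "2.8.7"))
    // Mismatched minor lines (2.7 vs 2.6) -> JsonMappingException at runtime
    println(JacksonVersionCheck.compatible("2.7.4", "2.6.5"))
  }
}
```

This is why overriding only jackson-databind (as in the question's build.sbt) is not enough: every Jackson artifact on the classpath has to land on the same minor line.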
https://stackoverflow.com/questions/39913510
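An alternative direction, sketched here under the assumption stated above that Spark 2.0.0 was built against Jackson 2.6.5, would be to pin every Jackson artifact down to Spark's version instead of up to 2.8.7. Whether Play 2.5's own JSON stack tolerates 2.6.5 is not verified here:

// Hypothetical alternative override set: align all Jackson artifacts with
// the 2.6.5 line that Spark 2.0.0 was compiled against.
dependencyOverrides ++= Set(
  "com.fasterxml.jackson.core" % "jackson-core" % "2.6.5",
  "com.fasterxml.jackson.core" % "jackson-databind" % "2.6.5",
  "com.fasterxml.jackson.core" % "jackson-annotations" % "2.6.5",
  "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.6.5"
)

Either way, the key point is a single consistent version across jackson-core, jackson-databind, jackson-annotations, and jackson-module-scala.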