I have the following build.sbt file:
name := "myProject"
version := "1.0"
scalaVersion := "2.11.8"
javaOptions ++= Seq("-Xms512M", "-Xmx2048M", "-XX:MaxPermSize=2048M", "-XX:+CMSClassUnloadingEnabled")
dependencyOverrides ++= Set(
"com.fasterxml.jackson.core" % "jackson-core" % "2.8.1"
)
// additional libraries
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "2.0.0" % "provided",
"org.apache.spark" %% "spark-sql" % "2.0.0" % "provided",
"org.apache.spark" %% "spark-hive" % "2.0.0" % "provided",
"com.databricks" %% "spark-csv" % "1.4.0",
"org.scalactic" %% "scalactic" % "2.2.1",
"org.scalatest" %% "scalatest" % "2.2.1" % "test",
"org.scalacheck" %% "scalacheck" % "1.12.4",
"com.holdenkarau" %% "spark-testing-base" % "2.0.0_0.4.4" % "test"
)
However, when I run the code, I get this error:
An exception or error caused a run to abort.
java.lang.ExceptionInInitializerError
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Jackson version is too old 2.4.4
at com.fasterxml.jackson.module.scala.JacksonModule$class.setupModule(JacksonModule.scala:56)
at com.fasterxml.jackson.module.scala.DefaultScalaModule.setupModule(DefaultScalaModule.scala:19)
at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:549)
at org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:82)
at org.apache.spark.rdd.RDDOperationScope$.<clinit>(RDDOperationScope.scala)
... 58 more
Why is this happening? I already added a newer Jackson version in dependencyOverrides (after looking at Spark Parallelize? (Could not find creator property with name 'id') here), so the old version should not be used.
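One way to see which Jackson version actually ends up on the classpath is to print the resolved dependency tree. A sketch, assuming sbt 0.13.x with the sbt-dependency-graph plugin added to the project:

```shell
# project/plugins.sbt — enable the plugin first (version is an example):
#   addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.8.2")

# Then print the resolved tree and look at the Jackson artifacts;
# any entry still at 2.4.x is the one Spark pulls in transitively.
sbt dependencyTree | grep jackson
```

If jackson-databind still resolves to 2.4.4 while jackson-core is overridden to 2.8.1, the version check in jackson-module-scala fails with exactly the "Jackson version is too old" error above.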
Posted on 2016-08-01 06:53:33
The jackson-core and jackson-databind versions should match (at least in the minor version, I believe).
So either remove dependencyOverrides and add jackson-databind directly to libraryDependencies:
libraryDependencies ++= Seq(
...
"com.fasterxml.jackson.core" % "jackson-databind" % "2.8.1"
)
or specify both in dependencyOverrides:
dependencyOverrides ++= Set(
"com.fasterxml.jackson.core" % "jackson-core" % "2.8.1",
"com.fasterxml.jackson.core" % "jackson-databind" % "2.8.1"
)
That said, I'm not sure I understand what you are doing; the linked question seems to say you should be using the old version (2.4.4).
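For completeness, a sketch of the override that usually clears this class of error with Spark 2.0: pin jackson-core, jackson-databind, and jackson-module-scala to the same version (2.8.1 here matches the question; any consistent trio within one minor version should behave the same):

```scala
// build.sbt fragment — keep all three Jackson artifacts on one version,
// so the version check in jackson-module-scala's setupModule passes.
dependencyOverrides ++= Set(
  "com.fasterxml.jackson.core" % "jackson-core" % "2.8.1",
  "com.fasterxml.jackson.core" % "jackson-databind" % "2.8.1",
  "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.8.1"
)
```

Overriding only jackson-core, as in the original build.sbt, leaves jackson-databind at the 2.4.4 that Spark brings in transitively, which is exactly what the stack trace complains about.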
https://stackoverflow.com/questions/38686318