I am trying to get a Kafka and Spark Streaming example working, and I ran into a problem at runtime.
This is the exception:
com.fasterxml.jackson.databind.JsonMappingException: Incompatible Jackson version: 2.9.8
This is the build.sbt:
name := "SparkJobs"
version := "1.0"
scalaVersion := "2.11.6"
val sparkVersion = "2.4.1"
val flinkVersion = "1.7.2"
resolvers ++= Seq(
"Typesafe Releases" at "http://repo.typesafe.com/typesafe/releases/",
"apache snapshots" at "http://repository.apache.org/snapshots/",
"confluent.io" at "http://packages.confluent.io/maven/",
"Maven central" at "http://repo1.maven.org/maven2/"
)
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % sparkVersion,
"org.apache.spark" %% "spark-sql" % sparkVersion,
"org.apache.spark" %% "spark-streaming" % sparkVersion,
"org.apache.spark" %% "spark-streaming-kafka-0-10" % sparkVersion,
"org.apache.spark" %% "spark-hive" % sparkVersion
// ,"org.apache.flink" %% "flink-connector-kafka-0.10" % flinkVersion
, "org.apache.kafka" %% "kafka-streams-scala" % "2.2.0"
// , "io.confluent" % "kafka-streams-avro-serde" % "5.2.1"
)
//excludeDependencies ++= Seq(
//  // commons-logging is replaced by jcl-over-slf4j
//  ExclusionRule("com.fasterxml.jackson.module", "jackson-module-scala")
//)
That is the code.
Running sbt dependencyTree, I can see that spark-core_2.11-2.4.1.jar brings in jackson-databind-2.6.7.1, and it tells me that this version is evicted by 2.9.8, which means there is a conflict between libraries. But spark-core_2.11-2.4.1.jar is not the only one involved: kafka-streams-scala_2.11:2.2.0 uses jackson-databind 2.9.8. So I do not know from which library I have to evict jackson-databind 2.9.8: spark-core, kafka-streams-scala, or both?
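For reference, the dependencyTree task is only built into sbt from 1.4 onwards; on older sbt it comes from the sbt-dependency-graph plugin. A sketch of the project/plugins.sbt entry (the plugin version here is an assumption):

// project/plugins.sbt - enables dependencyTree / dependencyBrowseGraph
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.9.2")

Running the built-in evicted task in the sbt shell also lists which dependency versions were evicted and by what.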
How can I avoid Jackson library version 2.9.8 so that this job can start and run?
I assume I need the jackson-databind-2.6.7 version...
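For reference, one way to force a single Jackson version across the whole build in sbt is dependencyOverrides. This is only a sketch, assuming sbt 1.x (on sbt 0.13 dependencyOverrides is a Set rather than a Seq) and the Jackson versions that Spark 2.4.1 is built against:

// pin Jackson to what spark-core 2.4.1 expects, so that jackson-module-scala
// never sees a newer jackson-databind on the classpath
dependencyOverrides ++= Seq(
  "com.fasterxml.jackson.core" % "jackson-core" % "2.6.7",
  "com.fasterxml.jackson.core" % "jackson-databind" % "2.6.7.1",
  "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.6.7.1"
)

Note that forcing 2.6.7.x means kafka-streams-scala would then run against an older Jackson than it was compiled with, which is why dropping that dependency, as in the update below, is the cleaner fix.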
Update: taking the latest suggestion into account, it still does not work.
I removed the kafka-streams-scala dependency, which was trying to pull in Jackson 2.9.8, and used this build.sbt:
name := "SparkJobs"
version := "1.0"
scalaVersion := "2.11.6"
val sparkVersion = "2.4.1"
val flinkVersion = "1.7.2"
val kafkaStreamScala = "2.2.0"
resolvers ++= Seq(
"Typesafe Releases" at "http://repo.typesafe.com/typesafe/releases/",
"apache snapshots" at "http://repository.apache.org/snapshots/",
"confluent.io" at "http://packages.confluent.io/maven/",
"Maven central" at "http://repo1.maven.org/maven2/"
)
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % sparkVersion ,
"org.apache.spark" %% "spark-sql" % sparkVersion,
"org.apache.spark" %% "spark-streaming" % sparkVersion,
"org.apache.spark" %% "spark-streaming-kafka-0-10" % sparkVersion,
"org.apache.spark" %% "spark-hive" % sparkVersion
)
But then I got a new exception.
Update 2
Got it. Now I understand the second exception: I had forgotten awaitTermination.
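For context, here is a minimal sketch of the streaming skeleton with the missing call in place, using the spark-streaming-kafka-0-10 API from the build.sbt above; the topic name, broker address, group id and batch interval are made up for illustration and are not taken from the original code:

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.{ConsumerStrategies, KafkaUtils, LocationStrategies}

object KafkaSparkStreamingJob {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("SparkJobs").setMaster("local[*]")
    val ssc = new StreamingContext(conf, Seconds(5))   // assumed batch interval

    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "localhost:9092",          // assumed broker
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "spark-jobs",                       // assumed group id
      "auto.offset.reset" -> "latest",
      "enable.auto.commit" -> (false: java.lang.Boolean)
    )

    // direct stream against an assumed topic name
    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      LocationStrategies.PreferConsistent,
      ConsumerStrategies.Subscribe[String, String](Seq("my-topic"), kafkaParams)
    )

    stream.map(_.value).print()

    ssc.start()
    // block the driver so the streaming computation keeps running
    ssc.awaitTermination()
  }
}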
Answered on 2019-04-03 20:12:36
But when you use Spark Streaming's Kafka integration you do not need kafka-streams-scala, so you really should remove it.
Similarly, kafka-streams-avro-serde is not something you would want to use with Spark; instead, you may find AbsaOSS/ABRiS useful.
https://stackoverflow.com/questions/55501772