I've been fighting this for a while and I'm now officially stuck. I'm trying to compile a jar containing a simple Scala/Spark job to run on Azure Databricks, including a dependency on CosmosDB. Importing azure-cosmosdb-spark introduces conflicting cross-version errors at compile time, which I believe is the result of some transitive dependency. I've tried many different Spark and Scala versions, but none of them helped, and the error message isn't very informative.
My minimal reproduction of the error is a build.sbt on its own, with no Scala classes being compiled at all. Here is my build.sbt:
name := "ranchero"
version := "0.0.1"
scalaVersion := "2.11.8"
val sparkVersion = "2.2.0"
// additional libraries
// additional libraries
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-mllib" % sparkVersion % "provided",
  "joda-time" % "joda-time" % "2.9.9",
  "org.scalatest" %% "scalatest" % "3.0.0" % "test",
  "com.microsoft.azure" % "azure-cosmosdb-spark_2.2.0_2.11" % "1.1.0"
)
resolvers ++= Seq(
  "apache-snapshots" at "http://repository.apache.org/snapshots/",
  "Maven central" at "http://repo1.maven.org/maven2/"
)

If I comment out the cosmosdb dependency, everything compiles fine. With that dependency added, I get errors of this flavor:
[error] Modules were resolved with conflicting cross-version suffixes in {file:/home/*******/development/ranchero/}ranchero:
[error] org.apache.spark:spark-launcher _2.10, _2.11
[error] org.json4s:json4s-ast _2.10, _2.11
[error] org.apache.spark:spark-network-shuffle _2.10, _2.11
[error] com.twitter:chill _2.10, _2.11
[error] org.json4s:json4s-jackson _2.10, _2.11
[error] com.fasterxml.jackson.module:jackson-module-scala _2.10, _2.11
[error] org.json4s:json4s-core _2.10, _2.11
[error] org.apache.spark:spark-unsafe _2.10, _2.11
[error] org.apache.spark:spark-core _2.10, _2.11
[error] org.apache.spark:spark-network-common _2.10, _2.11
[error] java.lang.RuntimeException: Conflicting cross-version suffixes in: org.apache.spark:spark-launcher, org.json4s:json4s-ast, org.apache.spark:spark-network-shuffle, com.twitter:chill, org.json4s:json4s-jackson, com.fasterxml.jackson.module:jackson-module-scala, org.json4s:json4s-core, org.apache.spark:spark-unsafe, org.apache.spark:spark-core, org.apache.spark:spark-network-common

Unfortunately, that isn't much help. Any suggestions on how to correct this?
Posted on 2018-05-24 15:59:03
Let sbt handle the Scala version for all of your dependencies. Try replacing the azure-cosmosdb dependency with:

"com.microsoft.azure" %% "azure-cosmosdb-spark_2.2.0" % "1.1.0"

The double percent tells sbt to handle the Scala version suffix of the dependency. At the very least, the sbt shell starts up with the corrected build.sbt.
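For context, `%%` simply appends the project's Scala binary version to the artifact name, so with `scalaVersion := "2.11.8"` the two coordinates below resolve to the same artifact. This is a sketch of the expansion; the `_2.2.0_2.11` name follows the connector's `azure-cosmosdb-spark_<sparkVersion>_<scalaVersion>` naming scheme:

```scala
// With %%, sbt appends the Scala binary version suffix ("_2.11") automatically:
"com.microsoft.azure" %% "azure-cosmosdb-spark_2.2.0" % "1.1.0"

// ...which is equivalent to spelling out the full artifact name with a single %:
"com.microsoft.azure" % "azure-cosmosdb-spark_2.2.0_2.11" % "1.1.0"
```

Hardcoding the suffix with `%` works until you bump `scalaVersion`, at which point the two fall out of sync; `%%` keeps them aligned.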
Posted on 2018-07-05 00:29:32
While I agree with handling the Scala version via `%%`, that wasn't enough in my case. In the end I got it working by manually excluding the offending dependencies from the CosmosDB library:
"com.microsoft.azure" %% "azure-cosmosdb-spark_2.2.0" % "1.1.1"
exclude ("org.apache.spark", "spark-launcher_2.10")
exclude ("org.json4s", "json4s-ast_2.10")
exclude ("org.apache.spark", "spark-network-shuffle_2.10")
exclude ("com.twitter", "chill_2.10")
exclude ("org.json4s", "json4s-jackson_2.10")
exclude ("com.fasterxml.jackson.module", "jackson-module-scala_2.10")
exclude ("org.json4s", "json4s-core_2.10")
exclude ("org.apache.spark", "spark-unsafe_2.10")
exclude ("org.apache.spark", "spark-core_2.10")
exclude ("org.apache.spark", "spark-network-common_2.10")

This assumes you are using Scala 2.11 in your project. I have no explanation for why this is necessary, however. Perhaps for some reason the 2.11 CosmosDB library on Maven links against 2.10 dependencies...
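A more compact variant of the same idea (an untested sketch): instead of listing every `_2.10` artifact, `excludeAll` with `ExclusionRule` can drop all transitive artifacts from the offending organizations, relying on the explicit `"provided"` Spark dependencies in build.sbt to supply the correct 2.11 builds:

```scala
"com.microsoft.azure" %% "azure-cosmosdb-spark_2.2.0" % "1.1.1" excludeAll(
  // Drop every transitive artifact from the organizations that pulled in
  // _2.10 builds; Spark itself is already declared "provided" above.
  ExclusionRule(organization = "org.apache.spark"),
  ExclusionRule(organization = "org.json4s"),
  ExclusionRule(organization = "com.twitter"),
  ExclusionRule(organization = "com.fasterxml.jackson.module")
)
```

The trade-off is that an organization-wide rule also excludes any correctly-suffixed `_2.11` artifacts from those groups, so this only works when the excluded modules are all supplied elsewhere on the classpath.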
https://stackoverflow.com/questions/50498636