
Why does sbt assembly in a Spark application fail with "Modules were resolved with conflicting cross-version suffixes"?

Stack Overflow user
Asked on 2017-10-16 05:38:42
1 answer · 563 views · 0 followers · 1 vote

I am using a CDH cluster with Spark 2.1 and Scala 2.11.8.

I am using sbt 1.0.2.

When I run assembly, the error I get is:

error java.lang.RuntimeException: Conflicting cross-version suffixes in: org.scala-lang.modules:scala-xml, org.scala-lang.modules:scala-parser-combinators

I tried to override the version mismatch with dependencyOverrides and force(), but neither worked.

The error message from sbt assembly:

[error] Modules were resolved with conflicting cross-version suffixes in {file:/D:/Tools/scala_ide/test_workspace/test/NewSparkTest/}newsparktest:
[error]    org.scala-lang.modules:scala-xml _2.11, _2.12
[error]    org.scala-lang.modules:scala-parser-combinators _2.11, _2.12
[error] java.lang.RuntimeException: Conflicting cross-version suffixes in: org.scala-lang.modules:scala-xml, org.scala-lang.modules:scala-parser-combinators
[error]         at scala.sys.package$.error(package.scala:27)
[error]         at sbt.librarymanagement.ConflictWarning$.processCrossVersioned(ConflictWarning.scala:39)
[error]         at sbt.librarymanagement.ConflictWarning$.apply(ConflictWarning.scala:19)
[error]         at sbt.Classpaths$.$anonfun$ivyBaseSettings$64(Defaults.scala:1971)
[error]         at scala.Function1.$anonfun$compose$1(Function1.scala:44)
[error]         at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:42)
[error]         at sbt.std.Transform$$anon$4.work(System.scala:64)
[error]         at sbt.Execute.$anonfun$submit$2(Execute.scala:257)
[error]         at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:16)
[error]         at sbt.Execute.work(Execute.scala:266)
[error]         at sbt.Execute.$anonfun$submit$1(Execute.scala:257)
[error]         at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:167)
[error]         at sbt.CompletionService$$anon$2.call(CompletionService.scala:32)
[error]         at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error]         at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error]         at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error]         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[error]         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[error]         at java.lang.Thread.run(Thread.java:748)
[error] (*:update) Conflicting cross-version suffixes in: org.scala-lang.modules:scala-xml, org.scala-lang.modules:scala-parser-combinators
[error] Total time: 413 s, completed Oct 12, 2017 3:28:02 AM

build.sbt

name := "newtest"
version := "0.0.2"

scalaVersion := "2.11.8" 

sbtPlugin := true

val sparkVersion = "2.1.0"

mainClass in (Compile, run) := Some("com.testpackage.sq.newsparktest")

assemblyJarName in assembly := "newtest.jar"


libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % "2.1.0" % "provided",
  "org.apache.spark" % "spark-sql_2.11" % "2.1.0" % "provided",
  "com.databricks" % "spark-avro_2.11" % "3.2.0",
  "org.apache.spark" % "spark-hive_2.11" % "2.1.0" % "provided")


libraryDependencies +=
     "log4j" % "log4j" % "1.2.15" excludeAll(
       ExclusionRule(organization = "com.sun.jdmk"),
       ExclusionRule(organization = "com.sun.jmx"),
       ExclusionRule(organization = "javax.jms")
     )

resolvers += "SparkPackages" at "https://dl.bintray.com/spark-packages/maven/"
resolvers += Resolver.url("bintray-sbt-plugins", url("http://dl.bintray.com/sbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)

assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}

plugins.sbt

dependencyOverrides += ("org.scala-lang.modules" % "scala-xml_2.11" % "1.0.4")
dependencyOverrides += ("org.scala-lang.modules" % "scala-parser-combinators_2.11" % "1.0.4")
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")
resolvers += Resolver.url("bintray-sbt-plugins", url("https://dl.bintray.com/eed3si9n/sbt-plugins/"))(Resolver.ivyStylePatterns)

1 Answer

Stack Overflow user

Accepted answer

Posted on 2017-10-17 18:02:38

tl;dr Remove sbtPlugin := true from build.sbt (it is meant for sbt plugins, not applications).

You should also remove the dependencyOverrides lines from plugins.sbt.
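With those overrides removed, plugins.sbt keeps only the plugin declaration and its resolver (a sketch based on the file shown in the question):

```scala
// plugins.sbt — only plugin declarations and their resolvers belong here
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")
resolvers += Resolver.url("bintray-sbt-plugins", url("https://dl.bintray.com/eed3si9n/sbt-plugins/"))(Resolver.ivyStylePatterns)
```

dependencyOverrides in plugins.sbt affects the build's own (plugin) classpath, not the application's dependencies, so it does not fix the conflict anyway.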

You should change spark-core_2.11 and the other Spark dependencies in libraryDependencies to the following:

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0" % "provided"

The changes are to use %% (i.e. two percent signs) and to remove the Scala version from the middle part of the dependency, e.g. spark-core above.
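Applied to all of the dependencies from the question's build.sbt, that gives (a sketch, keeping the same versions and scopes):

```scala
val sparkVersion = "2.1.0"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql"  % sparkVersion % "provided",
  "org.apache.spark" %% "spark-hive" % sparkVersion % "provided",
  "com.databricks"   %% "spark-avro" % "3.2.0"
)
```

With %%, sbt appends the project's scalaVersion binary suffix (here _2.11) automatically, so every artifact resolves against the same Scala binary version instead of mixing _2.11 and _2.12.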

Votes: 1
The original content of this page is provided by Stack Overflow.
Original link: https://stackoverflow.com/questions/46763569