I'm trying to add GraphFrames to my Scala Spark application. When I added the 2.10-based build, everything worked fine. However, when I try to build it with the GraphFrames build for Scala 2.11, it fails.
The problem is a conflict between Scala versions (2.10 and 2.11). I get the following error:
[error] Modules were resolved with conflicting cross-version suffixes in {file:/E:/Documents/School/LSDE/hadoopcryptoledger/examples/scala-spark-graphx-bitcointransaction/}root:
[error] org.apache.spark:spark-launcher _2.10, _2.11
[error] org.json4s:json4s-ast _2.10, _2.11
[error] org.apache.spark:spark-network-shuffle _2.10, _2.11
[error] com.twitter:chill _2.10, _2.11
[error] org.json4s:json4s-jackson _2.10, _2.11
[error] com.fasterxml.jackson.module:jackson-module-scala _2.10, _2.11
[error] org.json4s:json4s-core _2.10, _2.11
[error] org.apache.spark:spark-unsafe _2.10, _2.11
[error] org.apache.spark:spark-core _2.10, _2.11
[error] org.apache.spark:spark-network-common _2.10, _2.11

However, I can't diagnose what is causing this. Here is my full build.sbt:
import sbt._
import Keys._
import scala._
lazy val root = (project in file("."))
.settings(
name := "example-hcl-spark-scala-graphx-bitcointransaction",
version := "0.1"
)
.configs( IntegrationTest )
.settings( Defaults.itSettings : _*)
scalacOptions += "-target:jvm-1.7"
crossScalaVersions := Seq("2.11.8")
resolvers += Resolver.mavenLocal
fork := true
jacoco.settings
itJacoco.settings
assemblyJarName in assembly := "example-hcl-spark-scala-graphx-bitcointransaction.jar"
libraryDependencies += "com.github.zuinnote" % "hadoopcryptoledger-fileformat" % "1.0.7" % "compile"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.0" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-graphx" % "1.5.0" % "provided"
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.7.0" % "provided"
libraryDependencies += "javax.servlet" % "javax.servlet-api" % "3.0.1" % "it"
libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.7.0" % "it" classifier "" classifier "tests"
libraryDependencies += "org.apache.hadoop" % "hadoop-hdfs" % "2.7.0" % "it" classifier "" classifier "tests"
libraryDependencies += "org.apache.hadoop" % "hadoop-minicluster" % "2.7.0" % "it"
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.2.0" % "provided"
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test,it"
libraryDependencies += "graphframes" % "graphframes" % "0.5.0-spark2.1-s_2.11"

Can anyone point out which dependency is based on Scala 2.10 and causing the build to fail?
Posted on 2017-10-23 03:59:30
I found out what the problem was. Apparently, if you use:

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.0" % "provided"

it defaults to the Scala 2.10 version. The build succeeded once I changed the spark-core and spark-graphx dependencies to:
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0"
libraryDependencies += "org.apache.spark" % "spark-graphx_2.11" % "2.2.0" % "provided"
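An alternative to hard-coding the `_2.11` suffix on each artifact is to pin `scalaVersion` and keep the `%%` operator, which appends the suffix for the build's Scala version automatically. A minimal sketch, assuming Spark 2.2.0 throughout and the Spark Packages resolver for GraphFrames (the resolver line is an assumption, not part of the original build.sbt):

```scala
// build.sbt (sketch): pin the Scala version so %% resolves _2.11 artifacts
scalaVersion := "2.11.8"

// %% appends "_2.11" automatically, so all Spark modules agree on one suffix
libraryDependencies += "org.apache.spark" %% "spark-core"   % "2.2.0" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-graphx" % "2.2.0" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-sql"    % "2.2.0" % "provided"

// GraphFrames is published with an explicit suffix in its version string,
// so it uses a single % (assumed hosted on the Spark Packages repository)
resolvers += "Spark Packages" at "https://repos.spark-packages.org/"
libraryDependencies += "graphframes" % "graphframes" % "0.5.0-spark2.1-s_2.11"
```

With this layout, moving to another Scala version only requires changing `scalaVersion` (and the GraphFrames version string), rather than editing every dependency line.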