I created the following dependencies in my build.sbt file for Apache Spark on Scala 2.11:
name := "Project1"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.1"
libraryDependencies ++= Seq(
"org.scala-lang" % "scala-compiler" % "2.11.8",
"org.scala-lang" % "scala-reflect" % "2.11.8",
"org.scala-lang.modules" % "scala-parser-combinators_2.11" % "1.0.4",
"org.scala-lang.modules" % "scala-xml_2.11" % "1.0.4"
)

But IntelliJ cannot resolve the spark-core_2.11 dependency. I have tried several times, but without success. Thanks in advance.
Posted on 2017-02-24 12:21:47
In IntelliJ 2016.3.2, I ran into almost the same problem with the same Scala/Spark versions:
name := "some-project"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.0"

To get it working, I had to manually add the spark-core jar to my project library, i.e.:
https://stackoverflow.com/questions/40187945
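As an alternative worth trying (a sketch, not part of the original answer): sbt's `%%` operator appends the Scala binary-version suffix automatically based on `scalaVersion`, which avoids mismatches between the configured Scala version and a hand-written `_2.11` suffix:

```scala
// build.sbt — a minimal sketch, assuming the same Spark/Scala versions as above
name := "some-project"

version := "1.0"

scalaVersion := "2.11.8"

// "%%" appends "_2.11" automatically from scalaVersion;
// equivalent to "spark-core_2.11" but less error-prone
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0"
```

After changing build.sbt, refreshing the sbt project in IntelliJ (the refresh action in the sbt tool window) usually forces the dependencies to be re-resolved.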