My build.sbt file contains (I'm using IntelliJ):
scalaVersion := "2.11.8"
resolvers += "MavenRepository" at "http://central.maven.org/maven2"
resolvers += "spark-packages" at "https://dl.bintray.com/spark-packages/maven/"
libraryDependencies ++= {
val sparkVersion = "2.2.1"
Seq( "org.apache.spark" %% "spark-core" % sparkVersion )
}

I'm trying to build a jar and deploy it to Spark. I issued the following commands:
sbt compile
sbt assembly

Compilation succeeds, but assembly fails with the following error message:
java.lang.RuntimeException: Please add any Spark dependencies by supplying the sparkVersion and sparkComponents. Please remove: org.apache.spark:spark-core:2.2.1

I tried adding "provided" to avoid this, but then running the application itself fails, because the "provided" scope excludes those JARs.
What am I doing wrong?
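A side note on the "provided" issue raised above: a common sbt idiom is to mark Spark as "provided" (so it stays out of the fat jar, since the cluster already supplies it) while re-adding the provided dependencies to the run classpath, so `sbt run` still works locally. A minimal sketch in the sbt 0.13-era syntax used in this thread:

```scala
// build.sbt (sketch) -- Spark excluded from the fat jar via "provided"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.1" % "provided"

// Re-include "provided" dependencies when running from sbt,
// so `sbt run` works even though the assembled jar omits Spark.
run in Compile := Defaults.runTask(
  fullClasspath in Compile,
  mainClass in (Compile, run),
  runner in (Compile, run)
).evaluated
```

This keeps the deployed jar lean without breaking local development.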
Posted on 2018-03-22 09:35:23
You first need to add the sbt-assembly plugin, which will create the jar for you.
在plugins.sbt中
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")

Add this to your build.sbt:
mainClass := Some("your.main.Class") // fully qualified name of your main class, not the jar name
assemblyMergeStrategy in assembly := {
case PathList("META-INF", xs @ _*) => MergeStrategy.discard
case x => MergeStrategy.first
}

You can refer to my GitHub for creating the jar and deploying it:
https://stackoverflow.com/questions/49424957
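Putting the pieces from this answer together, a complete minimal build.sbt might look like the following sketch (versions taken from the question; the project name and main class are placeholders):

```scala
// build.sbt (sketch, sbt 0.13.x with sbt-assembly 0.14.x)
name := "spark-app"          // placeholder project name
scalaVersion := "2.11.8"

val sparkVersion = "2.2.1"

// "provided": the Spark cluster supplies these JARs, so keep them out of the fat jar
libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % "provided"

mainClass in assembly := Some("com.example.Main") // placeholder main class

// Resolve duplicate-file conflicts when merging dependency JARs
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}
```

Running `sbt assembly` should then produce a fat jar under target/scala-2.11/, which can be deployed with something like `spark-submit --class com.example.Main <jar>`.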