I installed Spark at C:\Spark1_6\spark-1.6.0-bin-hadoop2.6. After navigating to this path, I typed the sbt assembly command and received the following error message:
[error] Not a valid command: assembly
[error] Not a valid project ID: assembly
[error] Expected ':'
[error] Not a valid key: assembly
[error] assembly
[error] ^

Here is my sbt project structure:
-Project101
  -project
    -build.properties
    -plugins.sbt
  -src
  -build.sbt

Here is my build.sbt:
name := "Project101"
version := "1.0"
scalaVersion := "2.10.2"
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.10" % "1.6.0" exclude ("org.apache.hadoop", "hadoop-yarn-server-web-proxy"),
  "org.apache.spark" % "spark-sql_2.10" % "1.6.0" exclude ("org.apache.hadoop", "hadoop-yarn-server-web-proxy"),
  "org.apache.spark" %% "spark-hive" % "1.6.0",
  "org.apache.spark" %% "spark-streaming" % "1.6.0",
  "org.apache.spark" %% "spark-streaming-kafka" % "1.6.0"
)
resolvers in Global ++= Seq(
  "Sbt plugins" at "https://dl.bintray.com/sbt/sbt-plugin-releases",
  "Maven Central Server" at "http://repo1.maven.org/maven2",
  "TypeSafe Repository Releases" at "http://repo.typesafe.com/typesafe/releases/",
  "TypeSafe Repository Snapshots" at "http://repo.typesafe.com/typesafe/snapshots/"
)

Here is my plugins.sbt:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.12.0")

The sbt package command works and is able to create a jar file. But I also need to run sbt assembly, and it does not work.
Posted on 2018-12-25 21:48:27
Not a valid command: assembly

Whenever you see this error message, make sure the project where the sbt-assembly plugin is installed is the one in the top-level directory.

If your project is in the Project101 directory, make sure project/plugins.sbt contains the following line:

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.12.0")

With that in place, execute sbt assembly again from the Project101 directory. It should run the plugin and create an uber-jar.
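The check above can be sketched as a shell session. The /tmp/Project101 path is illustrative; the point is that sbt assembly must be run from the directory whose project/plugins.sbt declares the plugin:

```shell
# Create the expected layout (illustrative path) and confirm the plugin
# line is present before running `sbt assembly` from the project root.
mkdir -p /tmp/Project101/project
echo 'addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.12.0")' \
  > /tmp/Project101/project/plugins.sbt

cd /tmp/Project101
grep -c sbt-assembly project/plugins.sbt   # prints 1 when the plugin is declared
# `sbt assembly` run from this directory would now load the plugin;
# run from anywhere else, sbt reports "Not a valid command: assembly".
```

If the grep finds nothing, sbt never loads the plugin and the assembly command is simply unknown to it.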
Posted on 2022-08-03 16:33:45
As correctly pointed out in the other answer, this problem is caused by running sbt assembly from somewhere other than the project directory.

If you run into this problem inside a Docker container, make sure your mounts are set up correctly.
For example:

docker run ... -v $PWD/:/<your Scala project directory here> -w /<your Scala project directory here> <your build image> bash -c "sbt assembly"

https://stackoverflow.com/questions/44106182