
Runtime error although sbt compiles (transitive dependencies)

Stack Overflow user
Asked on 2016-01-29 02:54:43
1 answer · 494 views · 0 followers · Score 1

I have a piece of code that compiles (Scala + Spark 1.6). When I run it (with Spark 1.6), it complains that a 1.6 method does not exist. What gives??

simple.sbt:

name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.4"

resolvers += "Typesafe Repo" at "http://repo.typesafe.com/typesafe/releases/"
resolvers += "Conjars" at "http://conjars.org/repo"
resolvers += "cljars" at "https://clojars.org/repo/"

mainClass in Compile := Some("Medtronic.Class")

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.6.0"
libraryDependencies += "org.elasticsearch" % "elasticsearch" % "1.7.2"
libraryDependencies += "org.elasticsearch" %% "elasticsearch-spark" % "2.1.1"
libraryDependencies += "com.github.nscala-time" %% "nscala-time" % "1.8.0"

Compiling:

$ sbt assembly
[info] Loading project definition from /Users/mlieber/projects/spark/test/project
[info] Set current project to Simple Project (in build file:/Users/mlieber/projects/spark/test/)
[info] Updating {file:/Users/mlieber/projects/spark/test/}test...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[warn] Scala version was updated by one of library dependencies:
[warn]  * org.scala-lang:scala-library:(2.10.4, 2.10.0) -> 2.10.5
[warn] To force scalaVersion, add the following:
[warn]  ivyScala := ivyScala.value map { _.copy(overrideScalaVersion = true) }
[warn] There may be incompatibilities among your library dependencies.
[warn] Here are some of the libraries that were evicted:
[warn]  * org.apache.spark:spark-core_2.10:1.4.1 -> 1.6.0
[warn] Run 'evicted' to see detailed eviction warnings
..

[info] Run completed in 257 milliseconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
..
[info] Including from cache: spark-core_2.10-1.6.0.jar
..
[info] Including from cache: spark-streaming_2.10-1.6.0.jar
..
[info] Assembly up to date: /Users/mlieber/projects/spark/test/target/scala-2.10/stream_test_1.0.jar
[success] Total time: 98 s, completed Jan 28, 2016 4:05:22 PM

What I run:

./app/spark-1.6.0-bin-hadoop2.6/bin/spark-submit --jars /Users/mlieber/app/elasticsearch-1.7.2/lib/elasticsearch-1.7.2.jar  --master local[4] --class "MyClass"    ./target/scala-2.10/stream_test_1.0.jar 

Runtime error:

    Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.streaming.dstream.PairDStreamFunctions.mapWithState(Lorg/apache/spark/streaming/StateSpec;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;)Lorg/apache/spark/streaming/dstream/MapWithStateDStream;    
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
..
    16/01/28 18:35:23 INFO SparkContext: Invoking stop() from shutdown hook

1 Answer

Stack Overflow user

Answered on 2016-01-29 05:26:42

Your project is suffering from dependency hell. What is happening is that SBT resolves transitive dependencies by default, and one of your dependencies (elasticsearch-spark) requires a different version of spark-core. From your log:

[warn] Here are some of the libraries that were evicted:
[warn]  * org.apache.spark:spark-core_2.10:1.4.1 -> 1.6.0

It looks like the version required by elasticsearch-spark is not binary compatible with the one your project uses, so the error only shows up when your project runs.

There is no error at compile time because the code being compiled (i.e., your code) is compatible with the version that was resolved.
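The build log above already points at the diagnostic step: sbt's built-in `evicted` task lists exactly which transitive versions were replaced during resolution (a minimal sketch; the output will vary by project):

```shell
# From the project root: list dependencies whose requested version was
# evicted in favor of another (e.g. spark-core 1.4.1 -> 1.6.0 here).
sbt evicted
```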

Here are some options for solving this:

  1. You can try upgrading elasticsearch-spark to version 2.1.2 and see whether it brings in a newer spark-core (one that is compatible with your project). Version 2.2.0-rc1 depends on spark-core 1.6.0, so upgrading to it would certainly solve the problem, but keep in mind that you would be using a release candidate.
  2. You can downgrade spark-core and spark-streaming to version 1.4.1 (the version used by elasticsearch-spark), adjusting your code where necessary.
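A third route, if you want to keep both spark-core 1.6.0 and elasticsearch-spark 2.1.1, is to pin the resolved versions in `simple.sbt`. This is a sketch using sbt 0.13's `dependencyOverrides` key; it only controls which version Ivy resolves, and does not guarantee that elasticsearch-spark 2.1.1 is actually binary compatible with the forced versions at runtime:

```scala
// simple.sbt -- force resolution to 1.6.0 even when a transitive
// dependency (elasticsearch-spark) requests spark-core 1.4.1.
dependencyOverrides ++= Set(
  "org.apache.spark" %% "spark-core"           % "1.6.0",
  "org.apache.spark" %  "spark-streaming_2.10" % "1.6.0"
)
```

After adding this, re-run `sbt evicted` to confirm that no spark artifact is still being pulled in at an unexpected version.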
Score 2
Page content originally provided by Stack Overflow.
Original link: https://stackoverflow.com/questions/35076085