I have some code that compiles (Scala + Spark 1.6). Then I run it (with Spark 1.6), but it complains that a 1.6 method doesn't exist. What gives??
simple.sbt:
name := "Simple Project"
version := "1.0"
scalaVersion := "2.10.4"
resolvers += "Typesafe Repo" at "http://repo.typesafe.com/typesafe/releases/"
resolvers += "Conjars" at "http://conjars.org/repo"
resolvers += "cljars" at "https://clojars.org/repo/"
mainClass in Compile := Some("Medtronic.Class")
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.6.0"
libraryDependencies += "org.elasticsearch" % "elasticsearch" % "1.7.2"
libraryDependencies += "org.elasticsearch" %% "elasticsearch-spark" % "2.1.1"
libraryDependencies += "com.github.nscala-time" %% "nscala-time" % "1.8.0"

Compiling:
$ sbt assembly
[info] Loading project definition from /Users/mlieber/projects/spark/test/project
[info] Set current project to Simple Project (in build file:/Users/mlieber/projects/spark/test/)
[info] Updating {file:/Users/mlieber/projects/spark/test/}test...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[warn] Scala version was updated by one of library dependencies:
[warn] * org.scala-lang:scala-library:(2.10.4, 2.10.0) -> 2.10.5
[warn] To force scalaVersion, add the following:
[warn] ivyScala := ivyScala.value map { _.copy(overrideScalaVersion = true) }
[warn] There may be incompatibilities among your library dependencies.
[warn] Here are some of the libraries that were evicted:
[warn] * org.apache.spark:spark-core_2.10:1.4.1 -> 1.6.0
[warn] Run 'evicted' to see detailed eviction warnings
..
[info] Run completed in 257 milliseconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
..
[info] Including from cache: spark-core_2.10-1.6.0.jar
..
[info] Including from cache: spark-streaming_2.10-1.6.0.jar
..
[info] Assembly up to date: /Users/mlieber/projects/spark/test/target/scala-2.10/stream_test_1.0.jar
[success] Total time: 98 s, completed Jan 28, 2016 4:05:22 PM

What I ran:
./app/spark-1.6.0-bin-hadoop2.6/bin/spark-submit --jars /Users/mlieber/app/elasticsearch-1.7.2/lib/elasticsearch-1.7.2.jar --master local[4] --class "MyClass" ./target/scala-2.10/stream_test_1.0.jar

Runtime error:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.streaming.dstream.PairDStreamFunctions.mapWithState(Lorg/apache/spark/streaming/StateSpec;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;)Lorg/apache/spark/streaming/dstream/MapWithStateDStream;
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
..
16/01/28 18:35:23 INFO SparkContext: Invoking stop() from shutdown hook

Answered 2016-01-29 05:26:42
Your project is suffering from dependency hell. What is happening is that SBT resolves transitive dependencies by default, and one of your dependencies (elasticsearch-spark) requires a different version of spark-core. From your log:
[warn] Here are some of the libraries that were evicted:
[warn] * org.apache.spark:spark-core_2.10:1.4.1 -> 1.6.0

It looks like the version elasticsearch-spark requires is not binary-compatible with the one your project uses, so an error shows up when your project runs.
There is no error at compile time because the code being compiled (that is, your code) is compatible with the resolved version.
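To see this kind of mismatch concretely, you can probe the runtime classpath with reflection. Below is a minimal, self-contained sketch; `MethodProbe` and `hasMethod` are names I made up for illustration. The demo probes `java.lang.String` because it is always available, but on a Spark classpath the same helper applied to `org.apache.spark.streaming.dstream.PairDStreamFunctions` and `mapWithState` would tell you whether the 1.6-era API actually exists at run time.

```scala
// Sketch: check whether a method exists on the class actually loaded at runtime.
// A NoSuchMethodError means the method was present at compile time but is
// missing from the version of the class found on the runtime classpath.
object MethodProbe {
  def hasMethod(className: String, methodName: String): Boolean =
    try Class.forName(className).getMethods.exists(_.getName == methodName)
    catch { case _: ClassNotFoundException => false }

  def main(args: Array[String]): Unit = {
    // Stand-in demo on a class that is always on the classpath:
    println(hasMethod("java.lang.String", "substring"))    // true
    println(hasMethod("java.lang.String", "mapWithState")) // false
    // With Spark on the classpath you would probe, e.g.:
    // hasMethod("org.apache.spark.streaming.dstream.PairDStreamFunctions", "mapWithState")
  }
}
```

Running this (e.g. in `spark-shell` with the assembled jar on the classpath) is a quick way to confirm which Spark API your application actually sees.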
Here are some options for how to solve this:

- Upgrade elasticsearch-spark to version 2.1.2 and see if it brings a newer spark-core version that is compatible with your project.
- Version 2.2.0-rc1 depends on Spark 1.6.0, and upgrading to it would certainly solve the problem, but keep in mind that you would be using a release candidate.
- Downgrade spark-core and spark-streaming to version 1.4.1 (the version used by elasticsearch-spark) and adjust your code where necessary.

Source: https://stackoverflow.com/questions/35076085
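The downgrade option, and an alternative exclusion approach, can be sketched in `simple.sbt` as follows. The version numbers mirror the eviction warning in the build log; verify them against the actual POMs on Maven Central before relying on this, and note that the exclusion variant only works if elasticsearch-spark 2.1.1 happens to be bytecode-compatible with Spark 1.6.0 at the call sites you use:

```scala
// simple.sbt — sketch of the downgrade: pin Spark to the version that
// elasticsearch-spark 2.1.1 was built against (per the eviction warning).
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.1"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.4.1"

// Alternative sketch: keep Spark 1.6.0 and stop elasticsearch-spark from
// pulling in its own spark-core transitively.
libraryDependencies += ("org.elasticsearch" %% "elasticsearch-spark" % "2.1.1")
  .exclude("org.apache.spark", "spark-core_2.10")
```

After changing the build, run `sbt evicted` to confirm that no conflicting spark-core version is being resolved before reassembling the jar.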