
AbstractMethodError when creating a new StreamingContext

Stack Overflow user
Asked on 2019-08-01 04:51:15
1 answer · 214 views · 0 followers · 0 votes

I've been running into a problem trying to instantiate a new StreamingContext for Spark Streaming.

I'm trying to create a new StreamingContext, but an AbstractMethodError is thrown. Stepping through the stack trace, I found that the application stops with this error while StreamingListenerBus is constructing the third Spark ListenerBus.

Here is the code I'm trying to run:

package platform.etl

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object ClickGeneratorStreaming {
  def main(args: Array[String]): Unit = {

    val conf = new SparkConf().setAppName("ClickGeneratorStreaming").setMaster("local[*]")
    val ssc = new StreamingContext(conf, Seconds(10))

  }
}

Here is the stack trace:

Exception in thread "main" java.lang.AbstractMethodError
    at org.apache.spark.util.ListenerBus$class.$init$(ListenerBus.scala:35)
    at org.apache.spark.streaming.scheduler.StreamingListenerBus.<init>(StreamingListenerBus.scala:30)
    at org.apache.spark.streaming.scheduler.JobScheduler.<init>(JobScheduler.scala:56)
    at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:183)
    at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:84)
    at platform.etl.ClickGeneratorStreaming$.main(ClickGeneratorStreaming.scala:10)
    at platform.etl.ClickGeneratorStreaming.main(ClickGeneratorStreaming.scala)

My build.sbt:

name := "spark"

version := "0.1"

scalaVersion := "2.11.0"

val sparkVersion = "2.3.0.2.6.5.0-292"
val sparkKafkaVersion = "2.3.0"
val argonautVersion = "6.2"

resolvers += "jitpack" at "https://jitpack.io"
resolvers += "horton" at "http://repo.hortonworks.com/content/repositories/releases"
resolvers += "horton2" at "http://repo.hortonworks.com/content/groups/public"


libraryDependencies += "org.apache.hadoop" % "hadoop-aws" % "2.7.3.2.6.5.0-292" excludeAll ExclusionRule(organization = "javax.servlet")
libraryDependencies += "com.amazonaws" % "aws-java-sdk" % "1.7.4"
libraryDependencies += "com.softwaremill.sttp" %% "core" % "1.2.0-RC2"
libraryDependencies += "com.softwaremill.retry" %% "retry" % "0.3.0"
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.5" % Test
libraryDependencies += "com.github.scopt" %% "scopt" % "3.7.0"
libraryDependencies += "io.argonaut" %% "argonaut" % argonautVersion
libraryDependencies += "io.argonaut" %% "argonaut-monocle" % argonautVersion
libraryDependencies += "com.github.scopt" %% "scopt" % "3.7.0"
libraryDependencies += "com.github.mrpowers" % "spark-fast-tests" % "v2.3.0_0.11.0" % "test"
libraryDependencies += "org.scalactic" %% "scalactic" % "3.0.5"
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.5" % "test"
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.3.0"
libraryDependencies += "org.elasticsearch" % "elasticsearch-spark-20_2.11" % "5.2.2"
libraryDependencies += "com.redislabs" % "spark-redis" % "2.3.1-M2"
libraryDependencies +=  "org.scalaj" %% "scalaj-http" % "2.4.1"
libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion
libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion 
libraryDependencies += "org.apache.spark" %% "spark-hive" % sparkVersion
libraryDependencies += "org.apache.spark" %% "spark-sql-kafka-0-10" % sparkVersion
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % sparkVersion


assemblyMergeStrategy in assembly := {
  case PathList("javax", "servlet", xs @ _*)         => MergeStrategy.first
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case "application.conf"            => MergeStrategy.concat
  case "reference.conf"              => MergeStrategy.concat
  case _ => MergeStrategy.first
}

My plugins.sbt:

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")

1 Answer

Stack Overflow user

Accepted answer

Answered on 2019-08-02 03:52:29

Found the problem. It looks like I forgot to add the spark-streaming dependency to my build.sbt, and somehow my imports were being resolved through a transitive dependency, which pulled in a different version of spark-streaming that was incompatible with my Spark version.
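One way to confirm which artifact dragged in the mismatched spark-streaming (my addition, not something the original answer shows) is sbt's dependency tree. On sbt 1.4+ the `dependencyTree` task is built in; on older sbt it comes from the sbt-dependency-graph plugin, added in project/plugins.sbt:

```scala
// project/plugins.sbt -- hypothetical addition, only needed on sbt < 1.4
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.9.2")
```

Running `sbt dependencyTree` then prints the resolved graph, where you can search for spark-streaming and see which dependency pulled it in and at what version.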

To fix it, all I had to do was add a single line to my build.sbt:

libraryDependencies += "org.apache.spark" %% "spark-streaming" % sparkVersion

Now it works flawlessly.
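As an alternative sketch (my assumption, not part of the original answer): when a transitive dependency keeps requesting a conflicting version, sbt can also force a single version across the whole graph with `dependencyOverrides` in build.sbt:

```scala
// Hypothetical build.sbt addition: pin spark-streaming to the project's Spark
// version even if a transitive dependency asks for a different one.
dependencyOverrides += "org.apache.spark" %% "spark-streaming" % sparkVersion
```

Declaring the dependency directly, as the answer does, is usually enough here, since sbt's default conflict resolution picks the highest requested version; an override is for cases where that default still picks the wrong one.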

0 votes
Original page content provided by Stack Overflow; translation supported by Tencent Cloud's IT-domain translation engine.
Original link:

https://stackoverflow.com/questions/57298668
