I am trying to run an application that uses Spark Structured Streaming with Kafka as the data input. The Spark version is 2.4.0 and the Scala version is 2.12.7. I build fat JARs with sbt assembly (my project is a multi-module project). Building the JAR is not the problem; when I spark-submit my JAR, a NoSuchMethodError occurs.
I removed the provided scope from spark-sql-kafka-0-10:
val sparkSqlKafka = "org.apache.spark" %% "spark-sql-kafka-0-10" % sparkVersion
In assemblyMergeStrategy I added the line below:
case "META-INF/services/org.apache.spark.sql.sources.DataSourceRegister" => MergeStrategy.concat
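For context, a minimal sketch of where that case sits in build.sbt; the surrounding cases here are the usual sbt-assembly defaults, assumed rather than taken from the question:

assemblyMergeStrategy in assembly := {
  // Concatenate the service registry files so KafkaSourceProvider stays listed
  // after all modules are merged into the fat JAR.
  case "META-INF/services/org.apache.spark.sql.sources.DataSourceRegister" =>
    MergeStrategy.concat
  // Typical defaults: drop other META-INF entries, keep the first copy of the rest.
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _                             => MergeStrategy.first
}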
Here is the full error log.
2019-01-08 11:55:12 ERROR ApplicationMaster:91 - User class threw exception: java.util.ServiceConfigurationError: org.apache.spark.sql.sources.DataSourceRegister: Provider org.apache.spark.sql.kafka010.KafkaSourceProvider could not be instantiated
java.util.ServiceConfigurationError: org.apache.spark.sql.sources.DataSourceRegister: Provider org.apache.spark.sql.kafka010.KafkaSourceProvider could not be instantiated
at java.util.ServiceLoader.fail(ServiceLoader.java:232)
at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at scala.collection.convert.Wrappers$JIteratorWrapper.next(Wrappers.scala:43)
at scala.collection.Iterator$class.foreach(Iterator.scala:891)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.TraversableLike$class.filterImpl(TraversableLike.scala:247)
at scala.collection.TraversableLike$class.filter(TraversableLike.scala:259)
at scala.collection.AbstractTraversable.filter(Traversable.scala:104)
at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:630)
at org.apache.spark.sql.streaming.DataStreamReader.load(DataStreamReader.scala:161)
at ThisIsMyClass$.main(ThisIsMyClass.scala:28)
at ThisIsMyClass.main(ThisIsMyClass.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:678)
Caused by: java.lang.NoSuchMethodError: org.apache.spark.internal.Logging.$init$(Lorg/apache/spark/internal/Logging;)V
at org.apache.spark.sql.kafka010.KafkaSourceProvider.<init>(KafkaSourceProvider.scala:44)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at java.lang.Class.newInstance(Class.java:442)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
... 19 more
Edit 1.
Below are all of the dependencies.
val sparkVersion = "2.4.0"
val typesafeConfigVersion = "1.3.3"
val scalaTestVersion = "3.0.5"
val junitVersion = "4.12"
val sparkCore = "org.apache.spark" %% "spark-core" % sparkVersion % "provided"
val sparkSql = "org.apache.spark" %% "spark-sql" % sparkVersion % "provided"
val sparkMllib = "org.apache.spark" %% "spark-mllib" % sparkVersion % "provided"
val typesafeConfig = "com.typesafe" % "config" % typesafeConfigVersion
val scalaTest = "org.scalatest" %% "scalatest" % scalaTestVersion % Test
val junit = "junit" % "junit" % junitVersion % Test
val logback = "ch.qos.logback" % "logback-classic" % "1.2.3"
val scalaLogging = "com.typesafe.scala-logging" %% "scala-logging" % "3.9.0"
val sparkStreaming = "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided"
val sparkSqlKafka = "org.apache.spark" %% "spark-sql-kafka-0-10" % sparkVersion
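These vals are presumably collected into the module somewhere like this (a sketch, not shown in the question):

libraryDependencies ++= Seq(
  sparkCore, sparkSql, sparkMllib, sparkStreaming, // provided by the cluster
  sparkSqlKafka,                                   // bundled into the fat JAR
  typesafeConfig, logback, scalaLogging,
  scalaTest, junit
)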
Edit 2.
I found some dependency version conflicts on slf4j-api.
So I changed the build configuration to use only the slf4j-api version that matches the spark-core dependency, and excluded every other slf4j-api (one way to do that is sketched below).
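A sketch of one way to express such an exclusion in sbt; which library actually dragged in the conflicting slf4j-api is an assumption here:

// Hypothetical example: strip the transitive slf4j-api from a logging dependency
// so that only the version spark-core brings in remains on the classpath.
val scalaLogging = ("com.typesafe.scala-logging" %% "scala-logging" % "3.9.0")
  .exclude("org.slf4j", "slf4j-api")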
Still the same error. :(
Edit 3.
I added --packages org.apache.spark:spark-sql-kafka-0-10_2.12:2.4.0 to my spark-submit script.
Still the same error.
Posted on 2019-01-09 11:48:57
Problem solved.
When I opened spark-shell, I saw that the Spark version was 2.4.0 but the Scala version was 2.11.
The Scala version in my build.sbt was 2.12.
The Scala version was the key! It also explains the Caused by line: Logging.$init$ is the static trait initializer that Scala 2.12 emits, so a _2.12-compiled jar throws NoSuchMethodError against a Spark runtime compiled with Scala 2.11.
Thank you all.
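In build terms, the fix boils down to pinning build.sbt to the Scala 2.11 line that the cluster's Spark 2.4.0 was compiled against (a sketch; 2.11.12 is an assumed patch release):

// Match the cluster: Spark 2.4.0 pre-built distributions use Scala 2.11.
scalaVersion := "2.11.12"
// Every %% dependency now resolves to a _2.11 artifact, e.g.
// org.apache.spark:spark-sql-kafka-0-10_2.11:2.4.0, which is binary
// compatible with the Logging trait shipped on the cluster.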
Posted on 2019-01-08 06:09:14
Please update your dependencies.
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-api</artifactId>
<version>2.7</version>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-core</artifactId>
<version>2.7</version>
</dependency>
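For an sbt build like the asker's, the same coordinates would be (a direct translation of the Maven XML above):

val log4jApi  = "org.apache.logging.log4j" % "log4j-api"  % "2.7"
val log4jCore = "org.apache.logging.log4j" % "log4j-core" % "2.7"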