
Netty version conflict with Spark + Elasticsearch transport

Stack Overflow user
Asked on 2017-12-01 05:15:06
Answers: 0 · Views: 2.9K · Followers: 0 · Votes: 1

There are several earlier questions about this, with answers, but the answers generally don't contain enough information to actually resolve the problem.

I am using Apache Spark to ingest data into Elasticsearch. We use X-Pack security and its corresponding transport client. In one special case I use the transport client to create/delete indices, and then use Spark for ingestion. When our code reaches client.close(), an exception is thrown:

Exception in thread "elasticsearch[_client_][generic][T#2]" java.lang.NoSuchMethodError: io.netty.bootstrap.Bootstrap.config()Lio/netty/bootstrap/BootstrapConfig;
        at org.elasticsearch.transport.netty4.Netty4Transport.lambda$stopInternal$5(Netty4Transport.java:443)
        at org.apache.lucene.util.IOUtils.close(IOUtils.java:89)
        at org.elasticsearch.common.lease.Releasables.close(Releasables.java:36)
        at org.elasticsearch.common.lease.Releasables.close(Releasables.java:46)
        at org.elasticsearch.common.lease.Releasables.close(Releasables.java:51)
        at org.elasticsearch.transport.netty4.Netty4Transport.stopInternal(Netty4Transport.java:426)
        at org.elasticsearch.transport.TcpTransport.lambda$doStop$5(TcpTransport.java:959)
        at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingRunnable.run(ThreadContext.java:569)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
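The NoSuchMethodError above is the classic symptom of two Netty lines colliding: Bootstrap.config() was introduced in Netty 4.1, which Elasticsearch 5.x's Netty4Transport needs, while Spark 2.2 bundles Netty 4.0.x, so the 4.0.x copy is being loaded at runtime. A small diagnostic (class and method names are mine, not from the post) can show which copy the JVM actually picked up:

```java
// Hypothetical diagnostic, not from the original post: report which Netty
// the JVM actually loaded, and whether it has the 4.1-only method that
// Netty4Transport fails to find in the stack trace above.
public class NettyProbe {

    static String probe() {
        try {
            Class<?> bootstrap = Class.forName("io.netty.bootstrap.Bootstrap");
            // The jar this class was loaded from tells you whose Netty won
            // on the classpath (Spark's bundled copy vs. your own).
            java.net.URL jar = bootstrap.getProtectionDomain()
                                        .getCodeSource().getLocation();
            try {
                // Bootstrap.config() exists only in Netty 4.1.x.
                bootstrap.getMethod("config");
                return "Netty 4.1.x (has Bootstrap.config()) from " + jar;
            } catch (NoSuchMethodException e) {
                return "Netty 4.0.x (no Bootstrap.config()) from " + jar;
            }
        } catch (ClassNotFoundException e) {
            return "io.netty.bootstrap.Bootstrap not on classpath";
        }
    }

    public static void main(String[] args) {
        System.out.println(probe());
    }
}
```

Running this inside the Spark driver (or an executor) shows whether Spark's jars are shadowing the application's Netty.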

At first I thought the X-Pack transport client was picking up Netty from Spark, so I excluded it. Even after excluding it, we hit the same problem. Here is our set of dependencies:

libraryDependencies ++= Seq(
  "com.crealytics" % "spark-excel_2.11" % "0.9.1" exclude("io.netty", "netty-all"),
  "com.github.alexarchambault" %% "scalacheck-shapeless_1.13" % "1.1.6" % Test,
  "com.holdenkarau" % "spark-testing-base_2.11" % "2.2.0_0.7.4" % Test exclude("org.scalatest", "scalatest_2.11"),
  "com.opentable.components" % "otj-pg-embedded" % "0.9.0" % Test,
  "org.apache.spark" % "spark-core_2.11" % "2.2.0" % "provided" exclude("org.scalatest", "scalatest_2.11") exclude("io.netty", "netty-all"),
  "org.apache.spark" % "spark-sql_2.11" % "2.2.0" % "provided" exclude("org.scalatest", "scalatest_2.11") exclude("io.netty", "netty-all"),
  "org.apache.spark" % "spark-hive_2.11" % "2.2.0" % "provided" exclude("org.scalatest", "scalatest_2.11") exclude("io.netty", "netty-all"),
  "org.apache.logging.log4j" % "log4j-core" % "2.8.2",
  "org.elasticsearch" % "elasticsearch-spark-20_2.11" % "5.5.0" exclude("org.scalatest", "scalatest_2.11") exclude("io.netty", "netty-all"),
  "org.elasticsearch.client" % "x-pack-transport" % "5.5.0",
  "org.elasticsearch.client" % "transport" % "5.5.0",
  "org.elasticsearch.test" % "framework" % "5.4.3" % Test,
  "org.postgresql" % "postgresql" % "42.1.4",
  "org.scalamock" %% "scalamock-scalatest-support" % "3.5.0" % Test,
  "org.scalatest" % "scalatest_2.11" % "3.0.1" % Test,
  "org.scalacheck" %% "scalacheck" % "1.13.4" % Test,
  "org.scalactic" %% "scalactic" % "3.0.1",
  "org.scalatest" %% "scalatest" % "3.0.1" % Test,
  "mysql" % "mysql-connector-java" % "5.1.44"
)

I checked with sbt dependencyTree, and SBT is not excluding netty from Spark and spark-excel; I'm not sure why... We are using SBT 1.0.4.
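Rather than scattering exclude(...) clauses across every module, one alternative is to pin a single Netty 4.1.x for the whole build. This is a sketch under my own assumptions, not the asker's fix, and the exact version should be taken from what the Elasticsearch 5.5.0 transport actually pulls in (visible in sbt dependencyTree):

```scala
// Sketch (assumption, not from the post): pin one Netty 4.1.x build-wide.
// 4.1.11.Final is illustrative; confirm the version the Elasticsearch 5.5.0
// transport depends on via `sbt dependencyTree` before pinning.
dependencyOverrides += "io.netty" % "netty-all" % "4.1.11.Final"
```

Note that dependencyOverrides only affects sbt's own resolution; it does nothing about jars that spark-submit puts on the classpath at runtime, which matters for the update below.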

Update: spark-submit/Spark was the culprit, answer below!
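Given that update, the likely mechanism (my reading, not stated in the post) is that spark-submit places Spark's own jars, including its bundled Netty 4.0.x, ahead of the application jar, shadowing the 4.1.x that Elasticsearch needs. A common remedy, sketched here with illustrative jar and class names, is to let user jars win:

```shell
# Sketch (assumption, not the asker's exact command): prefer the
# application's classpath (carrying Netty 4.1.x) over Spark's bundled
# Netty 4.0.x on both driver and executors. Jar/class names are made up.
spark-submit \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  --class com.example.IngestJob \
  my-assembly.jar
```

Shading Netty inside the assembly jar is the other common approach when userClassPathFirst causes its own conflicts.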

Original link:

https://stackoverflow.com/questions/47582740
