
Jackson conflict in Apache Spark when used with the Azure Java SDK

Stack Overflow user
Asked on 2022-01-17 07:55:17
1 answer · 601 views · 0 votes

Azure publishes the jars available in the runtime here. I am currently using the Apache Spark 3.1 runtime.

My project also depends on version 1.4.0 of a library (which pulls in azure-core transitively). When deploying the job on Synapse, I get the following error.

The job runs fine locally, but not when deployed on Synapse.

21/11/29 17:38:00 INFO ApplicationMaster: exitCode: 15 (reason: User class threw exception: java.lang.LinkageError: Package versions: jackson-core=2.10.0, jackson-databind=2.10.0, jackson-dataformat-xml=2.12.5, jackson-datatype-jsr310=2.12.5, azure-core=1.19.0)
    at com.azure.core.implementation.jackson.ObjectMapperShim.createXmlMapper(ObjectMapperShim.java:73)
    at com.azure.core.util.serializer.JacksonAdapter.<init>(JacksonAdapter.java:81)
    at com.azure.core.util.serializer.JacksonAdapter.<init>(JacksonAdapter.java:58)
    at com.azure.core.util.serializer.JacksonAdapter$SerializerAdapterHolder.<clinit>(JacksonAdapter.java:113)
    at com.azure.core.util.serializer.JacksonAdapter.createDefaultSerializerAdapter(JacksonAdapter.java:122)
    at com.azure.identity.implementation.IdentityClientBuilder.build(IdentityClientBuilder.java:139)
    at com.azure.identity.ManagedIdentityCredential.<init>(ManagedIdentityCredential.java:70)
    at com.azure.identity.DefaultAzureCredentialBuilder.getCredentialsChain(DefaultAzureCredentialBuilder.java:129)
    at com.azure.identity.DefaultAzureCredentialBuilder.build(DefaultAzureCredentialBuilder.java:123)
    at com.xxxxxxxxxxx.$anonfun$sendEvents$1$adapted(xxxxxxxGridSender.scala:25)
    at scala.collection.immutable.List.foreach(List.scala:392)
    at scala.collection.Iterator.foreach(Iterator.scala:941)
    at scala.collection.Iterator.foreach$(Iterator.scala:941)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
    at scala.collection.IterableLike.foreach(IterableLike.scala:74)
    at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
    at xxxxxx.xxxxxx.xxxxxx(xxxxxx.scala:61)
    at xxxxxx.xxxxxx.$anonfun$start$2(xxxxxx.scala:39)
    at scala.collection.TraversableLike$WithFilter.$anonfun$map$2(TraversableLike.scala:827)
    at scala.collection.Iterator.foreach(Iterator.scala:941)
    at scala.collection.Iterator.foreach$(Iterator.scala:941)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
    at scala.collection.IterableLike.foreach(IterableLike.scala:74)
    at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
    at scala.collection.TraversableLike$WithFilter.map(TraversableLike.scala:826)
    at xxxxxx.xxxxxx.start(xxxxxx.scala:36)
    at xxxxxx.xxxxxx$.main(xxxxxx.scala:29)
    at xxxxxx.xxxxxx.main(xxxxxx.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:732)
Caused by: java.lang.NoSuchMethodError: com.fasterxml.jackson.dataformat.xml.XmlMapper.coercionConfigDefaults()Lcom/fasterxml/jackson/databind/cfg/MutableCoercionConfig;
    at com.fasterxml.jackson.dataformat.xml.XmlMapper.<init>(XmlMapper.java:176)
    at com.fasterxml.jackson.dataformat.xml.XmlMapper.<init>(XmlMapper.java:145)
    at com.fasterxml.jackson.dataformat.xml.XmlMapper.<init>(XmlMapper.java:127)
    at com.fasterxml.jackson.dataformat.xml.XmlMapper.builder(XmlMapper.java:218)
    at com.azure.core.implementation.jackson.ObjectMapperFactory.createXmlMapper(ObjectMapperFactory.java:84)
    at com.azure.core.implementation.jackson.ObjectMapperShim.createXmlMapper(ObjectMapperShim.java:70)
    ... 45 more

1 Answer

Stack Overflow user

Accepted answer

Posted on 2022-01-17 08:06:41

Synapse also ships its own jars as part of the runtime, and project dependencies need to be compatible with the jars available there.

There are two parts to this:

  1. azure-core pulls in the Jackson 2.12 series as a dependency, while Apache Spark 3.1 is still on the 2.10 series.
  2. azure-core is already available on Synapse's classpath (at 1.16.0). So whichever Azure library you add (which brings azure-core along as a transitive dependency) needs to be compatible with azure-core 1.16.0.
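A quick way to confirm which artifact drags in the conflicting Jackson series is sbt's dependency-tree tasks. These ship with sbt 1.4+; on older sbt they come from the sbt-dependency-graph plugin. A sketch (the coordinates are taken from the error above):

```scala
// project/plugins.sbt — only needed on sbt < 1.4; newer sbt has these tasks built in.
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.10.0-RC1")

// Then, from the sbt shell:
//   whatDependsOn com.fasterxml.jackson.dataformat jackson-dataformat-xml
//     - shows which library (here, an Azure SDK artifact) pulls in the 2.12.x jars
//   evicted
//     - lists every version conflict the resolver settled, and to which version
```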

To fix (1), I added the following:

object DependencyOverrides {

  /**
   * We do not have any direct dependency on jackson. Spark relies on the 2.10 series
   * and the azure-core SDK depends on 2.12. In order to resolve the conflict, we
   * explicitly pin the jackson dependencies here to 2.10.0.
   */
  val jackson = Seq(
    "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.10.0",
    "com.fasterxml.jackson.core" % "jackson-core" % "2.10.0",
    "com.fasterxml.jackson.core" % "jackson-annotations" % "2.10.0",
    "com.fasterxml.jackson.core" % "jackson-databind" % "2.10.0",
    "com.fasterxml.jackson.dataformat" % "jackson-dataformat-xml" % "2.10.0",
    "com.fasterxml.jackson.datatype" % "jackson-datatype-jsr310" % "2.10.0"
  )

  val others = Seq(
    "com.google.guava" % "guava" % "27.0-jre"
  )

  val all = jackson ++ others
}

And override those dependencies in SBT:

dependencyOverrides ++= DependencyOverrides.all
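Note that `dependencyOverrides` only pins the version chosen during conflict resolution; unlike `libraryDependencies`, it does not introduce any new direct dependency, so Jackson stays transitive. A minimal sketch of the wiring in build.sbt (the project name is hypothetical):

```scala
// build.sbt (sketch; assumes the DependencyOverrides object lives under project/)
lazy val root = (project in file("."))
  .settings(
    name := "synapse-job",                    // hypothetical project name
    // Pins versions during conflict resolution only; adds no direct dependencies.
    dependencyOverrides ++= DependencyOverrides.all
  )
```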

To fix (2), add the relevant jars to `others` above:

  val others = Seq(
    "com.azure" % "azure-core" % "1.16.0",
    "com.azure" % "azure-core-http-netty" % "1.6.2",
    "com.google.guava" % "guava" % "27.0-jre"
  )

In my case, adding azure-core alone was not enough. I also had to pin azure-core-http-netty and guava.
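If pinning ever gets unwieldy, an alternative is shading: relocating Jackson into a private namespace inside the fat jar so it cannot clash with the runtime's copies. A sketch assuming the job is packaged with sbt-assembly (not verified against azure-core, which inspects Jackson package versions at runtime, so test carefully before relying on it):

```scala
// build.sbt additions (sketch; requires the sbt-assembly plugin)
assembly / assemblyShadeRules := Seq(
  // Rewrites com.fasterxml.jackson.* classes to a private namespace in the fat jar.
  ShadeRule.rename("com.fasterxml.jackson.**" -> "shaded.jackson.@1").inAll
)
```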

Votes: 1
Original content provided by Stack Overflow.
Original link: https://stackoverflow.com/questions/70738081
