
Spring Spark integration - org.springframework.context.annotation.AnnotationConfigApplicationContext: java.io.NotSerializableException

Stack Overflow user
Asked on 2018-08-12 16:40:10
2 answers · 925 views · 0 followers · 0 votes

I am using Spring Boot for my Spark application; all dependencies are managed by Spring, and I use @Autowired to inject them. My Function classes and the custom classes submitted to the executors implement Serializable.

But when I run it and tasks are submitted to the executors, it throws an exception: a Spring class is not serializable - AnnotationConfigApplicationContext.

Caused by: java.io.NotSerializableException: org.springframework.context.annotation.AnnotationConfigApplicationContext

Stack trace:

java.lang.IllegalStateException: Failed to execute CommandLineRunner
    at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:800) [spring-boot-2.0.1.RELEASE.jar:2.0.1.RELEASE]
    at org.springframework.boot.SpringApplication.callRunners(SpringApplication.java:781) [spring-boot-2.0.1.RELEASE.jar:2.0.1.RELEASE]
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:335) [spring-boot-2.0.1.RELEASE.jar:2.0.1.RELEASE]
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:1255) [spring-boot-2.0.1.RELEASE.jar:2.0.1.RELEASE]
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:1243) [spring-boot-2.0.1.RELEASE.jar:2.0.1.RELEASE]
    at com.bikas.MyStarter(MyStarter:67) [classes/:?]
Caused by: org.apache.spark.SparkException: Task not serializable
    at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:345) ~[spark-core_2.11-2.3.1.jar:2.3.1]
    at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:335) ~[spark-core_2.11-2.3.1.jar:2.3.1]
    at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:159) ~[spark-core_2.11-2.3.1.jar:2.3.1]
    at org.apache.spark.SparkContext.clean(SparkContext.scala:2299) ~[spark-core_2.11-2.3.1.jar:2.3.1]
    at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:928) ~[spark-core_2.11-2.3.1.jar:2.3.1]
    at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:927) ~[spark-core_2.11-2.3.1.jar:2.3.1]
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151) ~[spark-core_2.11-2.3.1.jar:2.3.1]
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112) ~[spark-core_2.11-2.3.1.jar:2.3.1]
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:363) ~[spark-core_2.11-2.3.1.jar:2.3.1]
    at org.apache.spark.rdd.RDD.foreachPartition(RDD.scala:927) ~[spark-core_2.11-2.3.1.jar:2.3.1]
    at org.apache.spark.api.java.JavaRDDLike$class.foreachPartition(JavaRDDLike.scala:219) ~[spark-core_2.11-2.3.1.jar:2.3.1]
    at org.apache.spark.api.java.AbstractJavaRDDLike.foreachPartition(JavaRDDLike.scala:45) ~[spark-core_2.11-2.3.1.jar:2.3.1]
    at com.bikas.MyStarter.run(MyStarter:81) ~[classes/:?]
    at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:797) ~[spring-boot-2.0.1.RELEASE.jar:2.0.1.RELEASE]
    ... 5 more
Caused by: java.io.NotSerializableException: org.springframework.context.annotation.AnnotationConfigApplicationContext
Serialization stack:
    - object not serializable (class: org.springframework.context.annotation.AnnotationConfigApplicationContext, value: org.springframework.context.annotation.AnnotationConfigApplicationContext@33a55bd8: startup date [Sun Aug 12 13:59:34 IST 2018]; root of context hierarchy)
    - field (class: com.bikas.services.MyServiceImpl, name: applicationContext, type: interface org.springframework.context.ApplicationContext)
    - object (class com.bikas.services.MyServiceImpl, com.bikas.services.MyServiceImpl@1260c85e)
    - field (class: com.bikas.services.MyProcessor, name: myServiceImpl, type: interface com.bikas.services.MyService)
    - object (class com.bikas.services.MyProcessor, com.bikas.services.MyProcessor@2b551e7b)
    - field (class: org.apache.spark.api.java.JavaRDDLike$$anonfun$foreachPartition$1, name: f$12, type: interface org.apache.spark.api.java.function.VoidFunction)
    - object (class org.apache.spark.api.java.JavaRDDLike$$anonfun$foreachPartition$1, <function1>)
    at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:40) ~[spark-core_2.11-2.3.1.jar:2.3.1]
    at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:46) ~[spark-core_2.11-2.3.1.jar:2.3.1]
    at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:100) ~[spark-core_2.11-2.3.1.jar:2.3.1]
    at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:342) ~[spark-core_2.11-2.3.1.jar:2.3.1]
    at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:335) ~[spark-core_2.11-2.3.1.jar:2.3.1]
    at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:159) ~[spark-core_2.11-2.3.1.jar:2.3.1]
    at org.apache.spark.SparkContext.clean(SparkContext.scala:2299) ~[spark-core_2.11-2.3.1.jar:2.3.1]
    at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:928) ~[spark-core_2.11-2.3.1.jar:2.3.1]
    at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:927) ~[spark-core_2.11-2.3.1.jar:2.3.1]
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151) ~[spark-core_2.11-2.3.1.jar:2.3.1]
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112) ~[spark-core_2.11-2.3.1.jar:2.3.1]
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:363) ~[spark-core_2.11-2.3.1.jar:2.3.1]
    at org.apache.spark.rdd.RDD.foreachPartition(RDD.scala:927) ~[spark-core_2.11-2.3.1.jar:2.3.1]
    at org.apache.spark.api.java.JavaRDDLike$class.foreachPartition(JavaRDDLike.scala:219) ~[spark-core_2.11-2.3.1.jar:2.3.1]
    at org.apache.spark.api.java.AbstractJavaRDDLike.foreachPartition(JavaRDDLike.scala:45) ~[spark-core_2.11-2.3.1.jar:2.3.1]
    at com.bikas.MyStarter.run(MyStarter.java:81) ~[classes/:?]
    at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:797) ~[spring-boot-2.0.1.RELEASE.jar:2.0.1.RELEASE]

Any help? Has anyone else run into this problem?


2 Answers

Stack Overflow user

Answered on 2018-08-12 16:59:17

The functions, and any classes containing functions, that run on the executors should not depend on Spring Boot. See whether you can find the offending reference by inspection; the serialization stack in the trace should also help you (here it points at the `applicationContext` field of `MyServiceImpl`, reached through `MyProcessor`). Declaring the functions that run on the executors as static methods in dedicated classes is a good way to avoid this problem.
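To illustrate the point without a Spark cluster: Spark's ClosureCleaner ultimately runs Java serialization over the closure, so plain `ObjectOutputStream` reproduces the failure mode. The sketch below uses hypothetical names (`FakeContext` stands in for the non-serializable `AnnotationConfigApplicationContext`, `Processors` for a dedicated class of static methods); it is an illustration of the principle, not the asker's actual code.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.function.Function;

public class ClosureDemo {

    // Stand-in for AnnotationConfigApplicationContext: deliberately NOT Serializable.
    static class FakeContext {}

    // The recommended pattern: executor-side logic as a static method in a
    // dedicated class, with no Spring-managed state anywhere near it.
    static class Processors {
        static String process(String s) {
            return s.toUpperCase();
        }
    }

    // A serializable function type, analogous to Spark's function interfaces.
    interface SerFn extends Function<String, String>, Serializable {}

    static byte[] serialize(Object o) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(o);
        }
        return bos.toByteArray();
    }

    public static void main(String[] args) throws Exception {
        FakeContext ctx = new FakeContext();

        // This lambda captures ctx, so serializing it drags ctx along --
        // the same failure Spark reports as "Task not serializable".
        SerFn bad = s -> s + ctx;
        try {
            serialize(bad);
        } catch (NotSerializableException e) {
            System.out.println("bad closure failed on: " + e.getMessage());
        }

        // This lambda captures nothing; all logic lives behind a static method.
        SerFn good = s -> Processors.process(s);
        serialize(good); // succeeds
        System.out.println("good closure: " + good.apply("spark")); // prints "good closure: SPARK"
    }
}
```

The same reasoning explains the stack trace above: the closure passed to `foreachPartition` transitively reached a field holding the `ApplicationContext`, and Java serialization refused it.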

Votes: 1

Stack Overflow user

Answered on 2021-10-24 21:00:43

I'll write down a workaround I found. You can use a Spring component wrapper around the component you want to call, e.g.:

@Component
public class ComponentIWantToUseWrapper {

    private static ComponentIWantToUse componentIWantToUse;

    // Spring injects the managed bean here and stashes it in the static field.
    public ComponentIWantToUseWrapper(ComponentIWantToUse component) {
        ComponentIWantToUseWrapper.componentIWantToUse = component;
    }

    public static ComponentIWantToUse getComponent() {
        return componentIWantToUse; // was "return component;", which does not compile
    }
}

So during component initialization, Spring populates the wrapper's static variable. You can now use your component inside Spark lambdas, like this: ComponentIWantToUseWrapper.getComponent().someMethod(). Hope this helps.
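Why this works: the lambda no longer captures a field of a Spring bean; it reaches the bean through a static accessor at call time, so the serialized closure contains no `ApplicationContext`. A minimal sketch without Spring, using hypothetical names (`Greeter` stands in for the bean, and the constructor is called by hand where Spring would normally do it):

```java
import java.io.ByteArrayOutputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.function.Function;

public class WrapperDemo {

    // Illustrative stand-in for the Spring bean you want to call.
    static class Greeter {
        String greet(String name) {
            return "hello " + name;
        }
    }

    // The wrapper from the answer. In a real application Spring invokes this
    // constructor once at startup; here we call it by hand.
    static class GreeterWrapper {
        private static Greeter greeter;

        GreeterWrapper(Greeter g) {
            GreeterWrapper.greeter = g;
        }

        static Greeter getComponent() {
            return greeter;
        }
    }

    interface SerFn extends Function<String, String>, Serializable {}

    public static void main(String[] args) throws Exception {
        new GreeterWrapper(new Greeter()); // simulates Spring's component initialization

        // The lambda captures no fields: it reaches the bean through the static
        // holder, so nothing non-serializable ends up in the closure.
        SerFn fn = s -> GreeterWrapper.getComponent().greet(s);

        try (ObjectOutputStream oos = new ObjectOutputStream(new ByteArrayOutputStream())) {
            oos.writeObject(fn); // succeeds, unlike a closure capturing the bean itself
        }
        System.out.println(fn.apply("spark")); // prints "hello spark"
    }
}
```

One caveat worth noting: the static field is per JVM, so on a real cluster this only works if each executor JVM also populates the holder (for example by bootstrapping the Spring context on the executors); otherwise `getComponent()` returns null there.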

Votes: 0
Original content provided by Stack Overflow; translation supported by Tencent Cloud.
Original link: https://stackoverflow.com/questions/51807160