
Exception in thread "main" java.lang.NoClassDefFoundError: io/fabric8/kubernetes/client/Watcher

Stack Overflow user
Asked on 2020-04-22 15:16:45
1 answer · 513 views · 0 followers · 0 votes

I followed this tutorial to run the Spark-Pi application with kubectl commands: https://github.com/GoogleCloudPlatform/spark-on-k8s-operator/blob/master/docs/quick-start-guide.md#running-the-examples

When I submit a Spark job through the spark-on-k8s-operator, I get an error. I submitted the job with kubectl apply -f examples/spark-pi.yaml, but it failed. My Spark version is 2.4.4 and the operator version is v1beta2-1.0.1-2.4.4. Here is the message:

Warning  SparkApplicationFailed  3m  spark-operator  SparkApplication spark-pi failed: failed to run spark-submit for SparkApplication default/spark-pi: Exception in thread "main" java.lang.NoClassDefFoundError: io/fabric8/kubernetes/client/Watcher
       at java.lang.ClassLoader.defineClass1(Native Method)
       at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
       at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
       at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
       at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
       at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
       at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
       at java.security.AccessController.doPrivileged(Native Method)
       at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
       at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
       at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
       at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
       at java.lang.ClassLoader.defineClass1(Native Method)
       at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
       at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
       at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
       at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
       at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
       at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
       at java.security.AccessController.doPrivileged(Native Method)
       at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
       at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
       at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
       at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
       at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.run(KubernetesClientApplication.scala:233)
       at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.start(KubernetesClientApplication.scala:204)
       at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
       at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
       at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
       at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
       at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
       at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
       at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: io.fabric8.kubernetes.client.Watcher
  at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
  at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
  ... 33 more

After that, I tried modifying the Dockerfile to add kubernetes-client-4.1.2.jar and rebuilt the Docker image, but the error message was the same. My Dockerfile is:

ARG spark_image=gcr.io/spark-operator/spark:v2.4.4

FROM $spark_image

RUN mkdir -p /opt/spark/jars
ADD https://repo1.maven.org/maven2/io/fabric8/kubernetes-client/4.1.2/kubernetes-client-4.1.2.jar /opt/spark/jars
ENV SPARK_HOME /opt/spark
WORKDIR /opt/spark/work-dir
ENTRYPOINT [ "/opt/entrypoint.sh" ]
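One thing worth checking with this Dockerfile: when ADD fetches a file from a remote URL, Docker writes it with permissions 600 (owner-readable only), and the operator's Spark images typically run as a non-root user, so the jar could be present but unreadable, and the class would still fail to load. A minimal sketch of the fix, assuming the same base image as above (the chmod line is an addition, not part of the original question):

```dockerfile
ARG spark_image=gcr.io/spark-operator/spark:v2.4.4

FROM $spark_image

RUN mkdir -p /opt/spark/jars
ADD https://repo1.maven.org/maven2/io/fabric8/kubernetes-client/4.1.2/kubernetes-client-4.1.2.jar /opt/spark/jars
# Files fetched by a remote-URL ADD get mode 600; make the jar readable
# by the non-root user that the Spark container runs as.
RUN chmod 644 /opt/spark/jars/kubernetes-client-4.1.2.jar
ENV SPARK_HOME /opt/spark
WORKDIR /opt/spark/work-dir
ENTRYPOINT [ "/opt/entrypoint.sh" ]
```

A COPY of a locally downloaded jar avoids the permission quirk entirely, at the cost of an extra download step in the build context.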

Any help would be appreciated.


1 Answer

Stack Overflow user

Answered on 2020-04-23 14:55:23

The spark-on-k8s-operator requires Kubernetes 1.13+, but the OpenShift cluster I was using ran Kubernetes 1.11, so replacing it with a newer Kubernetes version resolved the problem.
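The class-loading error is a symptom rather than the root cause here, so the useful first check is the cluster's version (normally read from kubectl version). As a minimal sketch, with the 1.11 value and the 1.13 threshold taken from this thread and hard-coded for illustration, the comparison might look like:

```shell
# Check whether a cluster's Kubernetes minor version meets the
# operator's 1.13+ requirement. The version string is hard-coded here;
# in practice it would come from `kubectl version` output.
required_minor=13
cluster_version="1.11"

# Strip everything up to and including the first dot to get the minor version.
cluster_minor=${cluster_version#*.}

if [ "$cluster_minor" -ge "$required_minor" ]; then
    echo "Kubernetes $cluster_version is new enough for the Spark operator"
else
    echo "Kubernetes $cluster_version is too old: need 1.$required_minor+"
fi
```

With the hard-coded 1.11 the script takes the "too old" branch, matching the situation described in the answer.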

Votes: 0
Original page content provided by Stack Overflow; translation supported by Tencent Cloud's IT-domain translation engine.
Original link: https://stackoverflow.com/questions/61359324