
Spark Operator on EKS: Apache Spark failed to create a temp directory
Asked by a Stack Overflow user on 2021-10-06 09:55:03
1 answer · 342 views · 0 followers · 1 vote

I am trying to deploy a simple spark-pi.yaml to AWS EKS using the Spark Operator. I deployed the Spark Operator itself successfully.

For the deployment YAML, please refer to the spark-on-k8s-operator examples.
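For context, the spark-pi example that ships with the spark-on-k8s-operator repo looks roughly like the following (trimmed). The image, jar path, and service-account name come from the upstream example, and the namespace is taken from the error events below, so treat this as a sketch rather than the asker's exact file:

apiVersion: "sparkoperator.k8s.io/v1beta2"
kind: SparkApplication
metadata:
  name: spark-pi
  namespace: spark-operator   # matches the namespace shown in the events below
spec:
  type: Scala
  mode: cluster
  image: "gcr.io/spark-operator/spark:v3.1.1"
  mainClass: org.apache.spark.examples.SparkPi
  mainApplicationFile: "local:///opt/spark/examples/jars/spark-examples_2.12-3.1.1.jar"
  sparkVersion: "3.1.1"
  driver:
    cores: 1
    memory: "512m"
    serviceAccount: spark     # service-account name from the upstream example
  executor:
    instances: 1
    cores: 1
    memory: "512m"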

When I run helm install, I get the following error:

Events:
  Type     Reason                            Age   From            Message
  ----     ------                            ----  ----            -------
  Normal   SparkApplicationAdded             8s    spark-operator  SparkApplication spark-pi was added, enqueuing it for submission
  Warning  SparkApplicationSubmissionFailed  5s    spark-operator  failed to submit SparkApplication spark-pi: failed to run spark-submit for SparkApplication spark-operator/spark-pi: WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/opt/spark/jars/spark-unsafe_2.12-3.1.1.jar) to constructor java.nio.DirectByteBuffer(long,int)
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Exception in thread "main" java.io.IOException: Failed to create a temp directory (under /tmp) after 10 attempts!
  at org.apache.spark.util.Utils$.createDirectory(Utils.scala:305)
  at org.apache.spark.util.Utils$.createTempDir(Utils.scala:325)
  at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:343)
  at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894)
  at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
  at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
  at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
  at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1030)
  at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1039)
  at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

How can I resolve this?


1 Answer

Answered by a Stack Overflow user on 2021-10-06 14:30:13

This will be hard to debug, but based on my experience, a couple of things could be going on here (both are sketched after the list):

  1. I see that your executor does not define its service account. You may need to define it explicitly.
  2. Your volume may not have enough space to create the /tmp directory. You may want to double-check your volume size.
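If it helps, here is a rough sketch of both suggestions applied to the SparkApplication spec. The service-account name ("spark") and the emptyDir size are assumptions, not values read from the asker's cluster:

spec:
  volumes:
    - name: spark-tmp
      emptyDir:
        sizeLimit: "1Gi"            # assumed size; pick what the job actually needs
  driver:
    serviceAccount: spark           # assumed name; use the account bound to your Spark RBAC role
    volumeMounts:
      - name: spark-tmp
        mountPath: /tmp
  executor:
    serviceAccount: spark           # suggestion 1: define the executor's service account explicitly
    volumeMounts:
      - name: spark-tmp             # suggestion 2: back /tmp with a volume that has room
        mountPath: /tmp

To confirm whether disk space is really the culprit, running df -h /tmp inside the failing pod (via kubectl exec) should show how full the backing filesystem is.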
Votes: 0
Original content provided by Stack Overflow.
Original link:

https://stackoverflow.com/questions/69463502