I have installed the latest Hadoop and Spark versions on my Windows machine. I am trying to launch one of the provided examples, but it fails and I don't know what the diagnostics mean. It appears to be path-related, but I cannot figure out the root cause.
I issue the following command:
spark-submit --master yarn --class org.apache.spark.examples.JavaSparkPi C:\spark-3.0.1-bin-hadoop3.2\examples\jars\spark-examples_2.12-3.0.1.jar 100

My exception is:
21/01/25 10:53:53 WARN MetricsSystem: Stopping a MetricsSystem that is not running
21/01/25 10:53:53 INFO OutputCommitCoordinator: OutputCommitCoordinator stopped!
21/01/25 10:53:53 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" org.apache.spark.SparkException: Application application_1611568137841_0002 failed 2 times due to AM Container for appattempt_1611568137841_0002_000002 exited with exitCode: -1
Failing this attempt.Diagnostics: [2021-01-25 10:53:53.381] output path must be absolute
For more detailed output, check the application tracking page: http://xxxx-PC:8088/cluster/app/application_1611568137841_0002 Then click on links to logs of each attempt.
. Failing the application.
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:95)
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:62)
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:201)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:555)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2574)
    at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:934)
    at scala.Option.getOrElse(...)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:928)
    at org.apache.spark.examples.JavaSparkPi.main(JavaSparkPi.java:37)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
21/01/25 10:53:53 INFO ShutdownHookManager: Shutdown hook called
21/01/25 10:53:53 INFO ShutdownHookManager: Deleting directory ...
21/01/25 10:53:53 INFO ShutdownHookManager: Deleting directory C:\Users\xxx\AppData\Local\Temp\spark-3665ba77-d2aa-424a-9f75-e772bb5b9104
As for the diagnostics part:
Diagnostics:
Application application_1611562870926_0004 failed 2 times due to AM Container for appattempt_1611562870926_0004_000002 exited with exitCode: -1
Failing this attempt.Diagnostics: [2021-01-25 10:29:19.734] output path must be absolute
For more detailed output, check the application tracking page: http://****-PC:8088/cluster/app/application_1611562870926_0004 Then click on links to logs of each attempt.
. Failing the application.
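When the summary diagnostics are this terse, the per-attempt container logs are usually more informative. Assuming log aggregation is available, they can be pulled with the standard YARN CLI, using the application id from the error message (a sketch; substitute your own id):

```shell
# Fetch the container logs for the failed application; the id is
# the one printed in the exception above.
yarn logs -applicationId application_1611568137841_0002
```

The same logs are reachable from the tracking page URL shown in the diagnostics.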
Thanks!
Posted on 2021-01-25 12:30:42
So I'm still not sure of the root cause; it may be because I'm running under Windows and some of the default properties are wrong for YARN. When I added the following two properties to yarn-site.xml, it ran fine:
<property>
<name>yarn.nodemanager.log-dirs</name>
<value>/tmp</value>
</property>
<property>
<name>yarn.log.dir</name>
<value>/tmp</value>
</property>
Hope it helps someone in the future!
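For completeness, a sketch of applying the fix: after editing yarn-site.xml, the YARN daemons must be restarted so the new log-dir properties take effect, and then the example can be resubmitted. The paths below assume a standard Windows Hadoop layout with %HADOOP_HOME% set; adjust to your installation.

```shell
REM Restart YARN so the updated yarn-site.xml is picked up
%HADOOP_HOME%\sbin\stop-yarn.cmd
%HADOOP_HOME%\sbin\start-yarn.cmd

REM Re-run the example from the question
spark-submit --master yarn --class org.apache.spark.examples.JavaSparkPi ^
  C:\spark-3.0.1-bin-hadoop3.2\examples\jars\spark-examples_2.12-3.0.1.jar 100
```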
https://stackoverflow.com/questions/65882951