I am running a Spark standalone cluster, and when I submit an application the Spark driver stops with the following error:
16/01/12 23:26:14 INFO Worker: Asked to kill executor app-20160112232613-0012/0
16/01/12 23:26:14 INFO ExecutorRunner: Runner thread for executor app-20160112232613-0012/0 interrupted
16/01/12 23:26:14 INFO ExecutorRunner: Killing process!
16/01/12 23:26:14 ERROR FileAppender: Error writing stream to file /spark/spark-1.4.1/work/app-20160112232613-0012/0/stderr
java.io.IOException: Stream closed
at java.io.BufferedInputStream.getBufIfOpen(BufferedInputStream.java:170)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:283)
at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
at java.io.FilterInputStream.read(FilterInputStream.java:107)
at org.apache.spark.util.logging.FileAppender.appendStreamToFile(FileAppender.scala:70)
at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply$mcV$sp(FileAppender.scala:39)
at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1772)
at org.apache.spark.util.logging.FileAppender$$anon$1.run(FileAppender.scala:38)
16/01/12 23:26:14 INFO Worker: Executor app-20160112232613-0012/0 finished with state KILLED exitStatus 143
16/01/12 23:26:14 INFO Worker: Cleaning up local directories for application app-20160112232613-0012
I am new to Spark and its processing. Please help me with this.
Posted on 2016-01-14 00:11:46
The failure is not caused by the java.io.IOException itself: as you can clearly see from 16/01/12 23:26:14 INFO Worker: Asked to kill executor app-20160112232613-0012/0, the worker was asked to kill the executor first. The exception is only raised afterwards, when Spark tries to keep writing to the executor's log file; the actual cause of the failure should be visible in that log file.
Even if you run spark-submit with root privileges, it is the spark user that writes the log files. I am guessing you are running this on your laptop. Try running sudo chmod -R 777 on your Spark folder, for example as sketched below.
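A minimal sketch of that step, assuming Spark is installed at /spark/spark-1.4.1 (the path taken from the stderr error above; adjust it to your installation). Note that 777 permissions are a quick diagnostic, not a recommended production setting; a tighter fix is to chown the work directory to the user the workers run as.

# Assumption: Spark home is /spark/spark-1.4.1, as seen in the FileAppender error above.
# Make the work directory (where executor stdout/stderr are written) writable:
sudo chmod -R 777 /spark/spark-1.4.1/work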
Posted on 2016-01-13 18:26:50
In my case, the problem was that the Spark driver could not load the dependencies from the submitted executable jar. Merging all the dependencies into a single executable jar fixed the problem.
Please bear with my terminology :)
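A hedged sketch of that fix: the answer does not name a build tool, so assume an sbt project with the sbt-assembly plugin enabled; the project name, main class, master URL, and jar path below are all placeholders. "sbt assembly" bundles the application together with its dependencies into one "fat" jar, which is then submitted as a single file.

# Assumption: project/plugins.sbt enables sbt-assembly, e.g.
#   addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "<version>")
# Build a single jar containing the application and all its dependencies:
sbt assembly

# Submit the resulting fat jar; class name, master URL and jar path are placeholders.
spark-submit --class com.example.Main \
  --master spark://master-host:7077 \
  target/scala-2.10/myapp-assembly-1.0.jar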
https://stackoverflow.com/questions/34761274