At university, I logged one of my Apache Spark jobs using the following code snippet:
from pyspark import SparkConf

conf = SparkConf() \
    .setAppName("Ex") \
    .set("spark.eventLog.enabled", "true") \
    .set("spark.eventLog.dir", "log")

After the job finished, I copied the log file app-20170416171823-0000 to my local system and tried to browse the recorded Spark web UI with the following command:
sbin/start-history-server.sh ~/Downloads/log/app-20170416171823-0000

But the history server terminated with the following error:
failed to launch: nice -n 0 /usr/local/Cellar/apache-spark/2.1.0/libexec/bin/spark-class org.apache.spark.deploy.history.HistoryServer /Users/sk/Downloads/log/app-20170416171823-0000
  at org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:77)
  ... 6 more
full log in /usr/local/Cellar/apache-spark/2.1.0/libexec/logs/spark-sk-org.apache.spark.deploy.history.HistoryServer-1-Sk-MacBook-Pro.local.out
History server output:
17/04/16 17:44:52 INFO SecurityManager: Changing modify acls groups to:
17/04/16 17:44:52 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(skaran); groups with view permissions: Set(); users with modify permissions: Set(skaran); groups with modify permissions: Set()
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.deploy.history.HistoryServer$.main(HistoryServer.scala:278)
at org.apache.spark.deploy.history.HistoryServer.main(HistoryServer.scala)
Caused by: java.lang.IllegalArgumentException: Logging directory specified is not a directory: file:/Users/sk/Downloads/log/app-20170416171823-0000
at org.apache.spark.deploy.history.FsHistoryProvider.org$apache$spark$deploy$history$FsHistoryProvider$$startPolling(FsHistoryProvider.scala:198)
at org.apache.spark.deploy.history.FsHistoryProvider.initialize(FsHistoryProvider.scala:153)
at org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:149)
at org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:77)
... 6 more

Posted on 2017-04-17 06:10:09
It seems the argument should be the folder that contains the logs, not an individual log file.
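A minimal sketch of the corrected invocation, reusing the paths from the question: the error "Logging directory specified is not a directory" comes from pointing the history server at the event log file app-20170416171823-0000 itself, so pass the enclosing directory instead.

```shell
# Pass the directory containing the event log files,
# not the app-20170416171823-0000 file itself.
sbin/start-history-server.sh ~/Downloads/log

# Equivalent, via configuration: set the log directory property
# in conf/spark-defaults.conf and start the server with no argument.
# spark.history.fs.logDirectory  file:/Users/sk/Downloads/log
```

The history server then scans that directory for completed applications and serves the recorded web UI (by default on port 18080).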
https://stackoverflow.com/questions/43442539