When trying to start the Spark History Server from a PowerShell terminal (from my SPARK_HOME/sbin) with
.\start-history-server.sh
a Windows terminal opens, prints the following messages, and then closes.
ps: unknown option -- o
Try `ps --help' for more information.
starting org.apache.spark.deploy.history.HistoryServer, logging to C:\Spark/logs/spark--org.apache.spark.deploy.history.HistoryServer-1-<my-machine>.out
ps: unknown option -- o
Try `ps --help' for more information.
ps: unknown option -- o
Try `ps --help' for more information.
ps: unknown option -- o
Try `ps --help' for more information.
ps: unknown option -- o
Try `ps --help' for more information.

Below is the output in spark--org.apache.spark.deploy.history.HistoryServer-1-<my-machine>.out, generated in 'C:\Spark\logs':
Spark Command: C:\Program Files (x86)\Java\jre1.8.0_161\bin\java -cp C:\Spark/conf\;C:\Spark\jars\* -Xmx1g org.apache.spark.deploy.history.HistoryServer C:\Spark\logs
========================================
"C:\Program Files (x86)\Java\jre1.8.0_161\bin\java" -cp "C:\Spark/conf\;C:\Spark\jars\*" -Xmx1g org.apache.spark.deploy.history.HistoryServer C:\Spark\logs
C:\Spark/bin/spark-class: line 96: CMD: bad array subscript

What I have already tried:
Updated spark-defaults.conf as follows:
spark.eventLog.enabled true
spark.eventLog.dir file:///C:\Spark\logs
spark.history.fs.logDirectory file:///C:\Spark\logs

Also, following the discussion here, I tried running the following command (from SPARK_HOME/sbin):
spark-class org.apache.spark.deploy.history.HistoryServer

But it resulted in a FileNotFoundException, as shown below (which is strange, because it is looking for C:Sparklogs instead of C:\Spark\logs):
PS C:\Spark\sbin> spark-class org.apache.spark.deploy.history.HistoryServer Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
20/08/26 12:18:03 INFO HistoryServer: Started daemon with process name: 24364@<my-machine>
20/08/26 12:18:03 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
20/08/26 12:18:03 INFO SecurityManager: Changing view acls to: <USER>
20/08/26 12:18:03 INFO SecurityManager: Changing modify acls to: <USER>
20/08/26 12:18:03 INFO SecurityManager: Changing view acls groups to:
20/08/26 12:18:03 INFO SecurityManager: Changing modify acls groups to:
20/08/26 12:18:03 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(<USER>); groups with view permissions: Set(); users with modify permissions: Set(<USER>); groups with modify permissions: Set()
20/08/26 12:18:04 INFO FsHistoryProvider: History server ui acls disabled; users with admin permissions: ; groups with admin permissions
20/08/26 12:18:05 INFO Utils: Successfully started service on port 18080.
20/08/26 12:18:05 INFO HistoryServer: Bound HistoryServer to 0.0.0.0, and started at http://my-machine:18080
Exception in thread "main" java.io.FileNotFoundException: Log directory specified does not exist: file:///C:Sparklogs
at org.apache.spark.deploy.history.FsHistoryProvider.startPolling(FsHistoryProvider.scala:279)
at org.apache.spark.deploy.history.FsHistoryProvider.initialize(FsHistoryProvider.scala:227)
at org.apache.spark.deploy.history.FsHistoryProvider.start(FsHistoryProvider.scala:409)
at org.apache.spark.deploy.history.HistoryServer$.main(HistoryServer.scala:303)
at org.apache.spark.deploy.history.HistoryServer.main(HistoryServer.scala)
Caused by: java.io.FileNotFoundException: File file:/C:Sparklogs does not exist
at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:611)
at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:824)
at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:601)
at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:428)
at org.apache.spark.deploy.history.FsHistoryProvider.startPolling(FsHistoryProvider.scala:269)

Can anyone suggest what else I can try here to resolve this issue and start the Spark History Server?
Thanks.
Posted on 2020-08-26 13:04:59
Update: the following worked.

Updated 'spark.eventLog.dir' and 'spark.history.fs.logDirectory' to:
'file:///C:/Spark/eventlog'

Then, from SPARK_HOME/sbin, ran:
spark-class org.apache.spark.deploy.history.HistoryServer

The History Server web UI can now be accessed at http://localhost:18080.
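Putting the fix together, the working spark-defaults.conf entries would look something like this (a sketch based on the update above; the 'C:/Spark/eventlog' directory name is the one from that update, and the directory must exist before the server starts):

```properties
spark.eventLog.enabled           true
spark.eventLog.dir               file:///C:/Spark/eventlog
spark.history.fs.logDirectory    file:///C:/Spark/eventlog
```

Note the forward slashes in the file:// URI, in contrast to the backslashes in the original configuration.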
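One plausible reason the forward slashes matter: Spark reads spark-defaults.conf through Java's properties loader, and in that format a backslash starts an escape sequence, with the backslash before an unrecognized character silently dropped. That would turn `file:///C:\Spark\logs` into exactly the `file:///C:Sparklogs` path seen in the exception. A minimal demonstration of the escape behavior (standalone sketch, not Spark's actual loading code):

```java
import java.io.StringReader;
import java.util.Properties;

public class BackslashDemo {
    public static void main(String[] args) throws Exception {
        // In the Java properties format, '\' begins an escape sequence.
        // A backslash before a non-escape character is silently dropped.
        Properties props = new Properties();
        props.load(new StringReader("spark.eventLog.dir=file:///C:\\Spark\\logs"));
        // \S and \l are not valid escapes, so the backslashes vanish:
        System.out.println(props.getProperty("spark.eventLog.dir"));
        // prints: file:///C:Sparklogs
    }
}
```

Using forward slashes (or doubled backslashes) in the URI avoids this mangling entirely.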