I run Flink jobs on EMR. On my EMR cluster I can see the job logs under the S3 Log URI s3://aws-logs-xxxxx/elasticmapreduce/, and there are also logs under /usr/lib/flink/log/ on the master node. Because we provisioned only 20 GB for the root volume, the log files there (flink-flink-historyserver-xxxxx.log) easily push it to the limit.
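Before changing any configuration, it can help to confirm that the history-server logs really are what is filling the root volume. A minimal sketch, assuming the default EMR paths (adjust them if your cluster differs):

```shell
# Overall root-volume usage on the master node
df -h /

# Total size of the Flink log directory (suppress the error if the
# directory does not exist on this host)
du -sh /usr/lib/flink/log/ 2>/dev/null || echo "no Flink log dir on this host"

# Largest log files first, so the offender is easy to spot
ls -lhS /usr/lib/flink/log/ 2>/dev/null | head
```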
My question is:
Is there any way to automatically clean up or cap the logs under /usr/lib/flink/log/? Something like the following Spark History Server cleaner settings?
spark.history.fs.cleaner.enabled true
spark.history.fs.cleaner.maxAge 12h
spark.history.fs.cleaner.interval 1h
Here is my HistoryServer configuration in flink-conf.yaml:
# Directory to upload completed jobs to. Add this directory to the list of
# monitored directories of the HistoryServer as well (see below).
jobmanager.archive.fs.dir: hdfs:///completed-jobs/
# The address under which the web-based HistoryServer listens.
historyserver.web.address: 0.0.0.0
# The port under which the web-based HistoryServer listens.
historyserver.web.port: 8082
# Comma separated list of directories to monitor for completed jobs.
historyserver.archive.fs.dir: hdfs:///completed-jobs/
# Interval in milliseconds for refreshing the monitored directories.
historyserver.archive.fs.refresh-interval: 10000
Posted on 2021-09-24 03:17:50
You can control the log file location and the number of retained log files by setting env.log.dir and env.log.max in flink-conf.yaml. For finer-grained log configuration, edit the Log4j properties files in the conf folder.
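A minimal sketch of those settings, assuming the EMR defaults; env.log.dir and env.log.max are read by Flink's startup scripts, so the history server must be restarted for them to take effect:

```yaml
# flink-conf.yaml
# Write logs to a mount with more headroom than the 20 GB root volume
env.log.dir: /mnt/flink-logs
# Keep at most 5 rotated log files per process (history server included)
env.log.max: 5
```

Because the history server logs through Log4j, a size-based rolling policy in conf/log4j.properties bounds each file's size as well. The appender name below follows the stock Flink Log4j 2 properties layout (Flink 1.11+); check your own file for the actual appender name:

```properties
# conf/log4j.properties (Log4j 2 properties format)
appender.main.policies.type = Policies
appender.main.policies.size.type = SizeBasedTriggeringPolicy
appender.main.policies.size.size = 100MB
appender.main.strategy.type = DefaultRolloverStrategy
appender.main.strategy.max = 5
```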
For the full list of options, see:
https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/deployment/config/
https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/deployment/advanced/logging/
https://stackoverflow.com/questions/69306204