I have installed an EFK stack to log nginx access logs.
With a fresh install, I was able to send data from Fluentd to Elasticsearch without any problem. Then I installed Search Guard to add authentication to Elasticsearch and Kibana. I can now log in to both Kibana and Elasticsearch with the Search Guard demo user credentials.
The problem is that Fluentd can no longer connect to Elasticsearch. The td-agent log shows the following message:

2018-07-19 15:20:34 +0600 [warn]: #0 failed to flush the buffer. retry_time=5 next_retry_seconds=2018-07-19 15:20:34 +0600 chunk="57156af05dd7bbc43d0b1323fddb2cd0" error_class=Fluent::Plugin::ElasticsearchOutput::ConnectionFailure error="Can not reach Elasticsearch cluster ({:host=>\"<elasticsearch-ip>\", :port=>9200, :scheme=>\"http\", :user=>\"logstash\", :password=>\"obfuscated\"})!"

Here is my Fluentd configuration:
<source>
  @type forward
</source>

<match user_count.**>
  @type copy
  <store>
    @type elasticsearch
    host https://<elasticsearch-ip>
    port 9200
    ssl_verify false
    scheme https
    user "logstash"
    password "<logstash-password>"
    index_name "custom_user_count"
    include_tag_key true
    tag_key "custom_user_count"
    logstash_format true
    logstash_prefix "custom_user_count"
    type_name "custom_user_count"
    utc_index false
    <buffer>
      flush_interval 2s
    </buffer>
  </store>
</match>

And my sg_roles.yml:
sg_logstash:
  cluster:
    - CLUSTER_MONITOR
    - CLUSTER_COMPOSITE_OPS
    - indices:admin/template/get
    - indices:admin/template/put
  indices:
    'custom*':
      '*':
        - CRUD
        - CREATE_INDEX
    'logstash-*':
      '*':
        - CRUD
        - CREATE_INDEX
    '*beat*':
      '*':
        - CRUD
        - CREATE_INDEX

Can anyone help me with this?
https://stackoverflow.com/questions/51419072