When trying to connect Dockerized FSCrawler to Elasticsearch, I received the following errors:

f.p.e.c.f.c.ElasticsearchClientManager Failed to create elasticsearch client, disabling crawler...
f.p.e.c.f.FsCrawler Fatal error received while running the crawler: Connection refused
Posted on 2020-08-10 22:52:31
When fscrawler is run for the first time (i.e. docker-compose run fscrawler), it creates /config/{fscrawer_job}/_settings.yml with the following default settings:
elasticsearch:
  nodes:
    - url: "http://127.0.0.1:9200"

This causes fscrawler to try to connect to localhost (i.e. 127.0.0.1). However, when fscrawler is running inside a docker container, this fails because it is trying to reach the container's own localhost. In my case this was especially confusing because elasticsearch was reachable as localhost, but on the physical machine's localhost, not the container's. Changing the url lets fscrawler connect to the network address where elasticsearch actually resides:
elasticsearch:
  nodes:
    - url: "http://elasticsearch:9200"
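The difference between the two URLs comes down to which network namespace "localhost" refers to. As a minimal sketch of the check fscrawler is effectively doing (a plain TCP connect; the helper below is hypothetical and not part of fscrawler), you can see why 127.0.0.1 is refused inside the container while the compose service name works:

```python
import socket

def can_connect(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds,
    False on refusal, timeout, or DNS failure."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Inside the fscrawler container, 127.0.0.1 is the container's own loopback,
# where nothing listens on 9200, so the connect is refused -- the same
# "Connection refused" fscrawler logs. The name "elasticsearch" instead
# resolves (via Docker's embedded DNS on the esnet network) to the
# elasticsearch container, where the node actually listens.
print(can_connect("127.0.0.1", 9200))      # False when run inside the container
print(can_connect("elasticsearch", 9200))  # True once the node is up
```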
I used the following docker image: https://hub.docker.com/r/toto1310/fscrawler
# FILE: docker-compose.yml
version: '2.2'
services:
  # FSCrawler
  fscrawler:
    image: toto1310/fscrawler
    container_name: fscrawler
    volumes:
      - ${PWD}/config:/root/.fscrawler
      - ${PWD}/data:/tmp/es
    networks:
      - esnet
    command: fscrawler job_name

  # Elasticsearch Cluster
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.3.2
    container_name: elasticsearch
    environment:
      - node.name=elasticsearch
      - discovery.seed_hosts=elasticsearch2
      - cluster.initial_master_nodes=elasticsearch,elasticsearch2
      - cluster.name=docker-cluster
      - bootstrap.memory_lock=true
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
    ulimits:
      memlock:
        soft: -1
        hard: -1
    volumes:
      - esdata01:/usr/share/elasticsearch/data
    ports:
      - 9200:9200
    networks:
      - esnet

  elasticsearch2:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.3.2
    container_name: elasticsearch2
    environment:
      - node.name=elasticsearch2
      - discovery.seed_hosts=elasticsearch
      - cluster.initial_master_nodes=elasticsearch,elasticsearch2
      - cluster.name=docker-cluster
      - bootstrap.memory_lock=true
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
    ulimits:
      memlock:
        soft: -1
        hard: -1
    volumes:
      - esdata02:/usr/share/elasticsearch/data
    networks:
      - esnet

volumes:
  esdata01:
    driver: local
  esdata02:
    driver: local

networks:
  esnet:
Run docker-compose up elasticsearch elasticsearch2 to bring up the elasticsearch nodes.
Run docker-compose run fscrawler to create _settings.yml.
Edit _settings.yml to:
elasticsearch:
  nodes:
    - url: "http://elasticsearch:9200"

Then start fscrawler: docker-compose up fscrawler
https://stackoverflow.com/questions/63349102