I have a docker-compose.yml with the images and configuration below:
version: '3'
services:
  spark-master:
    image: bde2020/spark-master:2.4.4-hadoop2.7
    container_name: spark-master
    ports:
      - "8080:8080"
      - "7077:7077"
    environment:
      - INIT_DAEMON_STEP=setup_spark
  spark-worker-1:
    image: bde2020/spark-worker:2.4.4-hadoop2.7
    container_name: spark-worker-1
    depends_on:
      - spark-master
    ports:
      - "8081:8081"
    environment:
      - "SPARK_MASTER=spark://spark-master:7077"

Here is the docker-compose up log -> https://jpst.it/1Xc4K
With this, the containers start and run fine; that is, the spark worker connects to the spark master without any problem. The problem now is that I created a drone.yml and added the same service components to it:
services:
  jce-cassandra:
    image: cassandra:3.0
    ports:
      - "9042:9042"
  jce-elastic:
    image: elasticsearch:5.6.16-alpine
    ports:
      - "9200:9200"
    environment:
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
  janusgraph:
    image: janusgraph/janusgraph:latest
    ports:
      - "8182:8182"
    environment:
      JANUS_PROPS_TEMPLATE: cassandra-es
      janusgraph.storage.backend: cql
      janusgraph.storage.hostname: jce-cassandra
      janusgraph.index.search.backend: elasticsearch
      janusgraph.index.search.hostname: jce-elastic
    depends_on:
      - jce-elastic
      - jce-cassandra
  spark-master:
    image: bde2020/spark-master:2.4.4-hadoop2.7
    container_name: spark-master
    ports:
      - "8080:8080"
      - "7077:7077"
    environment:
      - INIT_DAEMON_STEP=setup_spark
  spark-worker-1:
    image: bde2020/spark-worker:2.4.4-hadoop2.7
    container_name: spark-worker-1
    depends_on:
      - spark-master
    ports:
      - "8081:8081"
    environment:
      - "SPARK_MASTER=spark://spark-master:7077"

But here the spark worker does not connect to the spark master and throws an exception; here are the exception log details. Can someone tell me why I am facing this problem?
Note: I am trying to create these services in drone.yml for my integration tests.
Posted on 2019-12-18 00:50:49
Posting as an answer since this needs better formatting than a comment allows. The comments suggest adding a sleep. Assuming this is the Dockerfile (https://hub.docker.com/r/bde2020/spark-worker/dockerfile), you can add the sleep via the command:
spark-worker-1:
  image: bde2020/spark-worker:2.4.4-hadoop2.7
  container_name: spark-worker-1
  # wrap in bash -c so the shell actually chains the two commands with &&
  command: bash -c "sleep 10 && /bin/bash /worker.sh"
  depends_on:
    - spark-master
  ports:
    - "8081:8081"
  environment:
    - "SPARK_MASTER=spark://spark-master:7077"

Although sleep 10 is probably more than needed; if it works, try reducing it to sleep 5 or sleep 2.
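A fixed sleep is a race against the master's startup time. As an alternative sketch (not from the original answer), the worker could poll the master's port until it accepts connections before starting. This assumes the image has bash and uses /worker.sh as its entrypoint script, as in the bde2020 Dockerfile above:

```yaml
spark-worker-1:
  image: bde2020/spark-worker:2.4.4-hadoop2.7
  container_name: spark-worker-1
  # Hypothetical alternative to a fixed sleep: loop until a TCP connection
  # to spark-master:7077 succeeds (bash /dev/tcp), then start the worker.
  command: >
    bash -c 'until (echo > /dev/tcp/spark-master/7077) 2>/dev/null;
    do echo "waiting for spark-master..."; sleep 1; done;
    /bin/bash /worker.sh'
  depends_on:
    - spark-master
  ports:
    - "8081:8081"
  environment:
    - "SPARK_MASTER=spark://spark-master:7077"
```

The loop exits as soon as the master is listening, so the worker starts neither too early nor ten seconds late.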
https://stackoverflow.com/questions/59232036