I want to mount data from a remote directory locally.
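(For reference: vieux/sshfs is a Docker volume plugin, so it has to be installed once on the host before either of the commands below will work; presumably something like the following, where --grant-all-permissions just skips the interactive privilege prompt:

docker plugin install --grant-all-permissions vieux/sshfs
)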
docker run -d -it --name redis \
  --mount src=airflow-dags,target=/app/dags,type=volume,volume-driver=vieux/sshfs \
  -p 6379:6379 \
  redis

This is my docker-compose file:
version: '3'
x-airflow-common:
  &airflow-common
  image: ${AIRFLOW_IMAGE_NAME:-apache/airflow:2.3.3}
  # build: .
  environment:
    &airflow-common-env
    AIRFLOW__CORE__EXECUTOR: CeleryExecutor
    AIRFLOW__DATABASE__SQL_ALCHEMY_CONN: mysql+mysqldb://root:xxxx@xxxxxx:3306/airflow
    # For backward compatibility, with Airflow <2.3
    AIRFLOW__CORE__SQL_ALCHEMY_CONN: mysql+mysqldb://root:xxxxx@xxxxx:3306/airflow
    AIRFLOW__CELERY__RESULT_BACKEND: db+mysql://root:xxxxx@xxxx:3306/airflow
    AIRFLOW__CELERY__BROKER_URL: redis://:@xxxxx:6379/0
    AIRFLOW__CORE__FERNET_KEY: ''
    AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION: 'true'
    AIRFLOW__CORE__LOAD_EXAMPLES: 'true'
    AIRFLOW__API__AUTH_BACKENDS: 'airflow.api.auth.backend.basic_auth'
    _PIP_ADDITIONAL_REQUIREMENTS: ${_PIP_ADDITIONAL_REQUIREMENTS:-}
  volumes:
    - airflow:/opt/airflow/dags
    - ./logs:/opt/airflow/logs
    - ./plugins:/opt/airflow/plugins
  user: "${AIRFLOW_UID:-50000}:0"

services:
  airflow-scheduler:
    <<: *airflow-common
    command: scheduler
    healthcheck:
      test: ["CMD-SHELL", 'airflow jobs check --job-type SchedulerJob --hostname "$${HOSTNAME}"']
      interval: 10s
      timeout: 10s
      retries: 5
    restart: always
    depends_on:
      airflow-init:
        condition: service_completed_successfully

  airflow-worker:
    <<: *airflow-common
    command: celery worker
    healthcheck:
      test:
        - "CMD-SHELL"
        - 'celery --app airflow.executors.celery_executor.app inspect ping -d "celery@$${HOSTNAME}"'
      interval: 10s
      timeout: 10s
      retries: 5
    environment:
      <<: *airflow-common-env
      # Required to handle warm shutdown of the celery workers properly
      # See https://airflow.apache.org/docs/docker-stack/entrypoint.html#signal-propagation
      DUMB_INIT_SETSID: "0"
    restart: always
    depends_on:
      airflow-init:
        condition: service_completed_successfully

volumes:
  airflow:
    name: airflow-dags
    external: true

I created the volume externally:
docker volume create --driver vieux/sshfs \
  -o sshcmd=xxxx@xxxxx:xxxxx \
  -o password=xxxxxxxxxx \
  airflow-dags

The mount succeeds when I use the docker command directly, but it fails with docker-compose: the file permissions come out wrong and I cannot access the files.
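One way to narrow a problem like this down is to check what the volume was actually created with, and (if the services come up) what ownership the files get inside a container; the service name and path below are taken from the compose file above:

# show the driver and options recorded for the volume
docker volume inspect airflow-dags

# list numeric UID/GID ownership of the mounted DAGs inside the worker
docker compose exec airflow-worker ls -ln /opt/airflow/dags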
I found the problem: the container's permissions were set incorrectly, so it could not access the folder. Here is the volume definition in my compose file:
volumes:
  airflow:
    name: airflow_airflow
    driver: vieux/sshfs
    driver_opts:
      sshcmd: yourusername@yourhost:/dir/XXX/XXX
      password: xxxxxx

You need to set the user field in your compose file to 0 so that the container can access your files:
user: "0"
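In this compose file the user is set on the shared x-airflow-common block, so that is presumably where the override goes; a minimal sketch showing only the changed field:

x-airflow-common:
  &airflow-common
  image: ${AIRFLOW_IMAGE_NAME:-apache/airflow:2.3.3}
  # ... environment and volumes as above ...
  user: "0"   # UID 0 (root) instead of the default "${AIRFLOW_UID:-50000}:0"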
You can also add the field allow_other: "" to the driver options:

volumes:
  airflow:
    name: airflow_airflow
    driver: vieux/sshfs
    driver_opts:
      allow_other: ""
      sshcmd: xxxx@xxxxxxx:/root/airflow/dags
      password: xxxx

https://stackoverflow.com/questions/73402681
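As I understand it, allow_other helps because sshfs is a FUSE filesystem, and a FUSE mount is by default only accessible to the user that mounted it; allow_other opens it up to other users, such as the non-root UID the Airflow containers normally run as. Note that Docker will not change the options of an existing volume, so after editing driver_opts the volume likely needs to be removed and recreated, roughly:

docker compose down
docker volume rm airflow_airflow
docker compose up -d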