
How can I stop docker-compose from ignoring my Dockerfile?
Stack Overflow user
Asked on 2021-07-28 00:20:17
1 answer · 243 views · 0 followers · Score 0

I am trying to modify the Airflow docker-compose setup with an extended image built from a Dockerfile, so that dbt is installed in the containers. However, docker-compose seems to ignore the Dockerfile: the various Airflow containers start and run normally, but none of them has dbt (fully) installed. I get the following error:

root@42b2358a7792:/opt/airflow# dbt --version
Traceback (most recent call last):
  File "/home/airflow/.local/bin/dbt", line 5, in <module>
    from dbt.main import main
ModuleNotFoundError: No module named 'dbt'

The docker-compose.yml file:

version:                                        "3.7"
#https://github.com/compose-spec/compose-spec/blob/master/spec.md#using-extensions-as-fragments

# Airflow extensions
x-airflow-common:                               &airflow-common
  build:                                        .
  environment:                                  &airflow-common-env
    AIRFLOW__CORE__EXECUTOR:                    CeleryExecutor
    AIRFLOW__CORE__SQL_ALCHEMY_CONN:            postgresql+psycopg2://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__RESULT_BACKEND:            db+postgresql://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__BROKER_URL:                redis://:@redis:6379/0
    AIRFLOW__CORE__FERNET_KEY:                  ''
    AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION: 'true'
    AIRFLOW__CORE__LOAD_EXAMPLES:               'true'
    AIRFLOW__API__AUTH_BACKEND:                 'airflow.api.auth.backend.basic_auth'
  volumes:
    - ./dags:/opt/airflow/dags
    - ./logs:/opt/airflow/logs
    - ./plugins:/opt/airflow/plugins

  user:                                         "${AIRFLOW_UID:-50000}:${AIRFLOW_GID:-50000}"
  depends_on:
    redis:
      condition:                                service_healthy
    postgres:
      condition:                                service_healthy


services:
 
  # Database
  postgres:
      image:                                    postgres:13
      environment:
        POSTGRES_USER:                          airflow
        POSTGRES_PASSWORD:                      airflow
        POSTGRES_DB:                            airflow
      volumes:
        - postgres-db-volume:/var/lib/postgresql/data
      healthcheck:
        test:                                   ["CMD", "pg_isready", "-U", "airflow"]
        interval:                               5s
        retries:                                5
      restart:                                  always
      ports:
              - 5432:5432

  # Airflow services
  redis:
      image:                                    redis:latest
      container_name:                           airflow-redis
      ports:
        - 6379:6379
      healthcheck:
        test:                                   ["CMD", "redis-cli", "ping"]
        interval:                               5s
        timeout:                                30s
        retries:                                50
      restart:                                  always
  airflow-webserver:
      <<:                                       *airflow-common
      container_name:                           airflow-webserver
      command:                                  webserver
      ports:
        - 8080:8080
      healthcheck:
        test:                                   ["CMD", "curl", "--fail", "http://localhost:8080/health"]
        interval:                               10s
        timeout:                                10s
        retries:                                5
      restart:                                  always
  airflow-scheduler:
      <<:                                       *airflow-common
      container_name:                           airflow-scheduler
      command:                                  scheduler
      healthcheck:
        test:                                   ["CMD-SHELL", 'airflow jobs check --job-type SchedulerJob --hostname "$${HOSTNAME}"']
        interval:                               10s
        timeout:                                10s
        retries:                                5
      restart:                                  always
  airflow-worker:
      <<:                                       *airflow-common
      container_name:                           airflow-worker
      command:                                  celery worker
      healthcheck:
        test:
          - "CMD-SHELL"
          - 'celery --app airflow.executors.celery_executor.app inspect ping -d "celery@$${HOSTNAME}"'
        interval:                               10s
        timeout:                                10s
        retries:                                5
      restart:                                  always

  airflow-init:
      <<:                                       *airflow-common
      container_name:                           airflow-init
      command:                                  version
      environment:
        <<:                                     *airflow-common-env
        _AIRFLOW_DB_UPGRADE:                    'true'
        _AIRFLOW_WWW_USER_CREATE:               'true'
        _AIRFLOW_WWW_USER_USERNAME:             ${_AIRFLOW_WWW_USER_USERNAME:-airflow}
        _AIRFLOW_WWW_USER_PASSWORD:             ${_AIRFLOW_WWW_USER_PASSWORD:-airflow}



volumes:
  workspace:
     name:                                      ${WORKSPACE_DOCKER_MOUNT}
  data:
     name:                                      ${DATA_DOCKER_MOUNT}
  db:
     name:                                      ${DB_DOCKER_MOUNT}
  postgres-db-volume:

The Dockerfile:

FROM apache/airflow:2.1.0
RUN apt-get update \
  && apt-get install -y --no-install-recommends \
         vim \
  && apt-get autoremove -yqq --purge \
  && apt-get clean \
  && rm -rf /var/lib/apt/lists/*
RUN apt-get install -y git libpq-dev python-dev python3-pip
RUN apt-get remove python-cffi
RUN pip install --upgrade cffi
RUN pip install cryptography~=3.4
RUN pip install dbt==0.19.0

I have already tried building the image directly at the service level, and I have tried changing the Dockerfile so that it only runs pip install dbt==0.19.0, but neither works. The only alternative, which I would like to avoid, is installing dbt manually.

Any idea why docker-compose is ignoring the Dockerfile?


1 Answer

Stack Overflow user

Answered on 2021-07-28 01:21:39

There are several problems with your Dockerfile.

I recommend running docker build . once after preparing the Dockerfile, to check that it builds correctly on its own.
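A related point, since the question is about docker-compose "ignoring" the Dockerfile: compose reuses a previously built image unless you explicitly ask it to rebuild. A minimal sketch of the commands involved (the tag my-airflow is just an illustrative name):

```shell
# Build the image once by hand to surface Dockerfile errors early
docker build -t my-airflow .

# Force docker-compose to rebuild the service images instead of
# reusing cached ones from an earlier run
docker-compose build --no-cache

# Or rebuild and start the stack in one step
docker-compose up --build
```

If the standalone docker build fails, compose silently keeps using whatever image it already has, which matches the symptom of a stack that runs but has no dbt installed.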

It will fail because you do not temporarily switch to the root user when running apt-get, as described here: https://airflow.apache.org/docs/docker-stack/build.html#adding-new-apt-package

Moreover, even if you fix that, your Dockerfile will still fail because your apt-get install command is split in two. The first one removes all the apt caches (and doing that just to install vim is most likely not what you intended). The second apt-get install will then fail because there is no apt cache left, so it cannot find your packages. It therefore seems that you never actually managed to build your Docker image.

It will also fail because python-cffi is not installed in the Airflow image by default, so trying to remove it will fail.

I made a few small modifications to your file to make it build (though I am not sure the resulting image is exactly what you want):

FROM apache/airflow:2.1.0
USER root
RUN apt-get update \
  && apt-get install -y git libpq-dev python-dev python3-pip \
  && apt-get autoremove -yqq --purge \
  && apt-get clean \
  && rm -rf /var/lib/apt/lists/*
USER airflow
RUN pip install --upgrade cffi
RUN pip install cryptography~=3.4 dbt==0.19.0

Another problem with your Dockerfile is that it is not optimized for size (as all production images should be). By adding python-dev you pull in many packages (compilers and the like), which makes your image far larger than it needs to be. Once you get it working, I recommend following the "customizing the image" route (https://airflow.apache.org/docs/docker-stack/build.html#customizing-the-image), which should give you a much smaller image.
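The customization route mentioned above amounts to passing build args to the official Airflow Dockerfile instead of layering new RUN steps on top of the published image. A rough sketch, assuming the build args documented for the Airflow docker-stack (check the linked docs for the exact names supported by your Airflow version):

```shell
# Build a customized image from the official Airflow Dockerfile,
# installing dbt at build time rather than extending the image afterwards.
# ADDITIONAL_PYTHON_DEPS is a build arg of the official Dockerfile;
# verify it against the docs for your Airflow version.
docker build . \
  --build-arg AIRFLOW_VERSION="2.1.0" \
  --build-arg ADDITIONAL_PYTHON_DEPS="dbt==0.19.0" \
  --tag my-airflow-custom:2.1.0
```

Because the dependencies are installed inside the official multi-stage build, the compilers and dev headers needed for installation do not end up in the final image.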

The image built as above is about 1.5 GB. The original Airflow image (highly optimized for size) is 882 MB. So by adding python-dev and installing dbt, its size grew by ~70% (!). If you use the customization route, it should stay under 1 GB.

You can read more about why Docker image size matters, for example here: https://semaphoreci.com/blog/2018/03/14/docker-image-size.html

Score 3
Original content provided by Stack Overflow; translation supported by Tencent Cloud's IT-domain translation engine.
Original link:
https://stackoverflow.com/questions/68548506
