
Permission denied: 'dbt_modules'

Stack Overflow user
Asked on 2022-06-07 04:46:33
1 answer · 379 views · 0 followers · Score: 1

This error occurs when running `dbt deps`. Am I missing something in the Dockerfile that would give access to `dbt_modules`? I can't find where it lives — I can't even find `dbt_modules` in the docs. I've included the code for several of the yml files. Thanks in advance.

Traceback

2022-06-07 04:34:59.121970 (MainThread): Running with dbt=0.21.1
2022-06-07 04:34:59.972911 (MainThread): You have an incompatible version of 'pyarrow' installed (6.0.1), please install a version that adheres to: 'pyarrow<3.1.0,>=3.0.0; extra == "pandas"'
2022-06-07 04:35:00.477470 (MainThread): running dbt with arguments Namespace(cls=<class 'dbt.task.deps.DepsTask'>, debug=False, defer=None, log_cache_events=False, log_format='default', partial_parse=None, profile=None, profiles_dir='/home/airflow/.dbt', project_dir=None, record_timing_info=None, rpc_method='deps', single_threaded=False, state=None, strict=False, target=None, test_new_parser=False, use_cache=True, use_colors=None, use_experimental_parser=False, vars='{}', warn_error=False, which='deps', write_json=True)
2022-06-07 04:35:00.478141 (MainThread): Tracking: tracking
2022-06-07 04:35:00.478667 (MainThread): Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'start', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7f2d0ddcee20>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7f2d0ddce6d0>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7f2d0de7d130>]}
2022-06-07 04:35:00.479294 (MainThread): Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'end', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7f2d0ddcee20>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7f2d0ddce6d0>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7f2d0de7d130>]}
2022-06-07 04:35:00.479750 (MainThread): Flushing usage events
2022-06-07 04:35:00.913755 (MainThread): Encountered an error:
2022-06-07 04:35:00.914481 (MainThread): [Errno 13] Permission denied: 'dbt_modules'
2022-06-07 04:35:00.916934 (MainThread): Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.8/site-packages/dbt/main.py", line 127, in main
    results, succeeded = handle_and_check(args)
  File "/home/airflow/.local/lib/python3.8/site-packages/dbt/main.py", line 205, in handle_and_check
    task, res = run_from_args(parsed)
  File "/home/airflow/.local/lib/python3.8/site-packages/dbt/main.py", line 258, in run_from_args
    results = task.run()
  File "/home/airflow/.local/lib/python3.8/site-packages/dbt/task/deps.py", line 46, in run
    system.make_directory(self.config.modules_path)
  File "/home/airflow/.local/lib/python3.8/site-packages/dbt/clients/system.py", line 109, in make_directory
    raise e
  File "/home/airflow/.local/lib/python3.8/site-packages/dbt/clients/system.py", line 103, in make_directory
    os.makedirs(path)
  File "/usr/local/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
PermissionError: [Errno 13] Permission denied: 'dbt_modules'

docker-compose.yaml

version: '3'
x-airflow-common:
  &airflow-common
  build: .
  # image: ${AIRFLOW_IMAGE_NAME:-apache/airflow:2.1.2}
  environment:
    &airflow-common-env
    AIRFLOW__CORE__EXECUTOR: CeleryExecutor
    AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__RESULT_BACKEND: db+postgresql://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__BROKER_URL: redis://:@redis:6379/0
    AIRFLOW__CORE__FERNET_KEY: ''
    AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION: 'true'
    AIRFLOW__CORE__LOAD_EXAMPLES: 'true'
    AIRFLOW__API__AUTH_BACKEND: 'airflow.api.auth.backend.basic_auth'
    _PIP_ADDITIONAL_REQUIREMENTS: ${_PIP_ADDITIONAL_REQUIREMENTS:-}
  volumes:
    - ./dags:/opt/airflow/dags
    - ./logs:/opt/airflow/logs
    - ./plugins:/opt/airflow/plugins
    - ./config/airflow.cfg:/opt/airflow/airflow.cfg
    - ./dbt:/opt/airflow/dbt
    - ~/.dbt:/home/airflow/.dbt:ro
    - ./dags:/dags
  user: "${AIRFLOW_UID:-50000}:${AIRFLOW_GID:-50000}"

dbt_project.yml

target-path: "target"  # directory which will store compiled SQL files
clean-targets:         # directories to be removed by `dbt clean`
  - "target"
  - "dbt_modules"
  - "dbt_packages"

packages.yml

packages:
  - package: fishtown-analytics/dbt_utils
    version: 0.6.4

Dockerfile

ARG AIRFLOW_BASE_IMAGE
FROM ${AIRFLOW_BASE_IMAGE}

USER airflow
RUN pip install dbt \
                apache-airflow-providers-microsoft-azure==3.7.0 \
                apache-airflow-providers-snowflake \
                riotwatcher \
                pandas

1 Answer

Stack Overflow user

Accepted answer

Answered on 2022-06-07 16:32:55

When you run `dbt deps` (which installs dbt packages in your project), dbt creates a `dbt_modules` directory (renamed `dbt_packages` in v1.0) inside your dbt project directory.

It looks like you are mounting your dbt project directory as a volume. Most likely, the user running `dbt deps` (as an Airflow task) is not authorized to write to that volume.
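If that is the case, one fix (a sketch on the host side, assuming the default UID/GID of 50000 from the `user:` line in docker-compose.yaml) is to give that user ownership of the host directory backing the volume:

```shell
# Run on the Docker host, from the directory containing docker-compose.yaml.
# Gives the Airflow container user ownership of the mounted dbt project
# directory, so `dbt deps` can create dbt_modules/ inside it.
# AIRFLOW_UID / AIRFLOW_GID fall back to 50000 when unset, matching
# the compose file's `user: "${AIRFLOW_UID:-50000}:${AIRFLOW_GID:-50000}"`.
sudo chown -R "${AIRFLOW_UID:-50000}:${AIRFLOW_GID:-50000}" ./dbt
```

After changing ownership, re-run the Airflow task that invokes `dbt deps`.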

Alternatively, you can configure `modules-path` (`packages-install-path` after 1.0) in your `dbt_project.yml` file to write to a local directory instead of the protected volume. See the dbt docs for this setting.
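A minimal sketch of that configuration (the `/tmp` path is an assumption — any container-local, writable directory outside the mounted volume will do):

```yaml
# dbt_project.yml — dbt < 1.0 (e.g. the 0.21.1 shown in the traceback)
modules-path: "/tmp/dbt_modules"

# dbt >= 1.0 uses this key instead:
# packages-install-path: "/tmp/dbt_packages"
```

If you redirect the install path, also update the `clean-targets` list in dbt_project.yml so `dbt clean` removes the new directory rather than the old one.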

Score: 1
Page content provided by Stack Overflow; translation supported by Tencent Cloud's IT-domain translation engine.
Original link:

https://stackoverflow.com/questions/72526035
