
Disabling useless logging/output from TFX/setuptools

Stack Overflow user
Asked on 2021-10-07 17:00:05
1 answer · 52 views · 0 followers · score 1

I have been using TensorFlow Extended for a month and a half now, and one thing keeps bothering me: the logging and the mountains of stdout output coming from TFX pipelines (unrelated to the orchestrator).

My end goal is to see only the logs/stdout that I define myself, since that makes unit tests cleaner, debugging faster, and log storage cheaper. So far I have figured out how to suppress the TFX logs and the logs of some other dependencies with the code below.

Suppressing TFX logs:

import absl.logging
import logging
absl.logging.set_verbosity(absl.logging.FATAL)
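
absl's verbosity setting only affects messages routed through absl's own logger. Dependencies that log through the standard `logging` module can be quieted separately by raising the root logger's threshold; a minimal sketch (not part of the original question):

```python
import logging

# Raise the root logger threshold so only CRITICAL records get through;
# this affects libraries that log via the standard logging module.
logging.getLogger().setLevel(logging.CRITICAL)
```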

Suppressing TensorFlow logs (must happen before TensorFlow is imported):

import os
# "3" keeps only FATAL messages (filters INFO, WARNING and ERROR)
os.environ["TF_CPP_MIN_LOG_LEVEL"] = "3"
import tensorflow as tf

Google Logging (glog) emits the following output many times:

WARNING: Logging before InitGoogleLogging() is written to STDERR
I1007 12:09:16.761006 802336 rdbms_metadata_access_object.cc:686] No property is defined for the Type
I1007 12:09:16.797765 802336 rdbms_metadata_access_object.cc:686] No property is defined for the Type
I1007 12:09:16.826467 802336 rdbms_metadata_access_object.cc:686] No property is defined for the Type
I1007 12:09:16.862852 802336 rdbms_metadata_access_object.cc:686] No property is defined for the Type
I1007 12:09:16.907064 802336 rdbms_metadata_access_object.cc:686] No property is defined for the Type
I1007 12:09:18.094507 802336 rdbms_metadata_access_object.cc:686] No property is defined for the Type
I1007 12:09:18.169688 802336 rdbms_metadata_access_object.cc:686] No property is defined for the Type

These are suppressed like so:

# glog severities: 0=INFO, 1=WARNING, 2=ERROR, 3=FATAL
os.environ["GLOG_minloglevel"] = "3"

Setuptools (I believe) causes the following warning to appear many times:

warning: install_lib: byte-compiling is disabled, skipping.

which I suppress with:

os.environ["PYTHONDONTWRITEBYTECODE"] = '0'

The problem is that, after all of this, there is still a mountain of stdout I don't care about and don't know how to suppress. Below is a "small" snippet of the remaining output.

running bdist_wheel
running build
running build_py
creating build
creating build/lib
copying penguin_utils_cloud_tuner.py -> build/lib
copying penguin_utils_keras.py -> build/lib
copying penguin_utils_flax_experimental.py -> build/lib
copying penguin_utils_base.py -> build/lib
copying penguin_pipeline_local.py -> build/lib
copying penguin_pipeline_kubeflow.py -> build/lib
copying penguin_pipeline_local_infraval.py -> build/lib
copying penguin_pipeline_local_e2e_test.py -> build/lib
copying penguin_pipeline_kubeflow_test.py -> build/lib
copying penguin_pipeline_local_infraval_e2e_test.py -> build/lib
copying penguin_pipeline_kubeflow_e2e_test.py -> build/lib
installing to /tmp/tmpcop1boxs
running install
running install_lib
copying build/lib/penguin_utils_cloud_tuner.py -> /tmp/tmpcop1boxs
copying build/lib/penguin_utils_keras.py -> /tmp/tmpcop1boxs
copying build/lib/penguin_utils_flax_experimental.py -> /tmp/tmpcop1boxs
copying build/lib/penguin_utils_base.py -> /tmp/tmpcop1boxs
copying build/lib/penguin_pipeline_local.py -> /tmp/tmpcop1boxs
copying build/lib/penguin_pipeline_kubeflow.py -> /tmp/tmpcop1boxs
copying build/lib/penguin_pipeline_local_infraval.py -> /tmp/tmpcop1boxs
copying build/lib/penguin_pipeline_local_e2e_test.py -> /tmp/tmpcop1boxs
copying build/lib/penguin_pipeline_kubeflow_test.py -> /tmp/tmpcop1boxs
copying build/lib/penguin_pipeline_local_infraval_e2e_test.py -> /tmp/tmpcop1boxs
copying build/lib/penguin_pipeline_kubeflow_e2e_test.py -> /tmp/tmpcop1boxs
running install_egg_info
running egg_info
creating tfx_user_code_Transform.egg-info
writing tfx_user_code_Transform.egg-info/PKG-INFO
writing dependency_links to tfx_user_code_Transform.egg-info/dependency_links.txt
writing top-level names to tfx_user_code_Transform.egg-info/top_level.txt
writing manifest file 'tfx_user_code_Transform.egg-info/SOURCES.txt'
reading manifest file 'tfx_user_code_Transform.egg-info/SOURCES.txt'
writing manifest file 'tfx_user_code_Transform.egg-info/SOURCES.txt'
Copying tfx_user_code_Transform.egg-info to /tmp/tmpcop1boxs/tfx_user_code_Transform-0.0+cfe5891c75179075cb76fc27c0a1e243f1f37e33f5b5fe3d0a6572cc4e9b3fa0-py3.7.egg-info
running install_scripts
creating /tmp/tmpcop1boxs/tfx_user_code_Transform-0.0+cfe5891c75179075cb76fc27c0a1e243f1f37e33f5b5fe3d0a6572cc4e9b3fa0.dist-info/WHEEL
creating '/tmp/tmpbx4f359i/tfx_user_code_Transform-0.0+cfe5891c75179075cb76fc27c0a1e243f1f37e33f5b5fe3d0a6572cc4e9b3fa0-py3-none-any.whl' and adding '/tmp/tmpcop1boxs' to it
adding 'penguin_pipeline_kubeflow.py'
adding 'penguin_pipeline_kubeflow_e2e_test.py'
adding 'penguin_pipeline_kubeflow_test.py'
adding 'penguin_pipeline_local.py'
adding 'penguin_pipeline_local_e2e_test.py'
adding 'penguin_pipeline_local_infraval.py'
adding 'penguin_pipeline_local_infraval_e2e_test.py'
adding 'penguin_utils_base.py'
adding 'penguin_utils_cloud_tuner.py'
adding 'penguin_utils_flax_experimental.py'
adding 'penguin_utils_keras.py'
adding 'tfx_user_code_Transform-0.0+cfe5891c75179075cb76fc27c0a1e243f1f37e33f5b5fe3d0a6572cc4e9b3fa0.dist-info/METADATA'
adding 'tfx_user_code_Transform-0.0+cfe5891c75179075cb76fc27c0a1e243f1f37e33f5b5fe3d0a6572cc4e9b3fa0.dist-info/WHEEL'
adding 'tfx_user_code_Transform-0.0+cfe5891c75179075cb76fc27c0a1e243f1f37e33f5b5fe3d0a6572cc4e9b3fa0.dist-info/top_level.txt'
adding 'tfx_user_code_Transform-0.0+cfe5891c75179075cb76fc27c0a1e243f1f37e33f5b5fe3d0a6572cc4e9b3fa0.dist-info/RECORD'
removing /tmp/tmpcop1boxs
running bdist_wheel
running build
running build_py
creating build
creating build/lib
copying penguin_utils_cloud_tuner.py -> build/lib
copying penguin_utils_keras.py -> build/lib
copying penguin_utils_flax_experimental.py -> build/lib
copying penguin_utils_base.py -> build/lib
copying penguin_pipeline_local.py -> build/lib
copying penguin_pipeline_kubeflow.py -> build/lib
copying penguin_pipeline_local_infraval.py -> build/lib
copying penguin_pipeline_local_e2e_test.py -> build/lib
copying penguin_pipeline_kubeflow_test.py -> build/lib
copying penguin_pipeline_local_infraval_e2e_test.py -> build/lib
copying penguin_pipeline_kubeflow_e2e_test.py -> build/lib
installing to /tmp/tmp61fu93hi
running install
running install_lib
copying build/lib/penguin_utils_cloud_tuner.py -> /tmp/tmp61fu93hi
copying build/lib/penguin_utils_keras.py -> /tmp/tmp61fu93hi
copying build/lib/penguin_utils_flax_experimental.py -> /tmp/tmp61fu93hi
copying build/lib/penguin_utils_base.py -> /tmp/tmp61fu93hi
copying build/lib/penguin_pipeline_local.py -> /tmp/tmp61fu93hi
copying build/lib/penguin_pipeline_kubeflow.py -> /tmp/tmp61fu93hi
copying build/lib/penguin_pipeline_local_infraval.py -> /tmp/tmp61fu93hi
copying build/lib/penguin_pipeline_local_e2e_test.py -> /tmp/tmp61fu93hi
copying build/lib/penguin_pipeline_kubeflow_test.py -> /tmp/tmp61fu93hi
copying build/lib/penguin_pipeline_local_infraval_e2e_test.py -> /tmp/tmp61fu93hi
copying build/lib/penguin_pipeline_kubeflow_e2e_test.py -> /tmp/tmp61fu93hi
running install_egg_info
running egg_info
creating tfx_user_code_Trainer.egg-info
writing tfx_user_code_Trainer.egg-info/PKG-INFO
writing dependency_links to tfx_user_code_Trainer.egg-info/dependency_links.txt
writing top-level names to tfx_user_code_Trainer.egg-info/top_level.txt
writing manifest file 'tfx_user_code_Trainer.egg-info/SOURCES.txt'
reading manifest file 'tfx_user_code_Trainer.egg-info/SOURCES.txt'
writing manifest file 'tfx_user_code_Trainer.egg-info/SOURCES.txt'
Copying tfx_user_code_Trainer.egg-info to /tmp/tmp61fu93hi/tfx_user_code_Trainer-0.0+cfe5891c75179075cb76fc27c0a1e243f1f37e33f5b5fe3d0a6572cc4e9b3fa0-py3.7.egg-info
running install_scripts
creating /tmp/tmp61fu93hi/tfx_user_code_Trainer-0.0+cfe5891c75179075cb76fc27c0a1e243f1f37e33f5b5fe3d0a6572cc4e9b3fa0.dist-info/WHEEL
creating '/tmp/tmp07op38gs/tfx_user_code_Trainer-0.0+cfe5891c75179075cb76fc27c0a1e243f1f37e33f5b5fe3d0a6572cc4e9b3fa0-py3-none-any.whl' and adding '/tmp/tmp61fu93hi' to it
adding 'penguin_pipeline_kubeflow.py'
adding 'penguin_pipeline_kubeflow_e2e_test.py'
adding 'penguin_pipeline_kubeflow_test.py'
adding 'penguin_pipeline_local.py'
adding 'penguin_pipeline_local_e2e_test.py'
adding 'penguin_pipeline_local_infraval.py'
adding 'penguin_pipeline_local_infraval_e2e_test.py'
adding 'penguin_utils_base.py'
adding 'penguin_utils_cloud_tuner.py'
adding 'penguin_utils_flax_experimental.py'
adding 'penguin_utils_keras.py'
adding 'tfx_user_code_Trainer-0.0+cfe5891c75179075cb76fc27c0a1e243f1f37e33f5b5fe3d0a6572cc4e9b3fa0.dist-info/METADATA'
adding 'tfx_user_code_Trainer-0.0+cfe5891c75179075cb76fc27c0a1e243f1f37e33f5b5fe3d0a6572cc4e9b3fa0.dist-info/WHEEL'
adding 'tfx_user_code_Trainer-0.0+cfe5891c75179075cb76fc27c0a1e243f1f37e33f5b5fe3d0a6572cc4e9b3fa0.dist-info/top_level.txt'
adding 'tfx_user_code_Trainer-0.0+cfe5891c75179075cb76fc27c0a1e243f1f37e33f5b5fe3d0a6572cc4e9b3fa0.dist-info/RECORD'
removing /tmp/tmp61fu93hi
Processing /tmp/penguin_pipeline_local_e2e_test1ysjxxgo/tmpfld7ezwr/testPenguinPipelineLocal0/tfx/pipelines/penguin_test/_wheels/tfx_user_code_Transform-0.0+cfe5891c75179075cb76fc27c0a1e243f1f37e33f5b5fe3d0a6572cc4e9b3fa0-py3-none-any.whl
Installing collected packages: tfx-user-code-Transform
Successfully installed tfx-user-code-Transform-0.0+cfe5891c75179075cb76fc27c0a1e243f1f37e33f5b5fe3d0a6572cc4e9b3fa0
Processing /tmp/penguin_pipeline_local_e2e_test1ysjxxgo/tmpfld7ezwr/testPenguinPipelineLocal0/tfx/pipelines/penguin_test/_wheels/tfx_user_code_Transform-0.0+cfe5891c75179075cb76fc27c0a1e243f1f37e33f5b5fe3d0a6572cc4e9b3fa0-py3-none-any.whl
Installing collected packages: tfx-user-code-Transform

I have tried redirecting stdout and stderr, without success. For example:

with contextlib.redirect_stdout(open(os.devnull, 'w')):
    LocalDagRunner().run(pipeline)

with contextlib.redirect_stderr(open(os.devnull, 'w')):
    LocalDagRunner().run(pipeline)

I realize the remaining output looks like that of pip install or a similar build process, but I haven't found a programmatic way to suppress it. Here: How do I make python setup.py test -q quieter?, they use setuptools and python setup.py -q test to quiet their own project's output. Since I'm using TFX, I'm not the one issuing the call that produces the output, so is there another way to do this? Perhaps another environment variable?
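
One hedged guess at why the redirects above fail: TFX builds the user-code wheel in a separate process, and contextlib.redirect_stdout only rebinds Python's sys.stdout inside the current process, so it never touches output that subprocesses (or C/C++ code) write to the underlying file descriptor. A descriptor-level redirect might catch it; a sketch, not verified against TFX:

```python
import contextlib
import os
import sys

@contextlib.contextmanager
def silence_fd(fd):
    """Point a raw file descriptor (1=stdout, 2=stderr) at /dev/null.

    Unlike contextlib.redirect_stdout, which only rebinds sys.stdout inside
    the current Python process, this silences anything written to the
    descriptor itself, including output from subprocesses and C/C++ code.
    """
    sys.stdout.flush()
    sys.stderr.flush()
    saved = os.dup(fd)                        # remember the original target
    devnull = os.open(os.devnull, os.O_WRONLY)
    try:
        os.dup2(devnull, fd)                  # fd now writes to /dev/null
        yield
    finally:
        os.dup2(saved, fd)                    # restore the original target
        os.close(saved)
        os.close(devnull)

# Hypothetical usage (LocalDagRunner/pipeline as in the question):
# with silence_fd(1), silence_fd(2):
#     LocalDagRunner().run(pipeline)
```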

Any help would be greatly appreciated!

1 Answer

Stack Overflow user

Answered on 2021-10-10 17:46:40

Options for distutils/setuptools can be set in setup.cfg. Create a setup.cfg with the following contents:

[global]
quiet = 1

Now running python setup.py test will produce the same output as python setup.py -q test.

In addition, you could consider switching to pytest as the test runner, since it captures all output by default. Example:

setup.py

from setuptools import setup

setup(
    ...,
    setup_requires=['pytest-runner'],
    tests_require=['pytest'],
)

setup.cfg

[global]
quiet = 1

[aliases]
test = pytest

Of course, pytest cannot suppress the TensorFlow logs emitted from its C++ code, so the TF_CPP_MIN_LOG_LEVEL environment variable is still needed.
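
For completeness, the environment tweaks from the question can be grouped into one preamble. This is just a restatement of the question's own settings; as the question notes, they have to run before tensorflow (and, likely, ml_metadata) is imported, since the native libraries read the environment when they load:

```python
import os

# Set BEFORE importing tensorflow / tfx / ml_metadata.
os.environ["TF_CPP_MIN_LOG_LEVEL"] = "3"  # TensorFlow C++ logs: FATAL only
os.environ["GLOG_minloglevel"] = "3"      # glog (ML Metadata) logs: FATAL only
```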

Original content provided by Stack Overflow.
Original link: https://stackoverflow.com/questions/69485127