
AttributeError: module 'functools' has no attribute 'wraps'

Stack Overflow user
Asked on 2017-01-29 21:51:24
Answers 1 · Viewed 2.4K · Followers 0 · Votes 3

When I run the tests of some third-party code with Anaconda 4.2 / Python 3.5, I get the following exception:

Traceback (most recent call last):
  File "pyspark/sql/tests.py", line 25, in <module>
    import subprocess
  File "/home/user/anaconda3/lib/python3.5/subprocess.py", line 364, in <module>
    import signal
  File "/home/user/anaconda3/lib/python3.5/signal.py", line 3, in <module>
    from functools import wraps as _wraps
  File "/home/user/anaconda3/lib/python3.5/functools.py", line 22, in <module>
    from types import MappingProxyType
  File "/home/user/Spark/spark-2.1.0-bin-hadoop2.7/python/pyspark/sql/types.py", line 22, in <module>
    import calendar
  File "/home/user/anaconda3/lib/python3.5/calendar.py", line 10, in <module>
    import locale as _locale
  File "/home/user/anaconda3/lib/python3.5/locale.py", line 108, in <module>
    @functools.wraps(_localeconv)
AttributeError: module 'functools' has no attribute 'wraps'

Normally I would assume that some module is shadowing a built-in one, but as far as I can tell that is not the problem here:

  • I logged the module path (functools.__file__) from inside the tests, and it yields the expected path. Also, I see nothing odd in the paths shown in the exception.
  • To rule out a corrupted installation, I tested against a completely fresh Anaconda install.
  • When I run the tests from an IPython shell (%run pyspark/sql/tests.py) with the same configuration and paths, the problem goes away.
  • functools.wraps can be imported in a shell started in the same directory, with the same configuration.
  • When I replace the Python 3 environment with Python 2, the problem goes away.
  • Environments created with virtualenv do not reproduce the problem.
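The first check above can be reproduced in a couple of lines; this sketch simply prints where Python actually resolved functools from and which entries make up the module search path (nothing here is Spark-specific):

```python
import functools
import sys

# The file the module was actually loaded from; if some project directory
# shadowed the stdlib, this path would point outside the standard library.
print(functools.__file__)

# Entries earlier in sys.path win the lookup, so a directory inserted
# before the stdlib can shadow a standard module of the same name.
for entry in sys.path:
    print(entry)
```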

With a different version of the same project I get:

Traceback (most recent call last):
  File "pyspark/sql/tests.py", line 25, in <module>
    import pydoc
  File "/home/user/anaconda3/lib/python3.5/pydoc.py", line 55, in <module>
    import importlib._bootstrap
  File "/home/user/anaconda3/lib/python3.5/importlib/__init__.py", line 57, in <module>
    import types
  File "/home/user/Spark/spark-1.6.3-bin-hadoop2.6/python/pyspark/sql/types.py", line 22, in <module>
    import calendar
  File "/home/user/anaconda3/lib/python3.5/calendar.py", line 10, in <module>
    import locale as _locale
  File "/home/user/anaconda3/lib/python3.5/locale.py", line 19, in <module>
    import functools
  File "/home/user/anaconda3/lib/python3.5/functools.py", line 22, in <module>
    from types import MappingProxyType
ImportError: cannot import name 'MappingProxyType'

Is there anything obvious that I am missing?

Edit

A Dockerfile that can be used to reproduce the problem:

FROM debian:latest

RUN apt-get update
RUN apt-get install -y wget bzip2
RUN wget https://repo.continuum.io/archive/Anaconda3-4.2.0-Linux-x86_64.sh
RUN bash Anaconda3-4.2.0-Linux-x86_64.sh -b -p /anaconda3
RUN wget ftp://ftp.piotrkosoft.net/pub/mirrors/ftp.apache.org/spark/spark-2.1.0/spark-2.1.0-bin-hadoop2.7.tgz
RUN tar xf spark-2.1.0-bin-hadoop2.7.tgz
ENV PATH /anaconda3/bin:$PATH
ENV SPARK_HOME /spark-2.1.0-bin-hadoop2.7
ENV PYTHONPATH $PYTHONPATH:$SPARK_HOME/python/lib/py4j-0.10.4-src.zip:$SPARK_HOME/python
WORKDIR /spark-2.1.0-bin-hadoop2.7
RUN python python/pyspark/sql/tests.py

1 Answer

Stack Overflow user

Accepted answer

Posted on 2017-01-30 06:09:31

I suspect this happens because Python 3's functools module contains the import from types import MappingProxyType, and instead of resolving this module from ${CONDA_PREFIX}/lib/python3.5/types.py, Python tries to import it from the sql directory: ${SPARK_HOME}/python/pyspark/sql/types.py. Python 2's functools module does not have this import, so it does not raise the error.
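The mechanism can be reproduced in miniature, without Spark. When a script is run directly, Python prepends the script's own directory to sys.path, which is presumably how pyspark/sql/types.py gets in front of the stdlib here (and why running from an IPython shell behaves differently). This sketch stands in a temporary directory with a bogus types.py for the role ${SPARK_HOME}/python/pyspark/sql plays (the directory and file are illustrative, not part of the original setup):

```python
import os
import sys
import tempfile

# A directory containing a bogus types.py, standing in for
# ${SPARK_HOME}/python/pyspark/sql as seen from calendar.py.
shadow_dir = tempfile.mkdtemp()
with open(os.path.join(shadow_dir, 'types.py'), 'w') as f:
    f.write('# shadow module: defines no MappingProxyType\n')

sys.path.insert(0, shadow_dir)               # like a bad PYTHONPATH entry
real_types = sys.modules.pop('types', None)  # force a fresh import lookup

try:
    from types import MappingProxyType       # the import functools.py makes
    shadowed = False
except ImportError:
    shadowed = True

# Restore the genuine stdlib module so later imports keep working.
if real_types is not None:
    sys.modules['types'] = real_types
sys.path.remove(shadow_dir)

print('stdlib types shadowed:', shadowed)
```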

The way around this is to import the required types module first, before invoking the script. As a proof of concept:

(root) ~/condaexpts$ PYTHONPATH=$SPARK_HOME/python/lib/py4j-0.10.4-src.zip:$SPARK_HOME/python python
Python 3.5.2 |Anaconda 4.2.0 (64-bit)| (default, Jul  2 2016, 17:53:06) 
[GCC 4.4.7 20120313 (Red Hat 4.4.7-1)] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import types
>>> import os
>>> sqltests=os.environ['SPARK_HOME'] + '/python/pyspark/sql/tests.py'
>>> exec(open(sqltests).read())
.....Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/01/30 05:59:43 WARN SparkContext: Support for Java 7 is deprecated as of Spark 2.0.0
17/01/30 05:59:44 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

...

----------------------------------------------------------------------
Ran 128 tests in 372.565s

Also note that there is nothing conda-specific here; the same behavior can be seen in a plain virtual environment (with python3):

~/condaexpts$ virtualenv -p python3 venv
Running virtualenv with interpreter /usr/bin/python3
Using base prefix '/usr'
New python executable in venv/bin/python3
Also creating executable in venv/bin/python
Installing setuptools, pip...done.

~/condaexpts$ source venv/bin/activate

(venv)~/condaexpts$ python --version
Python 3.4.3

(venv)~/condaexpts$ python $WORKDIR/python/pyspark/sql/tests.py                                                                                                                                      
Traceback (most recent call last):
  File "/home/ubuntu/condaexpts/spark-2.1.0-bin-hadoop2.7/python/pyspark/sql/tests.py", line 26, in <module>
    import pydoc
  File "/usr/lib/python3.4/pydoc.py", line 59, in <module>
    import importlib._bootstrap
  File "/home/ubuntu/condaexpts/venv/lib/python3.4/importlib/__init__.py", line 40, in <module>
    import types
  File "/home/ubuntu/condaexpts/spark-2.1.0-bin-hadoop2.7/python/pyspark/sql/types.py", line 22, in <module>
    import calendar
  File "/usr/lib/python3.4/calendar.py", line 10, in <module>
    import locale as _locale
  File "/home/ubuntu/condaexpts/venv/lib/python3.4/locale.py", line 20, in <module>
    import functools
  File "/home/ubuntu/condaexpts/venv/lib/python3.4/functools.py", line 22, in <module>
    from types import MappingProxyType
ImportError: cannot import name 'MappingProxyType'
Votes 4
Original content provided by Stack Overflow.
Original link: https://stackoverflow.com/questions/41926351