I have installed gensim in Python (via pip). After the installation finished, I get the following warning:
C:\Python27\lib\site-packages\gensim\utils.py:855: UserWarning: detected Windows; aliasing chunkize to chunkize_serial warnings.warn("detected Windows; aliasing chunkize to chunkize_serial")
How can I fix this?
Because of this warning I cannot import word2vec from gensim.models.
My configuration: Python 2.7, gensim-0.13.4.1, numpy-1.11.3, scipy-0.18.1, pattern-2.6.
Posted 2017-01-31 06:49:49
You can suppress the message with this code before importing gensim:
import warnings
warnings.filterwarnings(action='ignore', category=UserWarning, module='gensim')
import gensim
Posted 2017-01-15 07:03:35
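As background, here is a minimal, self-contained sketch of how such a warning filter works: an 'ignore' filter for UserWarning swallows the warning before it is shown. (This uses catch_warnings so the demo is standalone; it does not reproduce the module='gensim' scoping, which additionally restricts the filter to warnings raised from modules matching that name.)

```python
import warnings

def emit_warning():
    # Same category and text as the gensim message.
    warnings.warn("detected Windows; aliasing chunkize to chunkize_serial",
                  UserWarning)

# With no 'ignore' filter the warning is delivered and recorded...
with warnings.catch_warnings(record=True) as before:
    warnings.simplefilter('always')
    emit_warning()

# ...while an 'ignore' filter for UserWarning swallows it.
with warnings.catch_warnings(record=True) as after:
    warnings.simplefilter('ignore', UserWarning)
    emit_warning()

print(len(before), len(after))  # 1 0
```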
I don't think it's a big problem. Gensim is just letting you know that it will alias chunkize to a different function because you are using a particular OS.
Look at this code from gensim.utils:
if os.name == 'nt':
    logger.info("detected Windows; aliasing chunkize to chunkize_serial")

    def chunkize(corpus, chunksize, maxsize=0, as_numpy=False):
        for chunk in chunkize_serial(corpus, chunksize, as_numpy=as_numpy):
            yield chunk
else:
    def chunkize(corpus, chunksize, maxsize=0, as_numpy=False):
        """
        Split a stream of values into smaller chunks.
        Each chunk is of length `chunksize`, except the last one which may be smaller.
        A once-only input stream (`corpus` from a generator) is ok, chunking is done
        efficiently via itertools.

        If `maxsize > 1`, don't wait idly in between successive chunk `yields`, but
        rather keep filling a short queue (of size at most `maxsize`) with forthcoming
        chunks in advance. This is realized by starting a separate process, and is
        meant to reduce I/O delays, which can be significant when `corpus` comes
        from a slow medium (like harddisk).

        If `maxsize==0`, don't fool around with parallelism and simply yield the chunksize
        via `chunkize_serial()` (no I/O optimizations).

        >>> for chunk in chunkize(range(10), 4): print(chunk)
        [0, 1, 2, 3]
        [4, 5, 6, 7]
        [8, 9]
        """
        assert chunksize > 0

        if maxsize > 0:
            q = multiprocessing.Queue(maxsize=maxsize)
            worker = InputQueue(q, corpus, chunksize, maxsize=maxsize, as_numpy=as_numpy)
            worker.daemon = True
            worker.start()
            while True:
                chunk = [q.get(block=True)]
                if chunk[0] is None:
                    break
                yield chunk.pop()
        else:
            for chunk in chunkize_serial(corpus, chunksize, as_numpy=as_numpy):
                yield chunk

https://stackoverflow.com/questions/41658568
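The serial path you end up on under Windows boils down to splitting an iterator into fixed-size lists. A minimal standalone sketch of that idea (not gensim's actual chunkize_serial, which also supports numpy conversion):

```python
from itertools import islice

def chunkize_serial_sketch(stream, chunksize):
    """Yield successive lists of up to `chunksize` items.
    Works even when `stream` is a once-only generator."""
    it = iter(stream)
    while True:
        chunk = list(islice(it, chunksize))
        if not chunk:  # iterator exhausted
            return
        yield chunk

chunks = list(chunkize_serial_sketch(iter(range(10)), 4))
print(chunks)  # [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```

This matches the behavior shown in the docstring example above: every chunk has length `chunksize` except possibly the last.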