
Scrapy, Scrapinghub and Google Cloud Storage: KeyError 'gs' when running spiders on Scrapinghub

Stack Overflow user
Asked on 2018-02-22 10:35:47
1 answer · 1.1K views · 0 followers · 1 vote

I am working on a Scrapy project using Python 3 and deploying the spiders to Scrapinghub. I am also using Google Cloud Storage to store the scraped files, as mentioned here in the official documentation.

The spiders run perfectly fine when I run them locally, and they deploy to Scrapinghub without any errors. I am using scrapy:1.4-py3 as the Scrapinghub stack. When running the spiders there, I get the following error:

Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/twisted/internet/defer.py", line 1386, in _inlineCallbacks
    result = g.send(result)
  File "/usr/local/lib/python3.6/site-packages/scrapy/crawler.py", line 77, in crawl
    self.engine = self._create_engine()
  File "/usr/local/lib/python3.6/site-packages/scrapy/crawler.py", line 102, in _create_engine
    return ExecutionEngine(self, lambda _: self.stop())
  File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 70, in __init__
    self.scraper = Scraper(crawler)
  File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 71, in __init__
    self.itemproc = itemproc_cls.from_crawler(crawler)
  File "/usr/local/lib/python3.6/site-packages/scrapy/middleware.py", line 58, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "/usr/local/lib/python3.6/site-packages/scrapy/middleware.py", line 36, in from_settings
    mw = mwcls.from_crawler(crawler)
  File "/usr/local/lib/python3.6/site-packages/scrapy/pipelines/media.py", line 68, in from_crawler
    pipe = cls.from_settings(crawler.settings)
  File "/usr/local/lib/python3.6/site-packages/scrapy/pipelines/images.py", line 95, in from_settings
    return cls(store_uri, settings=settings)
  File "/usr/local/lib/python3.6/site-packages/scrapy/pipelines/images.py", line 52, in __init__
    download_func=download_func)
  File "/usr/local/lib/python3.6/site-packages/scrapy/pipelines/files.py", line 234, in __init__
    self.store = self._get_store(store_uri)
  File "/usr/local/lib/python3.6/site-packages/scrapy/pipelines/files.py", line 269, in _get_store
    store_cls = self.STORE_SCHEMES[scheme]
KeyError: 'gs'

PS: 'gs' comes from the path used for storing the files, as follows:

'IMAGES_STORE':'gs://<bucket-name>/'
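The traceback shows the failure happening inside `_get_store`, where Scrapy picks a storage backend by looking the URI scheme up in a `STORE_SCHEMES` dict. A minimal sketch of that dispatch (class names and the helper below are illustrative, not Scrapy's actual implementation) reproduces the same failure mode: a scheme with no registered backend raises `KeyError`.

```python
from urllib.parse import urlparse

# Stand-in backend classes; in Scrapy these are real storage implementations.
class FSFilesStore:
    pass

class S3FilesStore:
    pass

# Scrapy 1.4's scheme table has no 'gs' entry; the GCS backend arrived later.
STORE_SCHEMES = {
    'file': FSFilesStore,
    's3': S3FilesStore,
}

def get_store(uri):
    # Local paths have no scheme; everything else is dispatched by its scheme.
    scheme = 'file' if '://' not in uri else urlparse(uri).scheme
    return STORE_SCHEMES[scheme]  # raises KeyError for an unknown scheme

try:
    get_store('gs://<bucket-name>/')
except KeyError as err:
    print('KeyError:', err)  # prints: KeyError: 'gs'
```

So the error is not a misconfiguration of the bucket URI itself; the running Scrapy version simply has no backend registered for the `gs` scheme.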

I have researched this error but have not found any solution. Any help would be greatly appreciated.


1 Answer

Stack Overflow user

Accepted answer

Posted on 2018-02-22 22:39:23

Google Cloud Storage support is a new feature in Scrapy 1.5, so you need to use the scrapy:1.5-py3 stack on Scrapinghub.
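Once the project targets Scrapy 1.5, the settings from the question should work as-is. A hedged sketch of the relevant `settings.py` entries (the bucket placeholder is from the question; the `GCS_PROJECT_ID` value is hypothetical):

```python
# settings.py sketch for Scrapy >= 1.5, which ships a GCS storage backend.
ITEM_PIPELINES = {
    'scrapy.pipelines.images.ImagesPipeline': 1,
}

# Same URI as in the question; <bucket-name> is a placeholder.
IMAGES_STORE = 'gs://<bucket-name>/'

# Scrapy 1.5's GCS backend also reads the Google Cloud project id.
GCS_PROJECT_ID = 'my-gcp-project'  # hypothetical value
```

On Scrapinghub, the stack is chosen in the project's `scrapinghub.yml` (e.g. a `stacks: default: scrapy:1.5-py3` entry, assuming the standard stack-selection syntax), and the stack's image must have the `google-cloud-storage` client available.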

6 votes
Original page content provided by Stack Overflow.
Original link:

https://stackoverflow.com/questions/48925215
