I'm getting an error when running the first tutorial.
Scrapy : 0.22.2
lxml : 3.3.5.0
libxml2 : 2.7.8
Twisted : 12.0.0
Python : 2.7.2 (default, Oct 11 2012, 20:14:37) - GCC 4.2.1 Compatible Apple Clang 4.0 (tags/Apple/clang-418.0.60)
Platform: Darwin-12.5.0-x86_64-i386-64bit
Here is my items.py file:
from scrapy.item import Item, Field

class DmozItem(Item):
    title = Field()
    link = Field()
    desc = Field()

And my dmoz_spider.py file:

from scrapy.spider import BaseSpider
class DmozSpider(BaseSpider):
    name = "dmoz"
    allowed_domains = ["dmoz.org"]
    start_urls = [
        "http://www.dmoz.org/Computers/Programming/Languages/Python/Books/",
        "http://www.dmoz.org/Computers/Programming/Languages/Python/Resources/"
    ]

    def parse(self, response):
        filename = response.url.split("/")[-2]
        open(filename, 'wb').write(response.body)

And here is the error message when running "scrapy crawl dmoz":
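As a side note on what parse() is doing: the filename is simply the second-to-last "/"-separated segment of the URL. A minimal standalone sketch of that logic (no Scrapy needed):

```python
# Standalone sketch of the filename logic from parse(): the second-to-last
# "/"-separated segment of each URL becomes the output filename.
urls = [
    "http://www.dmoz.org/Computers/Programming/Languages/Python/Books/",
    "http://www.dmoz.org/Computers/Programming/Languages/Python/Resources/",
]
filenames = [url.split("/")[-2] for url in urls]
print(filenames)  # ['Books', 'Resources']
```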
foolios-imac-2:tutorial foolio$ scrapy crawl dmoz
/usr/local/share/tutorial/tutorial/spiders/dmoz_spider.py:3: ScrapyDeprecationWarning: tutorial.spiders.dmoz_spider.DmozSpider inherits from deprecated class scrapy.spider.BaseSpider, please inherit from scrapy.spider.Spider. (warning only on first subclass, there may be others)
  class DmozSpider(BaseSpider):
2014-06-19 14:53:00-0500 [scrapy] INFO: Scrapy 0.22.2 started (bot: tutorial)
2014-06-19 14:53:00-0500 [scrapy] INFO: Optional features available: ssl, http11
2014-06-19 14:53:00-0500 [scrapy] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'tutorial.spiders', 'SPIDER_MODULES': 'tutorial.spiders', 'BOT_NAME': 'tutorial'}
2014-06-19 14:53:00-0500 [scrapy] INFO: Enabled extensions: LogStats, TelnetConsole, CloseSpider, WebService, CoreStats, SpiderState
Traceback (most recent call last):
  File "/usr/local/bin/scrapy", line 5, in <module>
    pkg_resources.run_script('scrapy==0.22.2', 'scrapy')
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/Extras/lib/python/pkg_resources.py", line 489, in run_script
    self.require(requires).run_script(script_name, ns)
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/Extras/lib/python/pkg_resources.py", line 1207, in run_script
    execfile(script_filename, namespace, namespace)
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/EGG-INFO/scripts/scrapy", line 4, in <module>
    execute()
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/cmdline.py", line 143, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/cmdline.py", line 89, in _run_print_help
    func(*a, **kw)
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/cmdline.py", line 150, in _run_command
    cmd.run(args, opts)
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/commands/crawl.py", line 50, in run
    self.crawler_process.start()
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/crawler.py", line 92, in start
    if self.start_crawling():
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/crawler.py", line 124, in start_crawling
    return self._start_crawler() is not None
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/crawler.py", line 139, in _start_crawler
    crawler.configure()
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/crawler.py", line 47, in configure
    self.engine = ExecutionEngine(self, self._spider_closed)
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/core/engine.py", line 63, in __init__
    self.downloader = Downloader(crawler)
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/core/downloader/__init__.py", line 73, in __init__
    self.handlers = DownloadHandlers(crawler)
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/core/downloader/handlers/__init__.py", line 18, in __init__
    cls = load_object(clspath)
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/utils/misc.py", line 40, in load_object
    mod = import_module(module)
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/importlib/__init__.py", line 37, in import_module
    __import__(name)
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/core/downloader/handlers/s3.py", line 4, in <module>
    from .http import HTTPDownloadHandler
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/core/downloader/handlers/http.py", line 5, in <module>
    from .http11 import HTTP11DownloadHandler as HTTPDownloadHandler
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/core/downloader/handlers/http11.py", line 15, in <module>
    from scrapy.xlib.tx import Agent, ProxyAgent, ResponseDone, \
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/xlib/tx/__init__.py", line 6, in <module>
    from . import client, endpoints
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/xlib/tx/client.py", line 37, in <module>
    from .endpoints import TCP4ClientEndpoint, SSL4ClientEndpoint
  File "/Library/Python/2.7/site-packages/Scrapy-0.22.2-py2.7.egg/scrapy/xlib/tx/endpoints.py", line 222, in <module>
    interfaces.IProcessTransport, '_process')):
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/Extras/lib/python/zope/interface/declarations.py", line 495, in __call__
    raise TypeError("Can't use implementer with classes. Use one of the class-declaration functions instead.")
TypeError: Can't use implementer with classes. Use one of the class-declaration functions instead.
Posted on 2014-06-19 20:34:37
Try upgrading zope.interface, then run your code again:

sudo pip install --upgrade zope.interface

or

sudo easy_install --upgrade zope.interface
https://stackoverflow.com/questions/24315430