
Scrapy log not written to log file

Stack Overflow user
Asked 2013-09-30 19:47:38
1 answer · 1.9K views · 0 followers · Score: 1

I have a spider class derived from BaseSpider. I call self.log, but nothing is written to the log file. I configured the log file on the command line with LOG_FILE and LOG_LEVEL, but the spider's log output is not being written to that file. How can I get the spider log into a regular log file?
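For reference, the same two settings can also live in the project's settings.py instead of being passed on the command line (a minimal sketch; the file name 'spider.log' is an arbitrary choice):

```python
# settings.py (Scrapy project settings module)

# Write all log output to this file instead of the console.
LOG_FILE = 'spider.log'

# Capture everything from DEBUG upward, including self.log() calls.
LOG_LEVEL = 'DEBUG'
```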


1 Answer

Stack Overflow user

Answered 2013-10-01 11:01:01

Are you sure your callback is being called?

Because with this simple spider in a file example.py:

from scrapy.spider import BaseSpider

class ExampleSpider(BaseSpider):
    name = "example"
    start_urls = ['http://www.example.com/']

    def parse(self, response):
        self.log('************* my log ***********')

and running it with scrapy runspider example.py --set LOG_FILE=logfile, these are the contents of the file:

2013-09-30 22:55:12-0400 [scrapy] INFO: Scrapy 0.16.5 started (bot: mybot)
2013-09-30 22:55:12-0400 [scrapy] DEBUG: Enabled extensions: LogStats, TelnetConsole, CloseSpider, WebService, CoreStats, SpiderState
2013-09-30 21:55:12-0500 [scrapy] DEBUG: Enabled downloader middlewares: HttpAuthMiddleware, DownloadTimeoutMiddleware, UserAgentMiddleware, RetryMiddleware, DefaultHeadersMiddleware, RedirectMiddleware, CookiesMiddleware, HttpCompressionMiddleware, ChunkedTransferMiddleware, DownloaderStats
2013-09-30 21:55:12-0500 [scrapy] DEBUG: Enabled spider middlewares: HttpErrorMiddleware, OffsiteMiddleware, RefererMiddleware, UrlLengthMiddleware, DepthMiddleware
2013-09-30 21:55:12-0500 [scrapy] DEBUG: Enabled item pipelines: MybotPipeline
2013-09-30 21:55:12-0500 [example] INFO: Spider opened
2013-09-30 21:55:12-0500 [example] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2013-09-30 21:55:12-0500 [scrapy] DEBUG: Telnet console listening on 0.0.0.0:6023
2013-09-30 21:55:12-0500 [scrapy] DEBUG: Web service listening on 0.0.0.0:6080
2013-09-30 21:55:13-0500 [example] DEBUG: Crawled (200) <GET http://www.example.com/> (referer: None)
2013-09-30 21:55:13-0500 [example] DEBUG: ************* my log ***********
2013-09-30 21:55:13-0500 [example] INFO: Closing spider (finished)
2013-09-30 21:55:13-0500 [example] INFO: Dumping Scrapy stats:
    {'downloader/request_bytes': 221,
     'downloader/request_count': 1,
     'downloader/request_method_count/GET': 1,
     'downloader/response_bytes': 1611,
     'downloader/response_count': 1,
     'downloader/response_status_count/200': 1,
     'finish_reason': 'finished',
     'finish_time': datetime.datetime(2013, 10, 1, 2, 55, 13, 315807),
     'log_count/DEBUG': 8,
     'log_count/INFO': 4,
     'response_received_count': 1,
     'scheduler/dequeued': 1,
     'scheduler/dequeued/memory': 1,
     'scheduler/enqueued': 1,
     'scheduler/enqueued/memory': 1,
     'start_time': datetime.datetime(2013, 10, 1, 2, 55, 12, 991150)}
2013-09-30 21:55:13-0500 [example] INFO: Spider closed (finished)

Try adding a failure to the callback to make sure it is being called, something as simple as raise Exception. If you don't see the exception when you run the spider, then your callback is not being called.
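That sanity check can be sketched like this (the scrapy base class is stubbed out so the snippet runs standalone; in the real spider you would only change the body of parse, and remove the raise once you have confirmed the callback fires):

```python
class ExampleSpider:
    # Stand-in for the BaseSpider subclass above; the base class is
    # omitted so this snippet is self-contained.
    name = "example"

    def parse(self, response):
        # Temporary sanity check: if this exception never shows up in
        # the crawl output or log file, parse() is not being invoked.
        raise Exception('parse() was called')

# Invoking the callback directly shows the exception fires as expected:
try:
    ExampleSpider().parse(None)
    callback_fired = False
except Exception as exc:
    callback_fired = (str(exc) == 'parse() was called')
```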

Score: 3
Original content provided by Stack Overflow; translation supported by Tencent Cloud's IT-domain engine.
Original link: https://stackoverflow.com/questions/19093435