As you know, when using Scrapy-Splash together with Crawlera, we use this Lua script:
function use_crawlera(splash)
    -- Make sure you pass your Crawlera API key in the 'crawlera_user' arg.
    -- Have a look at the file spiders/quotes-js.py to see how to do it.
    -- Find your Crawlera credentials in https://app.scrapinghub.com/
    local user = splash.args.crawlera_user
    local host = 'proxy.crawlera.com'
    local port = 8010
    local session_header = 'X-Crawlera-Session'
    local session_id = 'create'

    splash:on_request(function (request)
        request:set_header('X-Crawlera-Cookies', 'disable')
        request:set_header(session_header, session_id)
        request:set_proxy{host, port, username=user, password=''}
    end)

    splash:on_response_headers(function (response)
        -- Note: type() always returns a string, so the original check
        -- `type(...) ~= nil` was always true; compare the header itself to nil.
        if response.headers[session_header] ~= nil then
            session_id = response.headers[session_header]
        end
    end)
end
function main(splash)
    use_crawlera(splash)
    splash:init_cookies(splash.args.cookies)
    assert(splash:go{
        splash.args.url,
        headers=splash.args.headers,
        http_method=splash.args.http_method,
    })
    assert(splash:wait(3))
    return {
        html = splash:html(),
        cookies = splash:get_cookies(),
    }
end

This Lua script holds a session_id variable that I really need, but how can I access it from the Scrapy response?
I have tried response.session_id and response.headers['X-Crawlera-Session'], but neither works.
Posted on 2019-07-05 21:41:13

Posted on 2019-07-07 00:59:30
In your Lua script, return the HAR log as well:

return {
    html = splash:html(),
    har = splash:har(),
    cookies = splash:get_cookies(),
}

and make sure the execute endpoint is set on your request: meta['splash']['endpoint'] = 'execute'.
If you use a plain scrapy.Request, render.json is the default endpoint, but for scrapy_splash.SplashRequest the default endpoint is render.html. See these two examples of how to set the endpoint: https://github.com/scrapy-plugins/scrapy-splash#requests
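As a sketch of the endpoint setup described above, here is one way to build the splash meta dict for a plain scrapy.Request. The helper name build_splash_meta is hypothetical, and the Lua source string is a placeholder for the script shown earlier; only the 'endpoint' and 'args' keys follow the scrapy-splash convention referenced in the answer.

```python
# Hypothetical helper: assemble the meta dict that routes a plain
# scrapy.Request through scrapy-splash's 'execute' endpoint, so that
# the Lua script's return value (including splash:har()) is delivered
# back in the response body.
def build_splash_meta(lua_source, crawlera_user):
    return {
        "splash": {
            "endpoint": "execute",  # override the render.json default
            "args": {
                "lua_source": lua_source,
                "crawlera_user": crawlera_user,
            },
        }
    }

meta = build_splash_meta("-- lua script from above --", "<your API key>")
print(meta["splash"]["endpoint"])  # → execute
```

With scrapy_splash.SplashRequest you would instead pass endpoint='execute' directly as a constructor argument.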
Then, in your parse callback, you can read the X-Crawlera-Session header out of the HAR:

def parse(self, response):
    headers = json.loads(response.text)['har']['log']['entries'][0]['response']['headers']
    session_id = next(x for x in headers if x['name'] == 'X-Crawlera-Session')['value']

For example:

>>> headers = json.loads(response.text)['har']['log']['entries'][0]['response']['headers']
>>> next(x for x in headers if x['name'] == 'X-Crawlera-Session')
{u'name': u'X-Crawlera-Session', u'value': u'2124641382'}

https://stackoverflow.com/questions/53502649
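To make the HAR lookup above concrete, here is a minimal self-contained sketch that runs the same extraction against a hand-built HAR fragment. The function name extract_crawlera_session is an assumption of mine, and the session value in the sample data is illustrative, taken from the console output above.

```python
import json

def extract_crawlera_session(response_text):
    """Pull the X-Crawlera-Session value out of the HAR JSON returned
    by Splash's execute endpoint (hypothetical helper, not part of Scrapy)."""
    har = json.loads(response_text)["har"]
    headers = har["log"]["entries"][0]["response"]["headers"]
    # HAR stores headers as a list of {"name": ..., "value": ...} dicts
    return next(h["value"] for h in headers if h["name"] == "X-Crawlera-Session")

# Hand-built HAR fragment standing in for response.text
sample = json.dumps({
    "har": {"log": {"entries": [{"response": {"headers": [
        {"name": "Content-Type", "value": "text/html"},
        {"name": "X-Crawlera-Session", "value": "2124641382"},
    ]}}]}}
})

print(extract_crawlera_session(sample))  # → 2124641382
```

You could then send that value back in an X-Crawlera-Session request header (or the crawlera_user args) on follow-up requests to keep using the same session.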