Here is the pattern I'm using to execute requests concurrently:
import json
import grequests

rs = (grequests.get(url) for url in urls)
res_items = grequests.map(rs)
for num, res in enumerate(res_items):
    json_data = json.loads(res.text)

However, roughly every 5,000 requests this crashes with the error ConnectionError: HTTPConnectionPool(host='apicache.vudu.com', port=80): Max retries exceeded with url:. Is there a more reliable pattern for doing the above -- for example, retrying a URL up to five times if a single request fails?
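The "retry up to five times" idea can be sketched independently of grequests with a plain retry wrapper (a minimal illustration, not part of any library: retry() and flaky() are hypothetical names, and flaky() just simulates two failures before succeeding):

```python
def retry(fn, max_tries=5):
    """Call fn(); if it raises, try again up to max_tries times total,
    re-raising the last exception if every attempt fails."""
    for attempt in range(1, max_tries + 1):
        try:
            return fn()
        except Exception:
            if attempt == max_tries:
                raise

calls = {'n': 0}

def flaky():
    """Simulated unreliable request: fails twice, then succeeds."""
    calls['n'] += 1
    if calls['n'] < 3:
        raise ConnectionError('simulated')
    return 'ok'

result = retry(flaky)  # succeeds on the third attempt
```

The same loop structure works for a real request function; the answer below adds the missing piece, a growing delay between attempts.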
Posted 2015-06-03 03:56:49
Here is one option, using exponential backoff as described here:
import logging
import random
import time

import grequests

log = logging.getLogger(__name__)

def grequester(url, n=1):
    '''
    Google exponential backoff: https://developers.google.com/drive/web/handle-errors?hl=pt-pt
    '''
    MAX_TRIES = 8
    # grequests.get() only builds the request; grequests.map() sends it,
    # and returns None in place of any request that failed.
    res = grequests.map([grequests.get(url)])[0]
    if res is None:
        if n > MAX_TRIES:
            return None
        n += 1
        log.warning('Try #%s for %s...' % (n, url))
        time.sleep((2 ** n) + (random.randint(0, 1000) / 1000.0))  # add 0-1000 ms of jitter
        return grequester(url, n)
    return res

https://stackoverflow.com/questions/30608813
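As a sanity check on the sleep formula, the backoff schedule can be computed in isolation (a standalone sketch with no network calls; backoff_delay is a hypothetical helper name, not part of grequests):

```python
import random

MAX_TRIES = 8

def backoff_delay(n):
    """Delay before retry attempt n: 2**n seconds plus 0-1000 ms of
    jitter, mirroring the time.sleep() formula in grequester()."""
    return (2 ** n) + (random.randint(0, 1000) / 1000.0)

# Retries start at n=2 (n is bumped before sleeping), so the waits run
# roughly 4 s, 8 s, 16 s, ... up to 256 s plus jitter.
delays = [backoff_delay(n) for n in range(2, MAX_TRIES + 1)]
```

The jitter spreads simultaneous retries apart so many clients don't hammer the server again at the exact same instant.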