Here is the command I used:

```
unload ('select * from SPEC_BFO.CASE_HISTORY
         where (INTEGRATION_ID, LAST_OPERATION_DATE) IN
               (select INTEGRATION_ID, max(LAST_OPERATION_DATE)
                from SPEC_BFO.CASE_HISTORY
                group by INTEGRATION_ID)')
to 's3://use-s3-dwnam-qa/NAM/SPEC_BFO/CASE_HISTORY/VIEW_CASE_HISTORY/VIEW_CASE_HISTORY.'
iam_role 'arn:aws:iam::111111111111:role/use-redshift-dwnam-qa'
delimiter '|' PARALLEL OFF header ALLOWOVERWRITE gzip;
```

And I get this error:

```
('There is a problem:', InternalError('S3 Query Exception (Fetch)
DETAIL:
-----------------------------------------------
  error:    S3 Query Exception (Fetch)
  code:     15001
  context:  Request ran out of memory in the S3 query layer.
  query:    8163346
  location: dory_util.cpp:1083
  process:  asyncrequest_thread [pid=112556]
-----------------------------------------------
',))
```

The remote script call failed, and the script will pause.
Posted on 2019-12-19 06:43:43
The query is most likely too complex to combine with the UNLOAD operation.
Try creating the output table first with a CREATE TABLE AS command, then UNLOAD that table via a plain SELECT *.
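The two-step approach above can be sketched as follows, reusing the table, S3 path, and IAM role from the question. The staging table name `VIEW_CASE_HISTORY_STAGE` is an illustrative assumption, not something from the original post:

```sql
-- Step 1: materialize the latest row per INTEGRATION_ID into its own table,
-- so the UNLOAD no longer has to evaluate the subquery itself.
CREATE TABLE SPEC_BFO.VIEW_CASE_HISTORY_STAGE AS
SELECT *
FROM SPEC_BFO.CASE_HISTORY
WHERE (INTEGRATION_ID, LAST_OPERATION_DATE) IN
      (SELECT INTEGRATION_ID, MAX(LAST_OPERATION_DATE)
       FROM SPEC_BFO.CASE_HISTORY
       GROUP BY INTEGRATION_ID);

-- Step 2: UNLOAD a trivial SELECT * from the staging table.
UNLOAD ('SELECT * FROM SPEC_BFO.VIEW_CASE_HISTORY_STAGE')
TO 's3://use-s3-dwnam-qa/NAM/SPEC_BFO/CASE_HISTORY/VIEW_CASE_HISTORY/VIEW_CASE_HISTORY.'
IAM_ROLE 'arn:aws:iam::111111111111:role/use-redshift-dwnam-qa'
DELIMITER '|' PARALLEL OFF HEADER ALLOWOVERWRITE GZIP;

-- Optional cleanup once the unload has succeeded.
DROP TABLE SPEC_BFO.VIEW_CASE_HISTORY_STAGE;
```

This keeps the memory-heavy grouped subquery in the normal query engine and hands the S3 query layer only a flat table scan.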
https://stackoverflow.com/questions/59388632