I am trying to unload data from AWS Redshift into an S3 bucket. The Redshift cluster is additionally password protected. I set up the aws cli with the appropriate key pair and tested it successfully, and I can access the Redshift cluster through DataGrip with my credentials. But when I try to run the unload from Python 3 with the following script:
import json
import os

import psycopg2


def run(config_json, sql_query):
    conn = psycopg2.connect(**config_json['db'])
    cursor = conn.cursor()
    query = """
    UNLOAD ($${}$$)
    to '{}'
    parallel off
    delimiter ','
    allowoverwrite;
    """.format(sql_query, config_json['s3bucket_path_to_file'])
    print("The following UNLOAD query is being run: \n" + query)
    cursor.execute(query)
    print('Completed write to {}'.format(config_json['s3bucket_path_to_file']))


if __name__ == '__main__':
    config_path = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'config.json')
    with open(config_path, 'r') as f:
        config = json.loads(f.read())
    query_path = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'query.sql')
    with open(query_path, 'r') as f:
        query_sql = f.read()
    run(config, query_sql)

I get the following error:
psycopg2.InternalError: invalid CREDENTIALS clause
DETAIL:
-----------------------------------------------
error: invalid CREDENTIALS clause
code: 8001
context:
query: 2820791
location: aws_credentials_parser.cpp:62
process: padbmaster [pid=25330]
-----------------------------------------------

The config.json file has the following format:
{
"db": {
"dbname": "dbname",
"user": "user1",
"host": "someip",
"password": "very secret psw",
"port": "1111"
},
"s3bucket_path_to_file": "s3://bucket-name/path/to/file.csv"
}

Posted on 2018-01-31 08:45:37
You have not specified a credentials section in the query statement. For Redshift to be able to write to your S3 bucket, you need to supply valid credentials for Redshift to use.
You can specify either an iam_role, or an access_key_id and secret_access_key (plus a session_token if you are using temporary credentials).
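As a sketch, the query builder from the question could be extended to include a credentials clause. The role ARN and key values below are placeholders, not values from the question:

```python
def build_unload_query(sql_query, s3_path, iam_role_arn):
    # Redshift needs a credentials clause so it can write to S3.
    # IAM_ROLE is the simplest option when the cluster has a role attached.
    return """
    UNLOAD ($${}$$)
    TO '{}'
    IAM_ROLE '{}'
    PARALLEL OFF
    DELIMITER ','
    ALLOWOVERWRITE;
    """.format(sql_query, s3_path, iam_role_arn)


def build_unload_query_with_keys(sql_query, s3_path, access_key_id, secret_access_key):
    # Alternative: explicit access keys. For temporary credentials,
    # a SESSION_TOKEN '...' line would also be required.
    return """
    UNLOAD ($${}$$)
    TO '{}'
    ACCESS_KEY_ID '{}'
    SECRET_ACCESS_KEY '{}'
    PARALLEL OFF
    DELIMITER ','
    ALLOWOVERWRITE;
    """.format(sql_query, s3_path, access_key_id, secret_access_key)
```

The resulting string can then be passed to `cursor.execute(query)` exactly as in the question's script.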
https://stackoverflow.com/questions/48532619