I am extending the BigQueryTornadoes example from https://github.com/apache/beam so that it writes to AWS S3 as a sink. In my first iteration, I got it working with the following code.
public static void main(String[] args) {
  Options options = PipelineOptionsFactory.fromArgs(args).withValidation().as(Options.class);
  options.setAwsCredentialsProvider(
      new AWSStaticCredentialsProvider(
          new BasicAWSCredentials(options.getAwsAccessKey().get(), options.getAwsSecretKey().get())));
  runBigQueryTornadoes(options);
}

In my second iteration, I want to use STSAssumeRoleSessionCredentialsProvider to support a cross-account IAM role. I have the following code.
public static void main(String[] args) {
  Options options = PipelineOptionsFactory.fromArgs(args).withValidation().as(Options.class);
  AWSCredentialsProvider provider =
      new AWSStaticCredentialsProvider(
          new BasicAWSCredentials(options.getAwsAccessKey().get(), options.getAwsSecretKey().get()));
  AWSSecurityTokenService sts =
      AWSSecurityTokenServiceClientBuilder.standard().withCredentials(provider).build();
  AWSCredentialsProvider credentialsProvider =
      new STSAssumeRoleSessionCredentialsProvider.Builder(options.getAwsRoleArn().get(), options.getAwsRoleSession().get())
          .withExternalId(options.getAwsExternalId().get())
          .withStsClient(sts)
          .build();
  options.setAwsCredentialsProvider(credentialsProvider);
  runBigQueryTornadoes(options);
}

When I run the code above, I get the following exception.
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Unexpected IOException (of type java.io.IOException): Failed to serialize and deserialize property 'awsCredentialsProvider' with value 'com.amazonaws.auth.STSAssumeRoleSessionCredentialsProvider@4edb24da'
at com.fasterxml.jackson.databind.JsonMappingException.fromUnexpectedIOE (JsonMappingException.java:338)
at com.fasterxml.jackson.databind.ObjectMapper.writeValueAsBytes (ObjectMapper.java:3432)
at org.apache.beam.runners.direct.DirectRunner.run (DirectRunner.java:163)
at org.apache.beam.runners.direct.DirectRunner.run (DirectRunner.java:67)
at org.apache.beam.sdk.Pipeline.run (Pipeline.java:317)
at org.apache.beam.sdk.Pipeline.run (Pipeline.java:303)
at org.apache.beam.examples.cookbook.BigQueryTornadoesS3STS.runBigQueryTornadoes (BigQueryTornadoesS3STS.java:251)
at org.apache.beam.examples.cookbook.BigQueryTornadoesS3STS.main (BigQueryTornadoesS3STS.java:267)
at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke (Method.java:498)
at org.codehaus.mojo.exec.ExecJavaMojo$1.run (ExecJavaMojo.java:282)
at java.lang.Thread.run (Thread.java:748)

I run it with the following mvn command.
mvn compile exec:java -Dexec.mainClass=org.apache.beam.examples.cookbook.BigQueryTornadoesS3STS "-Dexec.args=..." -P direct-runner

I saw a similar post, "Beam: Failed to serialize and deserialize property 'awsCredentialsProvider'", but I am hitting this problem without packaging the pipeline into a jar.
Posted on 2020-09-05 03:07:29
The post "I am trying to write to S3 using assumeRole via FileIO with ParquetIO" helped me get my code working. With the code below, I was able to assume a cross-account IAM role and write to an S3 bucket owned by another AWS account.
public static void main(String[] args) {
  Options options = PipelineOptionsFactory.fromArgs(args).withValidation().as(Options.class);
  AWSCredentialsProvider provider =
      new AWSStaticCredentialsProvider(
          new BasicAWSCredentials(options.getAwsAccessKey().get(), options.getAwsSecretKey().get()));
  AWSSecurityTokenService sts =
      AWSSecurityTokenServiceClientBuilder.standard().withCredentials(provider).build();
  STSAssumeRoleSessionCredentialsProvider credentials =
      new STSAssumeRoleSessionCredentialsProvider.Builder(options.getAwsRoleArn().get(), options.getAwsRoleSession().get())
          .withExternalId(options.getAwsExternalId().get())
          .withStsClient(sts)
          .build();
  options.setAwsCredentialsProvider(
      new AWSStaticCredentialsProvider(credentials.getCredentials()));
  runBigQueryTornadoes(options);
}

Note: the code is based on the BigQueryTornadoes example from https://github.com/apache/beam.
https://stackoverflow.com/questions/63737021