Details
- Type: Task
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Affects Version/s: 3.3.1
- Flags: Incompatible change
Description
This task tracks upgrading Hadoop's AWS connector, S3A, from AWS SDK for Java v1 to AWS SDK for Java v2.
Original use case:
We would like to access S3 with AWS SSO, which is supported in software.amazon.awssdk:sdk-core:2.*.
In particular, per https://hadoop.apache.org/docs/stable/hadoop-aws/tools/hadoop-aws/index.html, any class set in 'fs.s3a.aws.credentials.provider' must implement "com.amazonaws.auth.AWSCredentialsProvider". We would like to support "software.amazon.awssdk.auth.credentials.ProfileCredentialsProvider", which supports AWS SSO, so users only need to authenticate once.
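As a sketch of the desired end state, once S3A accepts v2 SDK credential providers, the configuration might look like the following core-site.xml fragment (the property name is real; accepting this v2 class as its value is the change this issue enables):

```xml
<configuration>
  <!-- Point S3A at the AWS SDK v2 profile provider, which can read
       SSO-backed profiles from ~/.aws/config -->
  <property>
    <name>fs.s3a.aws.credentials.provider</name>
    <value>software.amazon.awssdk.auth.credentials.ProfileCredentialsProvider</value>
  </property>
</configuration>
```

With this in place, a user who has already run `aws sso login` for the relevant profile would not need to re-authenticate separately for Hadoop.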
Attachments
Issue Links
- breaks
  - SPARK-38958 Override S3 Client in Spark Write/Read calls (Open)
- causes
  - HBASE-28056 [HBoss] add support for AWS v2 SDK (In Progress)
- is a child of
  - HADOOP-18995 S3A: Upgrade AWS SDK version to 2.21.33 for Amazon S3 Express One Zone support (Resolved)
- is depended upon by
  - HADOOP-18886 S3A: AWS SDK V2 Migration: stabilization and S3Express (Resolved)
  - HADOOP-18477 Über-jira: S3A Hadoop 3.3.9-3.4.1 features (Resolved)
- is related to
  - HADOOP-18352 Support AWS IAM Identity Centre (prev. AWS SSO) for providing credentials to S3A (Open)
  - SPARK-45393 Upgrade Hadoop to 3.4.0 (Resolved)
  - SPARK-44124 Upgrade AWS SDK to v2 (Open)
- relates to
  - HADOOP-18286 S3a: allow custom retry policies (Open)
  - SPARK-48571 Reduce the number of accesses to S3 object storage (Open)