Details
- Type: Bug
- Status: Resolved
- Priority: Minor
- Resolution: Fixed
- Affects Versions: 3.3.0, 3.3.1, 3.3.2, 3.3.3
Description
When using Hadoop and Spark to read/write data from an S3 bucket via a URI such as s3a://bucket/path together with a custom credentials provider, the path is stripped from the s3a URI, and the credentials provider fails because the full path is no longer available.

In Spark 3.2, the S3 client factory was invoked as:

s3 = ReflectionUtils.newInstance(s3ClientFactoryClass, conf)
    .createS3Client(name, bucket, credentials);

But in Spark 3.3.3 it is invoked as:

s3 = ReflectionUtils.newInstance(s3ClientFactoryClass, conf)
    .createS3Client(getUri(), parameters);

and getUri() removes the path from the s3a URI.
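The effect can be sketched in plain Java. The helper below (canonicalizeLikeGetUri is a hypothetical name, not Hadoop's actual code) mimics the canonicalization that keeps only the scheme and bucket authority, which is why a path-sensitive credentials provider no longer sees s3a://bucket/path:

```java
import java.net.URI;

public class UriPathStripDemo {
    // Illustrative sketch: keep only scheme and authority (the bucket),
    // dropping any path component, similar to the filesystem URI that
    // getUri() hands to the S3 client factory.
    static URI canonicalizeLikeGetUri(URI full) {
        return URI.create(full.getScheme() + "://" + full.getAuthority());
    }

    public static void main(String[] args) {
        URI full = URI.create("s3a://bucket/path");
        URI stripped = canonicalizeLikeGetUri(full);
        System.out.println(full);     // full URI, path intact
        System.out.println(stripped); // path component is gone
    }
}
```

A credentials provider that derives scope or role from the object path would receive the stripped URI and fail to resolve its configuration.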
Issue Links
- is broken by: HADOOP-14556 S3A to support Delegation Tokens (Resolved)
- links to