Details
- Type: New Feature
- Status: Open
- Priority: Major
- Resolution: Unresolved
- Affects Version/s: 3.2.1
- Fix Version/s: None
Description
Hello,
I have been working on using Spark to read and write data to S3. Unfortunately, there are a few S3 headers that I need to add to my Spark read/write calls. After much searching, I have not found a way to replace the S3 client that Spark uses for these calls, nor a configuration option that lets me pass in S3 headers. Here is a list of some common S3 request headers: https://docs.aws.amazon.com/AmazonS3/latest/API/RESTCommonRequestHeaders.html. Is there existing functionality to add S3 headers to Spark read/write calls, or to pass in a custom client that would attach these headers on every read/write request? I appreciate any help and feedback.
Thanks,
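For context, S3A connector options normally reach Spark's reads and writes through the "spark.hadoop." prefix on Spark configuration keys; a minimal sketch of that wiring is below. The bucket paths are placeholders, and the fs.s3a.client.header.* key shown for a custom header is hypothetical: no such option exists at the time of writing, which is exactly what this request (and HADOOP-18562) asks for. The x-amz-expected-bucket-owner header is one of the common request headers from the linked AWS documentation.

import org.apache.spark.sql.SparkSession

// Minimal sketch: Hadoop S3A options are passed to the connector via the
// "spark.hadoop." prefix on Spark configuration keys.
val spark = SparkSession.builder()
  .appName("s3a-header-sketch")
  // A real, documented S3A option is passed like this:
  .config("spark.hadoop.fs.s3a.endpoint", "s3.us-east-1.amazonaws.com")
  // HYPOTHETICAL key: a per-header option like the one below does not exist today;
  // HADOOP-18562 tracks adding custom S3/STS header support to S3A.
  .config("spark.hadoop.fs.s3a.client.header.x-amz-expected-bucket-owner", "123456789012")
  .getOrCreate()

// Reads and writes then go through the S3A client configured above.
val df = spark.read.parquet("s3a://my-bucket/input/")
df.write.parquet("s3a://my-bucket/output/")

If such an option were supported, every S3 request issued by the S3A client underneath these read/write calls would carry the configured headers, which is the behavior being requested here.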
Attachments
Issue Links
- duplicates: HADOOP-18562 S3A: support custom S3 and STS headers (Open)
- is broken by: HADOOP-18073 S3A: Upgrade AWS SDK to V2 (Resolved)
- links to