Description
We are trying to write to an S3 bucket whose policy restricts access to specific IAM users, requires SSE, and constrains the endpoint. The policy references two endpoints: a gateway endpoint and an interface endpoint.
When we use the gateway endpoint, i.e. the general https://s3.us-east-1.amazonaws.com, the Spark code executes successfully and writes to the S3 bucket.
But when we use the interface endpoint (which is the one we are supposed to use): https://bucket.vpce-<>.s3.us-east-1.vpce.amazonaws.com, the Spark code throws the following error:
py4j.protocol.Py4JJavaError: An error occurred while calling o91.save.
: org.apache.hadoop.fs.s3a.AWSBadRequestException: doesBucketExist on <BUCKET NAME>: com.amazonaws.services.s3.model.AmazonS3Exception: Bad Request (Service: Amazon S3; Status Code: 400; Error Code: 400 Bad Request; Request ID: BA67GFNR0Q127VFM; S3 Extended Request ID: BopO6Cn1hNzXdWh89hZlnl/QyTJef/1cxmptuP6f4yH7tqfMO36s/7mF+q8v6L5+FmYHXbFdEss=; Proxy: null), S3 Extended Request ID: BopO6Cn1hNzXdWh89hZlnl/QyTJef/1cxmptuP6f4yH7tqfMO36s/7mF+q8v6L5+FmYHXbFdEss=:400 Bad Request: Bad Request (Service: Amazon S3; Status Code: 400; Error Code: 400 Bad Request; Request ID: BA67GFNR0Q127VFM; S3 Extended Request ID: BopO6Cn1hNzXdWh89hZlnl/QyTJef/1cxmptuP6f4yH7tqfMO36s/7mF+q8v6L5+FmYHXbFdEss=; Proxy: null)
Attaching the PySpark code and the exception trace.
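For reference, a minimal sketch of the kind of configuration and write involved (this is not the attached job; the VPC endpoint id and bucket name are placeholders, and the SSE algorithm is assumed to be AES256 for illustration):

    # Minimal sketch, not the attached job: endpoint id and bucket name are placeholders.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder.appName("s3a-interface-endpoint-write")
        # Point S3A at the interface (VPC) endpoint instead of the default gateway endpoint.
        .config("spark.hadoop.fs.s3a.endpoint",
                "https://bucket.vpce-<id>.s3.us-east-1.vpce.amazonaws.com")
        # The bucket policy requires SSE; AES256 is an assumed example value.
        .config("spark.hadoop.fs.s3a.server-side-encryption-algorithm", "AES256")
        .getOrCreate()
    )

    df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])
    # With the gateway endpoint this save succeeds; with the interface endpoint it
    # fails in doesBucketExist with the 400 Bad Request shown above.
    df.write.mode("overwrite").format("parquet").save("s3a://<BUCKET NAME>/output/")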
Attachments
Issue Links
- requires HADOOP-17705: S3A to add option fs.s3a.endpoint.region to set AWS region (Resolved)