Hadoop Common · HADOOP-16838

Support for `fs.s3a.endpoint.region`

Details

    • Type: New Feature
    • Status: Resolved
    • Priority: Major
    • Resolution: Works for Me
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: None
    • Labels: None

    Description

      Currently it is not possible to connect S3-compatible services such as MinIO or Ceph (running with a custom region) to Spark through the s3a connector. For example, suppose MinIO is running on a server with:

      • IP Address: 192.168.0.100
      • Region: ap-southeast-1

      The s3a connector cannot be configured to use the region `ap-southeast-1`.

      It would be great to have a configuration property such as `fs.s3a.endpoint.region` (see the sketch below). This would be very helpful for users deploying a private cloud who intend to use S3-compatible services on premises.
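
      A minimal sketch of how this could look in Spark's configuration (e.g. spark-defaults.conf), assuming the proposed property name. The endpoint port 9000 (MinIO's default) and the path-style-access setting are assumptions for illustration, not details from this report:

        # Point s3a at the on-premises MinIO endpoint (9000 is MinIO's default port).
        spark.hadoop.fs.s3a.endpoint              http://192.168.0.100:9000
        # The property proposed by this issue (it does not exist yet); the intent is
        # to set the region used for request signing, independent of the endpoint.
        spark.hadoop.fs.s3a.endpoint.region       ap-southeast-1
        # Path-style access is typically required for S3-compatible stores like MinIO.
        spark.hadoop.fs.s3a.path.style.access     true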

            People

              Unassigned Unassigned
              nitisht Nitish
              Votes:
              0 Vote for this issue
              Watchers:
              2 Start watching this issue
