Details
- Type: Bug
- Status: Open
- Priority: Not a Priority
- Resolution: Unresolved
- Affects Version/s: 1.9.0
- Fix Version/s: None
Description
To provide credentials to S3, users may configure a credentials provider. For providers from Amazon (whose classes are relocated during shading), we allow users to configure the original class name and relocate it manually in the S3 filesystem factories.
However, none of the Amazon-provided credential providers can be used with the Presto filesystem, since it additionally requires them to have a constructor accepting a Hadoop configuration (https://prestodb.github.io/docs/current/connector/hive.html#amazon-s3-configuration).
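The constructor requirement can be illustrated with a minimal, self-contained sketch. The Configuration and provider classes below are stand-ins, not the real Hadoop/AWS types: the point is only that a Presto-style factory locates the provider's constructor reflectively, so a provider with just a no-arg constructor (as Amazon's providers have) fails the lookup.

```java
import java.lang.reflect.Constructor;

public class ConstructorLookupDemo {
    // Stand-in for org.apache.hadoop.conf.Configuration (not the real class).
    public static class Configuration {}

    // Shaped like the providers hadoop-aws ships: it has a
    // constructor accepting a (stand-in) Hadoop Configuration.
    public static class HadoopStyleProvider {
        public HadoopStyleProvider(Configuration conf) {}
    }

    // Shaped like Amazon's providers: only a no-arg constructor.
    public static class AmazonStyleProvider {
        public AmazonStyleProvider() {}
    }

    // Mimics the reflective constructor lookup a Presto-style factory performs.
    public static boolean hasConfigConstructor(Class<?> clazz) {
        try {
            Constructor<?> c = clazz.getDeclaredConstructor(Configuration.class);
            return c != null;
        } catch (NoSuchMethodException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("hadoop-style provider usable: "
                + hasConfigConstructor(HadoopStyleProvider.class));
        System.out.println("amazon-style provider usable: "
                + hasConfigConstructor(AmazonStyleProvider.class));
    }
}
```

The hadoop-style provider passes the lookup; the amazon-style provider does not, which is the first failure mode described above.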
hadoop-aws does include a number of credential providers that have this constructor, but these use configuration keys that aren't mirrored from the Flink config (they expect fs.s3a as a key prefix). On top of that, users would have to configure the relocated class name, since the S3 factory only manually relocates Amazon classes.
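To make the key mismatch concrete, here is an illustrative sketch (exact key names depend on the Flink version and filesystem; the s3.* names follow the Flink documentation, and the fs.s3a.* names are the ones hadoop-aws providers read):

```yaml
# flink-conf.yaml: what a user would write; Flink mirrors these
# into the bundled filesystem's own configuration.
s3.access-key: EXAMPLE_ACCESS_KEY
s3.secret-key: EXAMPLE_SECRET_KEY

# What a hadoop-aws credential provider actually looks up
# (illustrative; not produced by the mirroring described above):
#   fs.s3a.access.key
#   fs.s3a.secret.key
#   fs.s3a.session.token
```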
Finally, a custom implementation of the credentials provider can effectively be ruled out, since it too would have to be implemented against the relocated Amazon/Hadoop classes, which we can't reasonably expect users to do.
In summary: Amazon providers don't work because they lack the constructor Presto requires, Hadoop providers don't work because we don't mirror the required configuration keys, and custom providers are unreasonable because they would have to be implemented against relocated classes.
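The relocation problem from the summary can be sketched as follows. The shaded prefix used here is hypothetical, chosen only to show that the class name a user would naturally configure no longer exists on the classpath after shading, while the relocated name is not something users can be expected to know:

```java
public class RelocationDemo {
    public static void main(String[] args) {
        // The class name a user would naturally configure...
        String original = "com.amazonaws.auth.EnvironmentVariableCredentialsProvider";
        // ...versus what it becomes after shading (hypothetical prefix,
        // for illustration only).
        String relocated = "org.apache.flink.fs.shaded." + original;

        for (String name : new String[] {original, relocated}) {
            try {
                Class.forName(name);
                System.out.println(name + ": loadable");
            } catch (ClassNotFoundException e) {
                System.out.println(name + ": not on classpath");
            }
        }
    }
}
```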
Issue Links
- is duplicated by: FLINK-15215 Not able to provide a custom AWS credentials provider with flink-s3-fs-hadoop (Closed)
- is related to: FLINK-13044 Shading of AWS SDK in flink-s3-fs-hadoop results in ClassNotFoundExceptions (Closed)
- relates to: FLINK-22828 Allow using a custom AWS credentials provider for AWS Connectors (Resolved)