Description
Trying to use the hadoop fs s3a library in AWS Lambda with temporary credentials, but it is not possible because of the way the AWSCredentialsProviderChain is defined in https://github.com/apache/hadoop/blob/29ae25801380b94442253c4202dee782dc4713f5/hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/S3AFileSystem.java
Specifically, the following code is used to initialise the credentials chain:

AWSCredentialsProviderChain credentials = new AWSCredentialsProviderChain(
    new BasicAWSCredentialsProvider(accessKey, secretKey),
    new InstanceProfileCredentialsProvider(),
    new AnonymousAWSCredentialsProvider());
The above works fine when the EC2 metadata endpoint is available (i.e. running on an EC2 instance), but it does not work when credentials are supplied through environment variables, as happens in AWS Lambda. Amazon suggests using the EnvironmentVariableCredentialsProvider in AWS Lambda.
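To make the fallback behaviour concrete, here is a minimal, stdlib-only sketch of the chain-of-responsibility semantics that a credentials provider chain implements: try each source in order and use the first one that yields a value. The class and method names below are illustrative, not the actual AWS SDK types; in a Lambda environment the first source (environment variables) would win.

```java
import java.util.Arrays;
import java.util.List;
import java.util.Optional;

// Illustrative sketch of provider-chain resolution (NOT the AWS SDK classes):
// each source is tried in order; the first one that produces a credential wins.
public class CredentialsChainSketch {

    interface CredentialsSource {
        Optional<String> accessKeyId();
    }

    // Source 1: environment variables -- what AWS Lambda populates.
    static CredentialsSource fromEnvironment() {
        return () -> Optional.ofNullable(System.getenv("AWS_ACCESS_KEY_ID"));
    }

    // Source 2: Java system properties.
    static CredentialsSource fromSystemProperties() {
        return () -> Optional.ofNullable(System.getProperty("aws.accessKeyId"));
    }

    // Resolve in declaration order; first non-empty result wins.
    static String resolve(List<CredentialsSource> chain) {
        return chain.stream()
                .map(CredentialsSource::accessKeyId)
                .filter(Optional::isPresent)
                .map(Optional::get)
                .findFirst()
                .orElseThrow(() -> new IllegalStateException("No credentials found"));
    }

    public static void main(String[] args) {
        // Simulate credentials via a system property so the sketch is
        // runnable anywhere, without real AWS credentials.
        System.setProperty("aws.accessKeyId", "EXAMPLEKEY");
        String key = resolve(Arrays.asList(fromEnvironment(), fromSystemProperties()));
        System.out.println(key);
    }
}
```

The key point is that adding a more general provider (such as one covering environment variables) earlier in the list changes nothing for EC2 deployments, since the later sources are only consulted when the earlier ones fail.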
To summarise, and to suggest an alternative: I think the DefaultAWSCredentialsProviderChain could be used instead of the InstanceProfileCredentialsProvider, which would cover the following cases:
- Environment Variables - AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY (RECOMMENDED since they are recognized by all the AWS SDKs and CLI except for .NET), or AWS_ACCESS_KEY and AWS_SECRET_KEY (only recognized by Java SDK)
- Java System Properties - aws.accessKeyId and aws.secretKey
- Credential profiles file at the default location (~/.aws/credentials) shared by all AWS SDKs and the AWS CLI
- Instance profile credentials delivered through the Amazon EC2 metadata service
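For context, the static accessKey/secretKey pair consumed by the BasicAWSCredentialsProvider at the head of the chain is today supplied through Hadoop configuration, for example in core-site.xml (property names as defined by hadoop-aws; the values below are placeholders):

```xml
<!-- core-site.xml: static S3A credentials consumed by the first
     provider (BasicAWSCredentialsProvider) in the chain -->
<property>
  <name>fs.s3a.access.key</name>
  <value>YOUR_ACCESS_KEY</value>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>YOUR_SECRET_KEY</value>
</property>
```

The proposed change would not affect users who configure credentials this way, since the static provider would still be consulted first.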
If you think that the above change would be useful I could investigate more about what the required changes would be and submit a patch.
Issue Links
- is related to HADOOP-12807 S3AFileSystem should read AWS credentials from environment variables (Resolved)