I need to access multiple Hive tables in my Spark application, where each Hive table is:
1- an external table with its data sitting on S3
2- owned by a different AWS user, so I need to provide different AWS credentials for each table.
I am familiar with setting the AWS credentials on the Hadoop configuration object, but that does not really help me here, because I can only set one global pair of credentials (fs.s3a.access.key, fs.s3a.secret.key).
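For reference, this is roughly what I do today. It is a minimal Scala sketch assuming a Spark session with Hive support; the table name and credential values are placeholders:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("multi-bucket-hive")
  .enableHiveSupport()
  .getOrCreate()

// Single, global S3A credentials shared by every S3 access in the application.
val hadoopConf = spark.sparkContext.hadoopConfiguration
hadoopConf.set("fs.s3a.access.key", "<ACCESS_KEY_FOR_USER_A>")
hadoopConf.set("fs.s3a.secret.key", "<SECRET_KEY_FOR_USER_A>")

// This works for tables backed by user A's bucket, but tables owned by
// other AWS users fail with access-denied errors, since the same pair of
// credentials is used for all of them.
spark.sql("SELECT * FROM table_owned_by_user_a LIMIT 10").show()
```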
From my research, there does not seem to be an easy or elegant way to do this in Spark.
Why is that ?
How do I address this use case?