Details
- Type: New Feature
- Status: Resolved
- Priority: Minor
- Resolution: Won't Fix
- Affects Version/s: 2.0.1
- Fix Version/s: None
- Component/s: None
Description
I can't find any way to use Spark to write to S3 and set user object metadata on the result. This seems like such a simple thing that I feel I must be missing how to do it, but I have yet to find anything.
I don't know what adding this would entail. My idea would be something like:
rdd.saveAsTextFile("s3://testbucket/file").withMetadata(Map<String, String> data)
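In the absence of such an API, one possible workaround is to let Spark write the output normally and then attach the metadata afterwards with a plain S3 client. S3 allows changing an object's user metadata by copying the object onto itself with `MetadataDirective="REPLACE"` (the `CopyObject` operation). A minimal sketch using boto3, assuming the bucket name, prefix, and metadata keys are placeholders:

```python
def replace_metadata_request(bucket, key, metadata):
    """Build the copy_object arguments that rewrite an object in place
    with new user metadata (S3's standard way to change metadata)."""
    return {
        "Bucket": bucket,
        "Key": key,
        "CopySource": {"Bucket": bucket, "Key": key},
        "Metadata": metadata,
        "MetadataDirective": "REPLACE",
    }


def tag_spark_output(s3_client, bucket, prefix, metadata):
    """Apply user metadata to every part file Spark wrote under `prefix`.

    `s3_client` is expected to be a boto3 S3 client; each object is
    copied onto itself so only its metadata changes.
    """
    paginator = s3_client.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            s3_client.copy_object(
                **replace_metadata_request(bucket, obj["Key"], metadata)
            )
```

Usage would look something like `tag_spark_output(boto3.client("s3"), "testbucket", "file/", {"owner": "me"})` after `saveAsTextFile` returns. Note this doubles the PUT traffic and is not atomic with the write, which is part of why a first-class hook in the Hadoop S3 connector would be preferable.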