Details
- Type: New Feature
- Status: Open
- Priority: Major
- Resolution: Unresolved
- Affects Version: 1.8.1
- Fix Version: None
Description
Dear Developers,
It was not easy, but using the S3 API it was possible to upload a large blob from a stream, without knowing its size in advance (and without storing all the data locally). I found solutions using jclouds' aws-s3 provider-specific API (some async interface), but I really miss this feature in jclouds' general API.
My dream is to have a method like blob.getOutputStream(), into which I can write as much data as I want, and which pushes the data to the storage simultaneously until I close the stream.
(When I used S3, I created a wrapper class extending OutputStream that initiates a multipart upload, buffers the data written to the output stream, uploads a part whenever the buffer is full, and finalizes the multipart upload when the stream is closed.)
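The wrapper described above could be sketched roughly as follows. This is a minimal, provider-agnostic sketch, not actual jclouds code: the PartUploader interface and its method names are hypothetical placeholders for whatever multipart calls the underlying provider offers.

```java
import java.io.IOException;
import java.io.OutputStream;
import java.util.ArrayList;
import java.util.List;

// Hypothetical abstraction over a provider's multipart-upload calls.
interface PartUploader {
    void uploadPart(int partNumber, byte[] data, int length); // push one part
    void complete(int partCount);                             // finalize the upload
}

// OutputStream that buffers writes and uploads a part when the buffer fills.
class MultipartOutputStream extends OutputStream {
    private final PartUploader uploader;
    private final byte[] buffer;
    private int filled = 0;
    private int partNumber = 0;
    private boolean closed = false;

    MultipartOutputStream(PartUploader uploader, int partSize) {
        this.uploader = uploader;
        this.buffer = new byte[partSize];
    }

    @Override
    public void write(int b) throws IOException {
        buffer[filled++] = (byte) b;
        if (filled == buffer.length) flushPart();
    }

    private void flushPart() {
        uploader.uploadPart(++partNumber, buffer, filled);
        filled = 0;
    }

    @Override
    public void close() throws IOException {
        if (closed) return;
        closed = true;
        if (filled > 0) flushPart();   // last, possibly short, part
        uploader.complete(partNumber); // finalize the multipart upload
    }
}

public class Demo {
    public static void main(String[] args) throws IOException {
        final List<Integer> partSizes = new ArrayList<>();
        PartUploader fake = new PartUploader() {
            public void uploadPart(int n, byte[] d, int len) { partSizes.add(len); }
            public void complete(int count) { System.out.println("parts=" + count); }
        };
        // 11 bytes with a 4-byte part size yields parts of 4, 4, and 3 bytes.
        try (OutputStream out = new MultipartOutputStream(fake, 4)) {
            out.write("hello world".getBytes());
        }
        System.out.println(partSizes);
    }
}
```

In a real implementation each part would typically need to meet the provider's minimum part size (5 MB on S3, except for the final part), and the stream should abort the multipart upload if an exception occurs before close.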
I don't know whether this is possible for all providers, but I really miss it...
Thank you,
Akos Hajnal
Attachments
Issue Links
- is related to
  - JCLOUDS-627 Add support for aborting putBlob operations (Open)
- relates to
  - JCLOUDS-639 Provide methods to get the progress of a running upload to a blobstore (Open)