Libcloud / LIBCLOUD-988

Not able to upload object larger than 4MB (Blob/chunk size)


    Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: Storage
    • Labels: None
    • Environment:

      Tried in the following environments:

      • docker python:2.7
      • repl.it 2.7.0

    Description

      Hi there,

      I have started using the Azure Blob Storage provider and found that it just doesn't work for files larger than 4MB, which is the blob/chunk size limit in the code.

      I have done some digging/debugging in the code base and found out that uploading files in chunks is not used at all. It was originally added with the first version of the Azure Blob provider in the 0.12.* releases, but during some refactoring it was removed; for example, the `_upload_in_chunks` method is no longer called anywhere. In practice, the `_upload_object` method checks whether the file size is bigger than the blob/chunk size, but the chunked upload never actually happens.
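      For reference, the splitting behaviour that `_upload_in_chunks` would presumably perform can be sketched like this (a minimal illustration only, not Libcloud's actual implementation; `AZURE_CHUNK_SIZE` and `iter_chunks` are assumed names):

```python
import io

# Assumed 4 MB limit, matching the blob/chunk size mentioned above.
AZURE_CHUNK_SIZE = 4 * 1024 * 1024


def iter_chunks(stream, chunk_size=AZURE_CHUNK_SIZE):
    """Yield successive chunks of at most chunk_size bytes from a stream."""
    while True:
        data = stream.read(chunk_size)
        if not data:
            break
        yield data


# A 10 MB payload should be split into three blocks: 4 MB, 4 MB, 2 MB.
payload = io.BytesIO(b"x" * (10 * 1024 * 1024))
sizes = [len(chunk) for chunk in iter_chunks(payload)]
print(sizes)
```

      Each chunk would then be sent as a separate block (Put Block) and committed at the end with a block list, instead of pushing the whole file as a single block blob.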

      Try the following code to see that it's not working:

      import urllib
      from libcloud.storage.types import Provider
      from libcloud.storage.providers import get_driver
      
      print "start data downloaded"
      
      file_path = "data.csv"
      urllib.urlretrieve("https://data.seattle.gov/api/views/mags-97de/rows.csv?accessType=DOWNLOAD", "data.csv")
      
      print "data downloaded"
      
      key = "<KEY>"
      secret = "<SECRET>"
      bucket_name = "<BUCKET_NAME>"
      kls = get_driver(Provider.AZURE_BLOBS)
      provider = kls(key=key, secret=secret)
      
      
      print "start pushing data to azure"
      container = provider.get_container(bucket_name)
      provider.upload_object(
          file_path,
          container=container,
          object_name=file_path,
      )
      print "stop pushing data to azure"

      I am just downloading a random CSV file (24MB in this case) and trying to upload it. I have found a workaround to send everything as one chunk/blob, but Azure restricts each chunk/blob to a maximum of 100MB, so in the end I am not able to push files larger than 100MB.

      The example code fails with the following exception:

      Traceback (most recent call last):
        File "python", line 24, in <module>
      LibcloudError: <LibcloudError in <libcloud.storage.drivers.azure_blobs.AzureBlobsStorageDriver object at 0x7fbf61fbbdd0> 'Unexpected status code, status_code=403'>

      But if you print out the body of the HTTP response, it says the following (sensitive information truncated):

      <?xml version="1.0" encoding="UTF-8"?>
      <Error>
        <Code>AuthenticationFailed</Code>
        <Message>Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.\nRequestId:<truncated>\nTime:2018-03-16T13:03:44.0282425Z</Message>
        <AuthenticationErrorDetail>The MAC signature found in the HTTP request \'<truncated>\' is not the same as any computed signature. Server used following string to sign: \'PUT\n\n\n\n\ntext/csv\n\n\n\n\n\n\nx-ms-blob-type:BlockBlob\nx-ms-date:Fri, 16 Mar 2018 13:03:51 GMT\nx-ms-version:2012-02-12\n<truncated>\'.</AuthenticationErrorDetail>
      </Error>
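      For context, a Shared Key signature like the one the server complains about is computed by HMAC-SHA256-signing the canonicalized string-to-sign with the base64-decoded account key. A minimal sketch (dummy key and a string-to-sign shaped like the one in the error above; not the driver's actual code):

```python
import base64
import hashlib
import hmac


def sign_request(account_key_b64, string_to_sign):
    """Return the base64 SharedKey signature for a canonical string-to-sign."""
    key = base64.b64decode(account_key_b64)
    digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return base64.b64encode(digest).decode("ascii")


# Dummy values for illustration only.
dummy_key = base64.b64encode(b"not-a-real-key").decode("ascii")
string_to_sign = (
    "PUT\n\n\n\n\ntext/csv\n\n\n\n\n\n\n"
    "x-ms-blob-type:BlockBlob\n"
    "x-ms-date:Fri, 16 Mar 2018 13:03:51 GMT\n"
    "x-ms-version:2012-02-12\n"
    "/account/container/data.csv"
)
print(sign_request(dummy_key, string_to_sign))
```

      If any header or resource component included in this string differs from what the server reconstructs on its side, the two signatures will not match and the request fails with 403 AuthenticationFailed, as shown in the response above.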

      Let me know if I am doing anything wrong or if you need more information.

      Thanks,

      Vojta

            People

            • Assignee: Unassigned
            • Reporter: Vojta Bartos (vojta)
            • Votes: 0
            • Watchers: 1
