Apache Arrow
ARROW-5966

[Python] Capacity error when converting large UTF32 numpy array to arrow array


    Details

    • Type: Bug
    • Status: In Progress
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 0.13.0, 0.14.0
    • Fix Version/s: 0.15.0
    • Component/s: Python
    • Labels:
      None

      Description

      Converting a large numpy string array to an Arrow array fails with

      ArrowCapacityError: Encoded string length exceeds maximum size (2GB)

      instead of falling back to a chunked array.

       

      A reproducible example:

      import uuid
      import numpy as np
      import pyarrow as pa

      # 100 million 32-character hex strings: roughly 3.2 GB of character data
      li = [uuid.uuid4().hex for _ in range(100_000_000)]
      arr = np.array(li)
      parr = pa.array(arr)  # raises ArrowCapacityError


      Is this a regression, or was https://github.com/apache/arrow/issues/1855 never properly fixed?

       

       

        People

        • Assignee: wesmckinn (Wes McKinney)
        • Reporter: Igor Yastrebov
        • Votes: 0
        • Watchers: 3
