ARROW-5966

[Python] Capacity error when converting large UTF32 numpy array to arrow array



    Description

      Trying to create a large string array from a UTF32 numpy array fails with

      ArrowCapacityError: Encoded string length exceeds maximum size (2GB)

      instead of creating a chunked array.

       

      A reproducible example:

      import uuid
      import numpy as np
      import pyarrow as pa

      # 100 million 32-character hex strings: roughly 3.2 GB of string
      # data, well over the 2 GB capacity of a single string array.
      li = [uuid.uuid4().hex for _ in range(100_000_000)]
      arr = np.array(li)
      parr = pa.array(arr)  # raises ArrowCapacityError
      

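      Some arithmetic behind the failure: 100,000,000 hex strings of 32 characters each is roughly 3.2 GB of string data, while Arrow's string type uses 32-bit offsets and therefore caps a single array at 2 GB. As a workaround, the chunked array can be built by hand; a minimal sketch (the chunk size of 10,000,000 is arbitrary, anything that keeps each chunk under 2 GB should do):

      import uuid
      import numpy as np
      import pyarrow as pa

      arr = np.array([uuid.uuid4().hex for _ in range(100_000_000)])

      # Convert slice by slice so each chunk's string data stays well under
      # the 2 GB limit (~320 MB per chunk here), then combine the pieces.
      chunk_size = 10_000_000
      chunks = [pa.array(arr[i:i + chunk_size])
                for i in range(0, len(arr), chunk_size)]
      parr = pa.chunked_array(chunks)
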
      Is this a regression, or was https://github.com/apache/arrow/issues/1855 never properly fixed?
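
      Another possible route, assuming a pyarrow release that ships the large_string type (64-bit offsets), is to request that type explicitly instead of relying on chunking; a sketch, reusing arr from the example above:

      import pyarrow as pa

      # large_string stores offsets as 64-bit integers, so a single array
      # can hold more than 2 GB of string data. Assumes pa.large_string()
      # exists in the installed pyarrow version.
      parr = pa.array(arr, type=pa.large_string())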

       

       


            People

              Assignee: Wes McKinney
              Reporter: Igor Yastrebov


                Time Tracking

                  Estimated: Not Specified
                  Remaining: 0h
                  Logged: 0.5h