Details
Type: New Feature
Status: Resolved
Priority: Major
Resolution: Fixed
Description
See discussion on Dask org:
https://github.com/dask/distributed/pull/931
It would be valuable for downstream users to be able to compute the serialized payload as a sequence of memoryview-compatible objects without having to allocate new memory on write. This means that the component tensor messages must keep their metadata and bodies in separate buffers. It will also require a bit of work internally to reassemble the object from a collection of pyarrow.Buffer objects.
See also ARROW-1509.
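As a rough sketch (not taken from this issue) of how a downstream user could consume such a decomposed payload, assuming the to_components() / deserialize_components() names from pyarrow's since-deprecated serialization API; the 'data' key and exact return shape are assumptions from that era and may differ between versions:

```python
import numpy as np
import pyarrow as pa

# Serialize an object containing a tensor. Note: pa.serialize() was later
# deprecated in favor of pickle protocol 5; this sketch assumes the older API.
payload = {"weights": np.arange(10.0), "name": "example"}
serialized = pa.serialize(payload)

# Decompose into metadata plus a list of pyarrow.Buffer objects, so tensor
# metadata and bodies stay in separate buffers (assumed 'data' key).
components = serialized.to_components()
buffers = components["data"]

# Each buffer supports the buffer protocol, so a transport layer such as Dask
# can hand them to the socket as memoryviews without allocating new memory.
views = [memoryview(buf) for buf in buffers]

# On the receiving side, reassemble the original object from the collected
# component buffers.
roundtripped = pa.deserialize_components(components)
assert roundtripped["name"] == "example"
```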
Issue Links
- is related to: ARROW-1784 [Python] Read and write pandas.DataFrame in pyarrow.serialize by decomposing the BlockManager rather than coercing to Arrow format (Resolved)
- relates to: ARROW-1509 [Python] Write serialized object as a stream of encapsulated IPC messages (Closed)