Details
- Type: Umbrella
- Status: Resolved
- Priority: Major
- Resolution: Fixed
Description
An umbrella ticket to track the various 2G limits we have in Spark, due to the use of byte arrays and ByteBuffers.
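The 2G limit comes from the JVM itself: Java array indices and `ByteBuffer` capacities are signed 32-bit ints, so a single `byte[]` or `ByteBuffer` can hold at most `Integer.MAX_VALUE` bytes (just under 2 GiB). A minimal sketch of the arithmetic (the class and method names here are hypothetical, not part of Spark):

```java
public class TwoGigLimit {
    // Largest number of bytes a single byte[] or ByteBuffer can address:
    // capacities are signed 32-bit ints, so the ceiling is 2^31 - 1.
    static long maxBufferBytes() {
        return Integer.MAX_VALUE; // 2_147_483_647
    }

    public static void main(String[] args) {
        long maxBytes = maxBufferBytes();
        System.out.println("max bytes per buffer = " + maxBytes);
        // (maxBytes + 1) is exactly 2^31, i.e. 2 GiB.
        System.out.println("which is just under " + (maxBytes + 1) / (1L << 30) + " GiB");
    }
}
```

Any Spark code path that stages a block, broadcast, or shuffle payload in one such buffer therefore fails once the payload crosses this boundary, which is what the linked tickets report (e.g. `Size exceeds Integer.MAX_VALUE`).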
Attachments
Issue Links
- is duplicated by
  - SPARK-22622 OutOfMemory thrown by Closure Serializer without proper failure propagation (Resolved)
  - SPARK-2755 TorrentBroadcast cannot broadcast very large objects (Resolved)
  - SPARK-1391 BlockManager cannot transfer blocks larger than 2G in size (Closed)
- is related to
  - SPARK-22352 task failures with java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE error (Resolved)
  - SPARK-22062 BlockManager does not account for memory consumed by remote fetches (Resolved)
- links to