Details
- Type: Improvement
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- 2.4.0
- None
Description
In a few places Spark can try to allocate arrays as large as Int.MaxValue, but that is actually too big for the JVM. We should consistently use ByteArrayMethods.MAX_ROUNDED_ARRAY_LENGTH instead.
In some cases this means changing the default value of a config, in some cases it means putting bounds on a config, and in others it just means improving the error messages for things that still won't work.
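A minimal sketch of the pattern this issue asks for (not the actual patch): cap requested array sizes at ByteArrayMethods.MAX_ROUNDED_ARRAY_LENGTH rather than Int.MaxValue. The object ArraySizing and the method grownCapacity below are hypothetical names used only for illustration.
{code:scala}
import org.apache.spark.unsafe.array.ByteArrayMethods

// Hypothetical helper illustrating the pattern described above: never let a
// requested capacity reach Int.MaxValue, since the JVM cannot reliably
// allocate arrays that large and such requests fail with OutOfMemoryError.
object ArraySizing {
  private val MaxArrayLength: Int = ByteArrayMethods.MAX_ROUNDED_ARRAY_LENGTH

  /** Grow a buffer capacity, never exceeding the JVM's practical array limit. */
  def grownCapacity(current: Int, needed: Int): Int = {
    require(needed <= MaxArrayLength,
      s"Cannot allocate an array of $needed elements; the practical limit is $MaxArrayLength")
    math.min(MaxArrayLength.toLong, math.max(current * 2L, needed.toLong)).toInt
  }
}
{code}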
Issue Links
- is related to: SPARK-25704 Replication of > 2GB block fails due to bad config default (Resolved)
- links to