[SPARK-25904] Avoid allocating arrays too large for JVMs

    Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.4.0
    • Fix Version/s: 2.4.1, 3.0.0
    • Component/s: Spark Core
    • Labels: None

    Description

    In a few places Spark can try to allocate arrays as large as Int.MaxValue, but that is actually larger than the JVM allows. We should consistently use ByteArrayMethods.MAX_ROUNDED_ARRAY_LENGTH instead.
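
    A minimal sketch of the intended pattern (the helper name allocateArray is hypothetical; ByteArrayMethods.MAX_ROUNDED_ARRAY_LENGTH lives in Spark's unsafe module and sits slightly below Int.MaxValue so the request stays within what JVMs actually permit):

        import org.apache.spark.unsafe.array.ByteArrayMethods

        // Hypothetical helper: check the requested size against the largest
        // array length JVMs reliably support, instead of Int.MaxValue.
        def allocateArray(requestedSize: Long): Array[Byte] = {
          if (requestedSize > ByteArrayMethods.MAX_ROUNDED_ARRAY_LENGTH) {
            throw new UnsupportedOperationException(
              s"Cannot allocate an array of $requestedSize bytes: it exceeds " +
                "the maximum JVM array length " +
                s"(${ByteArrayMethods.MAX_ROUNDED_ARRAY_LENGTH})")
          }
          new Array[Byte](requestedSize.toInt)
        }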

    In some cases this means changing a config's default, in others adding bounds on a config's value, and in others just improving the error messages for operations that still won't work. An example of the config case is sketched below.
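
    For the config cases, a hedged sketch using Spark's internal ConfigBuilder (the key spark.example.buffer.size and the surrounding object are made up for illustration; checkValue is the validation hook the internal config module provides):

        import org.apache.spark.internal.config.ConfigBuilder
        import org.apache.spark.unsafe.array.ByteArrayMethods

        object ExampleConf {
          // Hypothetical config: reject over-large values when the config is
          // set, rather than failing later with an obscure OutOfMemoryError.
          val EXAMPLE_BUFFER_SIZE = ConfigBuilder("spark.example.buffer.size")
            .doc("Size of an in-memory buffer, in bytes.")
            .intConf
            .checkValue(_ <= ByteArrayMethods.MAX_ROUNDED_ARRAY_LENGTH,
              "The buffer size must not exceed the maximum JVM array length " +
                s"(${ByteArrayMethods.MAX_ROUNDED_ARRAY_LENGTH}).")
            .createWithDefault(32 * 1024 * 1024)
        }

    Validating at configuration time gives the user an actionable message up front, instead of an allocation failure deep inside a running job.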


    People

    • Assignee: Imran Rashid (irashid)
    • Reporter: Imran Rashid (irashid)
    • Votes: 0
    • Watchers: 2
