Spark / SPARK-25904

Avoid allocating arrays too large for JVMs


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.4.0
    • Fix Version/s: 2.4.1, 3.0.0
    • Component/s: Spark Core
    • Labels: None

    Description

      In a few places Spark can try to allocate arrays as large as Int.MaxValue, but that is actually too big for the JVM. We should consistently use ByteArrayMethods.MAX_ROUNDED_ARRAY_LENGTH instead.

      In some cases this means changing a config's default value, in some cases it means bounding a config's allowed range, and in others it just means improving the error message for an allocation that still won't work; the sketch below illustrates the latter two patterns.
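
      For context, most JVMs cannot allocate an array of length Int.MaxValue; the practical maximum is slightly smaller and varies by JVM, which is why Spark conservatively defines ByteArrayMethods.MAX_ROUNDED_ARRAY_LENGTH as Int.MaxValue - 15. The following is a minimal Scala sketch of the config-bounding and error-message patterns, not Spark's actual code; the config name spark.example.buffer.size and both helper methods are hypothetical.

      import org.apache.spark.unsafe.array.ByteArrayMethods

      object ArrayLimitSketch {
        // Defined in Spark as Int.MaxValue - 15, because JVMs reject
        // allocations at Int.MaxValue itself ("Requested array size
        // exceeds VM limit").
        private val MaxArrayLen: Int = ByteArrayMethods.MAX_ROUNDED_ARRAY_LENGTH

        // Bounding a config value (hypothetical config name): reject values
        // above the JVM-safe maximum at configuration time.
        def checkBufferSizeConf(sizeInBytes: Long): Int = {
          require(sizeInBytes <= MaxArrayLen,
            s"spark.example.buffer.size ($sizeInBytes) must be at most $MaxArrayLen")
          sizeInBytes.toInt
        }

        // Improving the error message for an allocation that still cannot
        // succeed: fail with a clear explanation instead of an opaque
        // OutOfMemoryError from the JVM.
        def allocate(numBytes: Long): Array[Byte] = {
          if (numBytes > MaxArrayLen) {
            throw new IllegalArgumentException(
              s"Cannot allocate an array of $numBytes bytes: exceeds the " +
              s"maximum JVM array length of $MaxArrayLen")
          }
          new Array[Byte](numBytes.toInt)
        }
      }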

    People

      Assignee: Imran Rashid (irashid)
      Reporter: Imran Rashid (irashid)
      Votes: 0
      Watchers: 2
