Spark / SPARK-12081

Make unified memory management work with small heaps

Details

    • Type: Bug
    • Status: Resolved
    • Priority: Critical
    • Resolution: Fixed
    • Affects Version/s: 1.6.0
    • Fix Version/s: 1.6.0
    • Component/s: Spark Core
    • Labels: None

    Description

      By default, Spark driver and executor heaps are 1 GB. With the recent unified memory management mode, only about 250 MB is set aside for non-storage, non-execution purposes (spark.memory.fraction is 75%, so 25% of the 1 GB heap remains). However, the driver, especially in local mode, needs at least ~300 MB. Some local jobs started to OOM because of this.

      Two mutually exclusive proposals:
      (1) First, cut out 300 MB, then take 75% of what remains
      (2) Use min(75% of JVM heap size, JVM heap size - 300MB)
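The arithmetic behind the two proposals can be sketched as follows. This is an illustration of the issue text only, not Spark's actual implementation; the class and method names are hypothetical, and the 300 MB reserve and 75% fraction are taken from the description above.

```java
// Hypothetical sketch of the two proposed formulas for usable
// (storage + execution) memory under unified memory management.
public class UnifiedMemorySketch {
    static final long RESERVED_MB = 300;        // minimum the driver itself needs
    static final double MEMORY_FRACTION = 0.75; // spark.memory.fraction default

    // Proposal (1): cut out 300 MB first, then take 75% of what remains.
    static long proposal1(long heapMb) {
        return (long) ((heapMb - RESERVED_MB) * MEMORY_FRACTION);
    }

    // Proposal (2): min(75% of the heap, heap - 300 MB).
    static long proposal2(long heapMb) {
        return Math.min((long) (heapMb * MEMORY_FRACTION), heapMb - RESERVED_MB);
    }

    public static void main(String[] args) {
        long heap = 1024; // default 1 GB driver/executor heap
        // Proposal (1): (1024 - 300) * 0.75 = 543 MB for storage + execution,
        // leaving 481 MB for everything else.
        System.out.println(proposal1(heap));
        // Proposal (2): min(768, 724) = 724 MB for storage + execution,
        // leaving exactly the 300 MB reserve.
        System.out.println(proposal2(heap));
    }
}
```

Note the trade-off visible in the numbers: with a 1 GB heap, proposal (1) guarantees well over 300 MB of headroom, while proposal (2) maximizes storage/execution memory but leaves only the bare 300 MB reserve.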

          People

            Assignee: andrewor14 Andrew Or
            Reporter: andrewor14 Andrew Or
