Details

Type: Bug
Status: Resolved
Priority: Critical
Resolution: Fixed
Fix Version/s: 1.6.0
Labels: None
Description
By default, Spark driver and executor heaps are 1GB. Under the recent unified memory management mode, only about 250MB of that is set aside for non-storage, non-execution purposes (spark.memory.fraction defaults to 75%). However, the driver needs at least ~300MB, especially in local mode, so some local jobs have started to OOM.
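A minimal sketch of the arithmetic above, assuming the 1GB default heap and the 75% default for spark.memory.fraction; the object and value names are illustrative, not Spark's actual code:

{code:scala}
// Illustrative arithmetic only; names are not from Spark's code base.
object DefaultMemoryMath {
  val heapBytes      = 1024L * 1024 * 1024  // default 1GB driver/executor heap
  val memoryFraction = 0.75                 // spark.memory.fraction default

  def main(args: Array[String]): Unit = {
    val unified = (heapBytes * memoryFraction).toLong // execution + storage
    val other   = heapBytes - unified                 // everything else
    println(s"unified: ${unified >> 20} MB") // 768 MB
    println(s"other:   ${other >> 20} MB")   // 256 MB, below the ~300 MB the driver needs
  }
}
{code}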
Two mutually exclusive proposals (compared in the sketch below):
(1) First set aside 300 MB, then take 75% of what remains.
(2) Use min(75% of JVM heap size, JVM heap size - 300 MB).
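To make the difference concrete, here is an illustrative comparison of the two formulas across a few heap sizes (ProposalComparison, proposal1, and proposal2 are hypothetical names, not a proposed implementation):

{code:scala}
// Hypothetical sketch comparing the two proposals; not Spark's implementation.
object ProposalComparison {
  val ReservedBytes = 300L * 1024 * 1024 // ~300 MB reserved for the driver
  val Fraction      = 0.75               // spark.memory.fraction

  // (1) Set aside 300 MB first, then take 75% of the remainder.
  def proposal1(heap: Long): Long = ((heap - ReservedBytes) * Fraction).toLong

  // (2) Take 75% of the heap, but never leave less than 300 MB for other uses.
  def proposal2(heap: Long): Long =
    math.min((heap * Fraction).toLong, heap - ReservedBytes)

  def main(args: Array[String]): Unit = {
    for (mb <- Seq(512L, 1024L, 4096L)) {
      val heap = mb * 1024 * 1024
      println(f"$mb%5d MB heap -> (1) ${proposal1(heap) >> 20}%4d MB unified, " +
              f"(2) ${proposal2(heap) >> 20}%4d MB unified")
    }
  }
}
{code}

For the default 1 GB heap, (1) yields 543 MB of unified memory and leaves ~481 MB for other purposes, while (2) yields 724 MB and leaves exactly the 300 MB floor, so (1) is the more conservative choice for small heaps.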