Details
- Type: Question
- Status: Closed
- Priority: Minor
- Resolution: Invalid
- Affects Version: 2.4.3
Description
While reviewing StaticMemoryManager.scala, a question came to me.
  private def getMaxExecutionMemory(conf: SparkConf): Long = {
    val systemMaxMemory = conf.getLong("spark.testing.memory", Runtime.getRuntime.maxMemory)
    if (systemMaxMemory < MIN_MEMORY_BYTES) {
      throw new IllegalArgumentException(s"System memory $systemMaxMemory must " +
        s"be at least $MIN_MEMORY_BYTES. Please increase heap size using the --driver-memory " +
        s"option or spark.driver.memory in Spark configuration.")
    }
    if (conf.contains("spark.executor.memory")) {
      val executorMemory = conf.getSizeAsBytes("spark.executor.memory")
      if (executorMemory < MIN_MEMORY_BYTES) {
        throw new IllegalArgumentException(s"Executor memory $executorMemory must be at least " +
          s"$MIN_MEMORY_BYTES. Please increase executor memory using the " +
          s"--executor-memory option or spark.executor.memory in Spark configuration.")
      }
    }
    val memoryFraction = conf.getDouble("spark.shuffle.memoryFraction", 0.2)
    val safetyFraction = conf.getDouble("spark.shuffle.safetyFraction", 0.2 max 0.8)
    (systemMaxMemory * memoryFraction * safetyFraction).toLong
  }
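To make the arithmetic at the end concrete, here is a minimal standalone sketch (object and parameter names are mine, not Spark's) of the same computation: the execution region is systemMaxMemory * memoryFraction * safetyFraction, so with the defaults only 0.2 * 0.8 = 16% of the heap is handed to execution.

```scala
// Hypothetical sketch of the final calculation in getMaxExecutionMemory,
// using the default values of spark.shuffle.memoryFraction (0.2) and
// spark.shuffle.safetyFraction (0.8).
object ExecutionMemorySketch {
  def maxExecutionMemory(systemMaxMemory: Long,
                         memoryFraction: Double = 0.2,
                         safetyFraction: Double = 0.8): Long =
    (systemMaxMemory * memoryFraction * safetyFraction).toLong

  def main(args: Array[String]): Unit = {
    // With a 4 GiB heap, execution gets roughly 16% of it.
    val heap = 4L * 1024 * 1024 * 1024
    println(maxExecutionMemory(heap)) // about 655 MiB
  }
}
```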
When an executor calls getMaxExecutionMemory, it first sets systemMaxMemory using Runtime.getRuntime.maxMemory, then compares systemMaxMemory against MIN_MEMORY_BYTES.
If systemMaxMemory is below that threshold, the program throws an exception telling the user to increase the heap size using --driver-memory.
I wonder if this is wrong, because the heap size of an executor is set by --executor-memory?
Although there is a second exception below about adjusting the executor's memory, I think the message in the first exception may not be appropriate.
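The premise behind the question can be checked directly: Runtime.getRuntime.maxMemory reports the maximum heap of whichever JVM executes it, so when this code runs inside an executor, systemMaxMemory reflects the executor's heap (the one configured via --executor-memory), not the driver's. A minimal probe (object name is mine):

```scala
// Runtime.getRuntime.maxMemory returns the max heap of the current JVM.
// Run this with different -Xmx values to see the reported number change,
// mirroring what an executor would see for its own heap.
object HeapProbe {
  def main(args: Array[String]): Unit = {
    val maxHeap = Runtime.getRuntime.maxMemory
    println(s"This JVM's max heap: $maxHeap bytes")
  }
}
```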
Thanks for answering my question!