Details
Type: Task
Status: Closed
Priority: Minor
Resolution: Done
Description
This memory allocation is obsolete because of task https://issues.apache.org/jira/browse/DATALAB-1985.
Previously the allocation was (see the sketch below):
1. the Spark memory value was 75% of total memory (if the instance shape has up to 8 GB);
2. the spark.executor.memory value was total memory minus 3.5 GB (if the instance shape has more than 8 GB).
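A minimal sketch of that previous rule, assuming memory sizes are given in GB and the result is the executor memory in MB; the function name, the handling of the 8 GB threshold, and the rounding are illustrative assumptions, not taken from the DataLab codebase:
{code:python}
def previous_spark_executor_memory_mb(total_memory_gb: float) -> int:
    """Previous allocation rule (pre-DATALAB-1985), as described above.

    - Up to 8 GB total: Spark memory = 75% of total.
    - Over 8 GB total:  Spark memory = total minus 3.5 GB.
    Threshold handling and rounding are assumptions for illustration.
    """
    if total_memory_gb <= 8:
        memory_gb = total_memory_gb * 0.75
    else:
        memory_gb = total_memory_gb - 3.5
    return int(memory_gb * 1024)  # convert GB to MB
{code}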
Currently it is:

Case 1: Rstudio [4 GB]
By theory: 3.4 * 0.75 * 1024 = 3072 MB
But in practice:

Case 2: Rstudio [122 GB]
By theory: (122 - 3.5) * 1024 = 121344 MB
But in practice:
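For reference, applying the sketch above to the two shapes reproduces the theoretical figures when the full shape size is used as the base (note that 3.4 * 0.75 * 1024 is actually about 2611 MB; 3072 MB corresponds to 4 * 0.75 * 1024, so which base should be used is part of the question):
{code:python}
print(previous_spark_executor_memory_mb(4))    # 4 * 0.75 * 1024    = 3072 MB
print(previous_spark_executor_memory_mb(122))  # (122 - 3.5) * 1024 = 121344 MB
{code}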
Please investigate how memory should be allocated.