Details
- Type: Improvement
- Status: Resolved
- Priority: Major
- Resolution: Not A Problem
- Affects Version/s: 3.0.1
- Fix Version/s: None
- Component/s: None
Description
Add a config spark.executor.coresOverhead to request extra cores per executor. This config would be helpful in the following case:
Suppose that on the physical machines or VMs, the memory/CPU ratio is 3 GB per core, but our Spark job needs 6 GB per task. If we request resources at that ratio, we must over-request cores, and those cores are wasted.
If we could instead request extra cores that do not count toward the cores used for task scheduling, the extra cores would not be wasted: the executor's resource request would match the node's memory/CPU ratio without increasing task parallelism.
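As a sketch of the intent, assuming the proposed spark.executor.coresOverhead config (note that this issue was resolved as Not A Problem, so the config does not exist in Spark), an executor request might look like:

```
# Cluster nodes offer 3 GB of memory per core; each task needs 6 GB.
# Run 2 concurrent tasks per executor:
spark.executor.cores=2            # cores used for task scheduling
spark.executor.memory=12g         # 6 GB x 2 tasks
# 12 GB at 3 GB/core corresponds to 4 cores on the node. Request the
# remaining 2 cores as overhead so the executor's footprint matches
# the node's memory/CPU ratio without doubling task parallelism:
spark.executor.coresOverhead=2    # proposed config, never merged
```

Without the overhead setting, the same job would either set spark.executor.cores=4 (running 4 tasks that each get only 3 GB, or leaving 2 task slots idle) or strand 2 unrequested cores on the node.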