Spark / SPARK-33446

[CORE] Add config spark.executor.coresOverhead


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Not A Problem
    • Affects Version/s: 3.0.1
    • Fix Version/s: None
    • Component/s: Spark Core
    • Labels: None

    Description

      Add a config spark.executor.coresOverhead to request extra cores per executor. This config would be helpful in the following case:

      Suppose the physical machines or VMs have a memory/CPU ratio of 3 GB per core, but the Spark job needs 6 GB per task. Requesting resources at that ratio wastes cores: every 6 GB of executor memory ties up two cores' worth of machine memory while only one core actually runs a task.
      If we could request extra cores per executor without increasing the number of cores used for task allocation, those extra cores would be accounted for rather than wasted.
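      The waste described above is simple arithmetic; a minimal sketch follows. Note that spark.executor.coresOverhead is only the config proposed by this issue (the issue was resolved "Not A Problem", so it does not exist in Spark), and the numbers below are the example values from the description:

      ```python
      # Machine memory/CPU ratio: 3 GB per core; the job wants 6 GB per task.
      mem_per_core_gb = 3
      mem_per_task_gb = 6
      task_cores = 4  # cores used for task allocation (spark.executor.cores)

      # Memory the executor must request to give each task 6 GB.
      executor_mem_gb = task_cores * mem_per_task_gb        # 24 GB

      # On 3 GB/core machines, 24 GB of memory occupies 8 cores' worth
      # of a node, but only 4 cores run tasks.
      cores_charged = executor_mem_gb // mem_per_core_gb    # 8
      wasted_cores = cores_charged - task_cores             # 4

      # The proposal: setting the (hypothetical) spark.executor.coresOverhead
      # to wasted_cores would let the executor request all 8 cores while
      # still scheduling tasks on only 4, so the extra cores are claimed
      # deliberately instead of being stranded on the node.
      print(wasted_cores)  # 4
      ```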


          People

            Assignee: Unassigned
            Reporter: Zhongwei Zhu (warrenzhu25)
            Votes: 0
            Watchers: 2

            Dates

              Created:
              Updated:
              Resolved: