SPARK-21225

Decrease the memory used by the variable 'tasks' in function resourceOffers

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 2.1.0, 2.1.1
    • Fix Version/s: 2.3.0
    • Component/s: Spark Core
    • Labels:
      None

      Description

      In the function 'resourceOffers', a variable 'tasks' is declared to store the tasks that have been allocated to an executor. It is declared like this:
      val tasks = shuffledOffers.map(o => new ArrayBuffer[TaskDescription](o.cores))

      But this code only considers the situation of one task per core. If the user configures "spark.task.cpus" as 2 or 3, the buffers really don't need that much space. It can be modified as follows:

      val tasks = shuffledOffers.map(o => new ArrayBuffer[TaskDescription](Math.ceil(o.cores * 1.0 / CPUS_PER_TASK).toInt))
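
      For illustration, here is a minimal, self-contained sketch of the effect of this change. WorkerOffer, TaskDescription, and the executor names are simplified stand-ins, not Spark's real classes, and CPUS_PER_TASK mimics the value of "spark.task.cpus":

      import scala.collection.mutable.ArrayBuffer

      // Simplified stand-ins for Spark's WorkerOffer and TaskDescription.
      case class WorkerOffer(executorId: String, cores: Int)
      case class TaskDescription(taskId: Long)

      object TaskBufferSizing {
        val CPUS_PER_TASK = 2 // assumes spark.task.cpus = 2

        def main(args: Array[String]): Unit = {
          val shuffledOffers = Seq(WorkerOffer("exec-1", 8), WorkerOffer("exec-2", 16))

          // Current code: reserves one buffer slot per core (8 and 16 slots here).
          val before = shuffledOffers.map(o => new ArrayBuffer[TaskDescription](o.cores))

          // Proposed code: reserves one slot per schedulable task (4 and 8 slots here),
          // since each task consumes CPUS_PER_TASK cores.
          val after = shuffledOffers.map(o =>
            new ArrayBuffer[TaskDescription](Math.ceil(o.cores * 1.0 / CPUS_PER_TASK).toInt))

          shuffledOffers.foreach { o =>
            val slots = Math.ceil(o.cores * 1.0 / CPUS_PER_TASK).toInt
            println(s"${o.executorId}: ${o.cores} cores -> $slots task slots instead of ${o.cores}")
          }
        }
      }

      With spark.task.cpus = 2, an offer of 8 cores pre-allocates 4 slots instead of 8, halving the initial capacity of each per-offer buffer.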

    People

    • Assignee: yzg37166 yangZhiguo
    • Reporter: yzg37166 yangZhiguo
    • Votes: 0
    • Watchers: 3

    Dates

    • Created:
    • Updated:
    • Resolved:

    Time Tracking

    • Estimated: 1h
    • Remaining: 1h
    • Logged: Not Specified