Spark / SPARK-21225

Decrease the memory used by the 'tasks' variable in the resourceOffers function


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 2.1.0, 2.1.1
    • Fix Version/s: 2.3.0
    • Component/s: Spark Core
    • Labels: None

    Description

      In the function resourceOffers, a variable 'tasks' is declared to store the tasks that have been allocated an executor. It is declared like this:
      val tasks = shuffledOffers.map(o => new ArrayBuffer[TaskDescription](o.cores))

      But this code only considers the case of one task per core. If the user configures "spark.task.cpus" to 2 or 3, nowhere near that much space is needed. It could be modified as follows:

      val tasks = shuffledOffers.map(o => new ArrayBuffer[TaskDescription](Math.ceil(o.cores * 1.0 / CPUS_PER_TASK).toInt))
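
      For illustration, here is a minimal, self-contained sketch of the before/after sizing. WorkerOffer, TaskDescription, and the CPUS_PER_TASK value are stand-ins assumed for this sketch, not the actual TaskSchedulerImpl code:

      import scala.collection.mutable.ArrayBuffer

      // Stand-ins for Spark's internal classes (shapes assumed for illustration).
      case class WorkerOffer(executorId: String, host: String, cores: Int)
      case class TaskDescription(taskId: Long)

      object TasksBufferSizing {
        val CPUS_PER_TASK = 2 // assumed value of spark.task.cpus

        def main(args: Array[String]): Unit = {
          val shuffledOffers = Seq(WorkerOffer("exec-1", "host-1", 8),
                                   WorkerOffer("exec-2", "host-2", 5))

          // Original sizing: one buffer slot per core (initial capacities 8 and 5).
          val before = shuffledOffers.map(o => new ArrayBuffer[TaskDescription](o.cores))

          // Proposed sizing: one slot per schedulable task,
          // ceil(cores / CPUS_PER_TASK), i.e. initial capacities 4 and 3.
          val after = shuffledOffers.map(o =>
            new ArrayBuffer[TaskDescription](Math.ceil(o.cores * 1.0 / CPUS_PER_TASK).toInt))

          shuffledOffers.foreach { o =>
            val slots = Math.ceil(o.cores * 1.0 / CPUS_PER_TASK).toInt
            println(s"${o.executorId}: $slots task slots for ${o.cores} cores")
          }
        }
      }

      Since at most ceil(cores / CPUS_PER_TASK) tasks can be scheduled per offer, the smaller initial capacity never forces the ArrayBuffer to grow; the change only trims the initial allocation.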

          People

            Assignee: yangZhiguo (yzg37166)
            Reporter: yangZhiguo (yzg37166)
            Votes: 0
            Watchers: 3

            Dates

              Created:
              Updated:
              Resolved:

              Time Tracking

                Original Estimate: 1h
                Remaining Estimate: 1h
                Time Spent: Not Specified