
SPARK-8366: maxNumExecutorsNeeded should properly handle failed tasks


Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.4.0
    • Fix Version/s: 1.5.0
    • Component/s: Spark Core
    • Labels: None

    Description

      I use the dynamic executor allocation feature.
      When an executor is killed, all tasks running on it fail. As long as maxTaskFailures has not been reached, each failed task is resubmitted and re-run with a new task id.
      But the ExecutorAllocationManager does not count these resubmitted tasks toward its total and pending task numbers, because the total number of tasks for a stage is only set when the stage is submitted. As a result, maxNumExecutorsNeeded underestimates the number of executors needed; a sketch of the idea follows below.
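
      To illustrate the report, here is a minimal, self-contained Scala sketch. It is not the actual ExecutorAllocationManager code; the object name, counter names, and callbacks are hypothetical stand-ins for the listener state. It shows the behavior the fix needs: a task that fails and will be resubmitted must go back into the pending count so maxNumExecutorsNeeded does not underestimate the required executors.

      // Minimal sketch (not the actual Spark source) of the idea behind the fix.
      object MaxExecutorsSketch {
        // Hypothetical counters mirroring the allocation manager's listener state.
        private var numRunningTasks = 0
        private var numPendingTasks = 0
        private val tasksPerExecutor = 2 // e.g. spark.executor.cores / spark.task.cpus

        // Called when a stage is submitted: all of its tasks start out pending.
        def onStageSubmitted(numTasks: Int): Unit = numPendingTasks += numTasks

        // Called when a pending task starts running on an executor.
        def onTaskStart(): Unit = { numPendingTasks -= 1; numRunningTasks += 1 }

        // Called when a task fails (e.g. its executor was killed). The scheduler
        // will re-run it, so it is counted as pending again.
        def onTaskFailed(): Unit = { numRunningTasks -= 1; numPendingTasks += 1 }

        // Executors needed to run every pending and running task concurrently.
        def maxNumExecutorsNeeded(): Int =
          (numPendingTasks + numRunningTasks + tasksPerExecutor - 1) / tasksPerExecutor

        def main(args: Array[String]): Unit = {
          onStageSubmitted(numTasks = 4)
          (1 to 4).foreach(_ => onTaskStart())
          println(maxNumExecutorsNeeded()) // 2: four running tasks, two per executor
          onTaskFailed()                   // an executor is killed, its task fails
          println(maxNumExecutorsNeeded()) // still 2: the re-run task is counted as pending
        }
      }

      Without the correction in onTaskFailed (i.e. if the failed task were simply dropped from the running count), the same scenario would report only 2 running/pending tasks and could release an executor that is still needed for the re-run.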


          People

            Assignee: meiyoula
            Reporter: meiyoula
            Votes: 0
            Watchers: 2
