Spark / SPARK-23948

Trigger mapstage's job listener in submitMissingTasks


    Details

    • Type: New Feature
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.3.0
    • Fix Version/s: 2.3.1, 2.4.0
    • Component/s: Scheduler, Spark Core
    • Labels: None

      Description

      When SparkContext submits a map stage via "submitMapStage" to DAGScheduler, 
      "markMapStageJobAsFinished" is called in only two places (https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/scheduler/DAGScheduler.scala#L933 and https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/scheduler/DAGScheduler.scala#L1314).

      But consider the scenario below:
      1. stage0 and stage1 are both "ShuffleMapStage"s, and stage1 depends on stage0;
      2. We submit stage1 via "submitMapStage";
      3. While stage1 is running, a "FetchFailed" occurs; stage0 and stage1 are resubmitted as stage0_1 and stage1_1;
      4. While stage0_1 is running, speculated tasks from the old stage1 come in as succeeded, but stage1 is no longer in "runningStages". So even though all partitions of stage1 (including those finished by the speculated tasks) have succeeded, stage1's job listener is not called;
      5. stage0_1 finishes and stage1_1 starts running. In "submitMissingTasks" there are no missing tasks, but the current code still does not trigger the job listener.
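
      The scenario above suggests the fix direction named in the title: when "submitMissingTasks" finds no missing partitions for a map stage, it should notify the stage's map-stage job listeners instead of silently returning. Below is a minimal, self-contained sketch of that idea; the types (Stage, JobListener, MiniScheduler) are simplified stand-ins for DAGScheduler concepts, not Spark's actual classes.

      ```scala
      // Simplified stand-in for a shuffle map stage: partitions whose map
      // output is already available are recorded in finishedPartitions.
      case class Stage(id: Int, numPartitions: Int, finishedPartitions: Set[Int])

      trait JobListener { def jobSucceeded(): Unit }

      class MiniScheduler {
        private val listeners = scala.collection.mutable.Map.empty[Int, JobListener]

        def registerMapStageListener(stageId: Int, l: JobListener): Unit =
          listeners(stageId) = l

        // Sketch of submitMissingTasks: if every partition already has output
        // (e.g. speculated tasks from the old attempt succeeded while the
        // stage was not in runningStages), trigger the listener right here
        // rather than waiting for a task-completion event that never comes.
        def submitMissingTasks(stage: Stage): Unit = {
          val missing = (0 until stage.numPartitions).filterNot(stage.finishedPartitions)
          if (missing.isEmpty) {
            listeners.get(stage.id).foreach(_.jobSucceeded())  // the added trigger
          } else {
            // launch tasks for `missing` partitions (omitted in this sketch)
          }
        }
      }
      ```

      With this shape, resubmitting a fully-finished stage (step 5 above) fires the listener immediately, so the caller of "submitMapStage" is unblocked.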

            People

            • Assignee: Jin Xing (jinxing6042@126.com)
            • Reporter: Jin Xing (jinxing6042@126.com)
            • Votes: 0
            • Watchers: 4
