Spark / SPARK-10842

Eliminate creating duplicate stages when generating the job DAG


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Duplicate
    • Affects Version/s: 1.5.0
    • Fix Version/s: None
    • Component/s: Scheduler, Spark Core
    • Labels: None

    Description


      When traversing the RDD lineage to generate the stage DAG, Spark skips checking whether a stage has already been added to shuffleIdToStage in one case: when the shuffleDep is obtained from getAncestorShuffleDependency. As a result, a duplicate stage can be created for the same shuffle.
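
      A minimal, self-contained Scala sketch of the intended deduplication pattern is shown below. This is not Spark's actual DAGScheduler code; ShuffleDep, Stage, StageRegistry, getOrCreateShuffleMapStage and Demo are hypothetical names used only to illustrate the idea: every shuffle dependency, including those found through an ancestor lookup such as getAncestorShuffleDependency, should go through the same registry check so that a stage is created at most once per shuffle id.

      import scala.collection.mutable

      final case class ShuffleDep(shuffleId: Int)
      final case class Stage(id: Int, shuffleId: Int)

      object StageRegistry {
        private val shuffleIdToStage = mutable.HashMap.empty[Int, Stage]
        private var nextStageId = 0

        // Look up the shuffle id first; only create and register a new stage
        // if none exists yet, so repeated lookups for the same shuffle
        // (including via ancestor-dependency traversal) return the same stage.
        def getOrCreateShuffleMapStage(dep: ShuffleDep): Stage =
          shuffleIdToStage.getOrElseUpdate(dep.shuffleId, {
            val stage = Stage(nextStageId, dep.shuffleId)
            nextStageId += 1
            stage
          })
      }

      object Demo extends App {
        val dep = ShuffleDep(shuffleId = 7)
        val first  = StageRegistry.getOrCreateShuffleMapStage(dep)
        val second = StageRegistry.getOrCreateShuffleMapStage(dep)
        assert(first == second) // no duplicate stage for the same shuffle id
      }

      Combining the lookup and the creation in a single getOrElseUpdate call leaves no code path that can create a second stage for a shuffle id that is already registered, which is the behavior this issue asks for.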


            People

              Assignee: Unassigned
              Reporter: SuYan
              Votes: 0
              Watchers: 2

              Dates

                Created:
                Updated:
                Resolved: