Spark / SPARK-20091

DAGScheduler should allow running concurrent attempts of a stage in case of multiple fetch failures


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Incomplete
    • Affects Version/s: 2.0.1
    • Fix Version/s: None
    • Component/s: Scheduler, Spark Core

    Description

      Currently, the DAG scheduler does not allow running concurrent attempts of a stage in case of multiple fetch failures. As a result, when multiple fetch failures are detected, the serial re-execution of the map stage delays the job run significantly.
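To make the cost of serial re-execution concrete, the following is a minimal back-of-the-envelope sketch, not Spark code: the class name, the timing model, and the assumption that retries could fully overlap are all illustrative.

```java
// Toy model, not Spark internals: compares the job delay when map-stage
// re-executions after fetch failures run serially vs. concurrently.
public class FetchFailureDelay {

    // Serial re-execution (current behavior described in this issue): each
    // detected fetch failure re-runs the whole map stage only after the
    // previous attempt has finished.
    static int serialDelay(int fetchFailures, int stageTimeUnits) {
        return (fetchFailures + 1) * stageTimeUnits;
    }

    // Concurrent attempts (the requested improvement): retries for all
    // detected failures overlap, so in the ideal case the job pays for the
    // original attempt plus a single overlapping wave of retries.
    static int concurrentDelay(int fetchFailures, int stageTimeUnits) {
        return 2 * stageTimeUnits;
    }

    public static void main(String[] args) {
        int failures = 3;   // fetch failures detected for the stage
        int stageTime = 10; // arbitrary time units per stage attempt
        System.out.println("serial: " + serialDelay(failures, stageTime)
                + " units, concurrent: " + concurrentDelay(failures, stageTime)
                + " units");
    }
}
```

Under this model, three fetch failures cost 4x the stage time serially but only 2x with overlapping attempts; the gap grows linearly with the number of failures.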


    People

      Assignee: Unassigned
      Reporter: Sital Kedia (sitalkedia@gmail.com)
      Votes: 0
      Watchers: 5
