SPARK-8029

ShuffleMapTasks must be robust to concurrent attempts on the same executor


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Critical
    • Resolution: Fixed
    • Affects Version/s: 1.4.0
    • Fix Version/s: 1.5.3, 1.6.0
    • Component/s: Spark Core
    • Labels: None

    Description

      When stages are retried, a task may have more than one attempt running at the same time on the same executor. Currently this breaks ShuffleMapTasks, since all attempts try to write to the same shuffle output files.
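      The collision is easy to see: the shuffle output path is keyed by the shuffle and map IDs, with no attempt number, so two attempts of the same task resolve to the same file. A minimal sketch (the path layout here is illustrative, not Spark's exact internals):

        import java.io.File

        object ShuffleOutputCollision {
          // The output path depends only on the shuffle and map IDs; there is
          // no attempt number, so retried attempts resolve to the same file.
          def shuffleDataFile(localDir: File, shuffleId: Int, mapId: Int): File =
            new File(localDir, s"shuffle_${shuffleId}_${mapId}_0.data")

          def main(args: Array[String]): Unit = {
            val localDir = new File("/tmp/spark-local")
            val firstAttempt  = shuffleDataFile(localDir, 3, 7)
            val secondAttempt = shuffleDataFile(localDir, 3, 7) // stage retry
            assert(firstAttempt == secondAttempt) // both write to one path
          }
        }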

      This was finally resolved by https://github.com/apache/spark/pull/9610, which uses a first-writer-wins approach (sketched below).
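      In the spirit of that fix, each attempt writes to its own temporary file, and the commit step is serialized so that only the first attempt to commit moves its file into place; later attempts discard their files and reuse the committed output. A simplified sketch (not Spark's actual code, which also validates that the committed index and data files are consistent before trusting them):

        import java.io.File
        import java.nio.file.{Files, StandardCopyOption}

        object ShuffleOutputCommitter {
          private val commitLock = new Object

          // First writer wins: the first attempt to commit renames its temp
          // file to the final path; later attempts delete their temp files.
          def commit(tmp: File, dest: File): Unit = commitLock.synchronized {
            if (dest.exists()) {
              tmp.delete() // another attempt already committed; reuse its output
            } else {
              Files.move(tmp.toPath, dest.toPath, StandardCopyOption.ATOMIC_MOVE)
            }
          }
        }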


    People

    • Assignee: Davies Liu (davies)
    • Reporter: Imran Rashid (irashid)
    • Votes: 2
    • Watchers: 11
