Spark / SPARK-8029

ShuffleMapTasks must be robust to concurrent attempts on the same executor

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Critical
    • Resolution: Fixed
    • Affects Version/s: 1.4.0
    • Fix Version/s: 1.5.3, 1.6.0
    • Component/s: Spark Core
    • Labels: None

      Description

      When stages get retried, a task may have more than one attempt running at the same time, on the same executor. Currently this causes problems for ShuffleMapTasks, since all attempts try to write to the same output files.

      This was finally resolved by https://github.com/apache/spark/pull/9610, which takes a "first writer wins" approach: the first attempt to commit its map output wins, and later attempts discard theirs.
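
      For illustration, here is a minimal sketch of the "first writer wins" idea, assuming each attempt writes to its own temporary file and the commit step is serialized per output file. The names used here (FirstWriterWins, commit, shuffle_0_0_0.data) are hypothetical; this is not the code from the PR.

      {code:scala}
      import java.nio.charset.StandardCharsets
      import java.nio.file.{Files, Path, StandardCopyOption}

      object FirstWriterWins {
        // A single lock keeps the sketch simple; a real implementation would
        // serialize commits per output file rather than globally.
        private val commitLock = new Object

        /** Publish `tmp` under the canonical name unless another attempt already did.
         *  Returns true if this attempt's output won the race. */
        def commit(tmp: Path, dest: Path): Boolean = commitLock.synchronized {
          if (Files.exists(dest)) {
            // An earlier attempt already committed; keep its output, drop ours.
            Files.deleteIfExists(tmp)
            false
          } else {
            Files.move(tmp, dest, StandardCopyOption.ATOMIC_MOVE)
            true
          }
        }

        def main(args: Array[String]): Unit = {
          val dir  = Files.createTempDirectory("shuffle-demo")
          val dest = dir.resolve("shuffle_0_0_0.data")
          // Two attempts of the same task race to commit output for the same partition.
          val results = (1 to 2).map { attempt =>
            val tmp = Files.createTempFile(dir, s"attempt_$attempt-", ".tmp")
            Files.write(tmp, s"map output from attempt $attempt".getBytes(StandardCharsets.UTF_8))
            commit(tmp, dest)
          }
          println(s"commit results: $results")  // e.g. Vector(true, false)
          println(new String(Files.readAllBytes(dest), StandardCharsets.UTF_8))
        }
      }
      {code}

      Under this scheme a losing attempt can still finish successfully; it simply discards its own files, and reducers read the output that was committed first.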


    People

    • Assignee: Davies Liu (davies)
    • Reporter: Imran Rashid (irashid)
    • Votes: 2
    • Watchers: 11
