[SPARK-13343] speculative tasks that didn't commit shouldn't be marked as success


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.6.0
    • Fix Version/s: 2.4.0
    • Component/s: Spark Core
    • Labels: None

    Description

      Currently, speculative tasks that didn't commit can show up as SUCCESS (depending on the timing of the commit). This is confusing because such a task didn't really succeed, in the sense that it didn't write anything.

      I think these tasks should be marked as KILLED, or something else that makes it more obvious to the user exactly what happened. If a task happened to hit the timing where it got a commit-denied exception, it shows up as FAILED and counts against your task failures. It shouldn't count against task failures, since that failure really doesn't matter.

      MapReduce handles these situations, so perhaps we can look there for a model.
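
      As a rough sketch of the proposed behavior (the class and method names below are invented for illustration; they are not Spark's actual internals), the mapping from a task attempt's end reason to what the user sees could look like this in Scala:

      {code:scala}
      // Hypothetical sketch only -- not Spark's real classes. It illustrates the
      // behavior proposed above: a speculative attempt that is denied the commit
      // should surface as KILLED and should not count toward task failures.

      sealed trait TaskEndReason
      case object TaskSucceeded extends TaskEndReason
      final case class CommitDenied(partition: Int, attempt: Int) extends TaskEndReason
      final case class ExceptionFailure(message: String) extends TaskEndReason

      sealed trait UiTaskState
      case object SUCCESS extends UiTaskState
      case object FAILED extends UiTaskState
      case object KILLED extends UiTaskState

      object SpeculativeTaskOutcome {

        /** State shown to the user for a finished task attempt. */
        def uiState(reason: TaskEndReason): UiTaskState = reason match {
          case TaskSucceeded       => SUCCESS
          // Lost the commit race: it wrote nothing, so it isn't a SUCCESS,
          // but it isn't a real failure either.
          case _: CommitDenied     => KILLED
          case _: ExceptionFailure => FAILED
        }

        /** Only genuine failures should count toward the task-failure limit. */
        def countsTowardTaskFailures(reason: TaskEndReason): Boolean = reason match {
          case _: ExceptionFailure => true
          case _                   => false
        }
      }
      {code}

      With this mapping, uiState(CommitDenied(partition = 3, attempt = 1)) returns KILLED, and countsTowardTaskFailures returns false for the same reason, which is the behavior requested above.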

      Attachments

        1. image.png (44 kB, Hieu Tri Huynh)
        2. image.png (44 kB, Hieu Tri Huynh)
        3. Screen Shot 2018-07-08 at 3.49.52 PM.png (46 kB, Hieu Tri Huynh)


          People

            Assignee: Hieu Tri Huynh (hthuynh2)
            Reporter: Thomas Graves (tgraves)
            Votes: 0
            Watchers: 6
