Spark / SPARK-10781

Allow a certain number of failed tasks and allow the job to succeed


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Incomplete
    • Affects Version/s: 1.5.0
    • Fix Version/s: None
    • Component/s: Spark Core

    Description

      MapReduce has the configs mapreduce.map.failures.maxpercent and mapreduce.reduce.failures.maxpercent, which allow a certain percentage of map or reduce tasks to fail while the job as a whole still succeeds.
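
      For reference, a minimal sketch of how those settings are applied on the Hadoop side, using the property names above (the 5 and 1 thresholds are arbitrary example values):

          import org.apache.hadoop.conf.Configuration

          val conf = new Configuration()
          // Let up to 5% of map tasks and 1% of reduce tasks fail
          // without failing the MapReduce job as a whole.
          conf.setInt("mapreduce.map.failures.maxpercent", 5)
          conf.setInt("mapreduce.reduce.failures.maxpercent", 1)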

      This could be a useful feature in Spark as well, for jobs that do not need every task to succeed. A sketch of one possible shape for it follows.
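
      Purely as an illustration, the scheduler could consult a failure-percentage threshold before aborting a stage, rather than aborting on any task that exhausts its retries. Note this is a hypothetical sketch: neither the config key spark.stage.maxFailedTasksPercent nor the check below exists in Spark.

          // Hypothetical sketch only: this config key and this check
          // are not part of Spark.
          case class StageCounts(totalTasks: Int, failedTasks: Int)

          // Would be read from e.g. spark.stage.maxFailedTasksPercent (hypothetical).
          val maxFailedTasksPercent = 5

          def shouldAbortStage(counts: StageCounts): Boolean =
            // Abort the stage only once the failed fraction exceeds the
            // threshold, instead of on the first task that runs out of retries.
            counts.failedTasks * 100.0 / counts.totalTasks > maxFailedTasksPercent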

      Attachments

        1. SPARK_10781_Proposed_Solution.pdf
          255 kB
          Hieu Tri Huynh


          People

            Assignee: Unassigned
            Reporter: Thomas Graves (tgraves)
            Votes: 5
            Watchers: 9

            Dates

              Created:
              Updated:
              Resolved: