Spark / SPARK-10781

Allow certain number of failed tasks and allow job to succeed


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Incomplete
    • Affects Version/s: 1.5.0
    • Fix Version/s: None
    • Component/s: Spark Core

Description

MapReduce has the configs mapreduce.map.failures.maxpercent and mapreduce.reduce.failures.maxpercent, which allow a certain percentage of tasks to fail while the job still succeeds.

This could also be a useful feature in Spark for jobs that don't need every task to succeed, as sketched below.
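For comparison, a minimal Scala sketch of how the MapReduce knobs cited above are set on a Hadoop job, followed by what an analogous Spark setting might look like. The Spark property name spark.task.maxFailurePercent is purely hypothetical; this issue was resolved as Incomplete and no such setting exists.

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.mapreduce.Job
import org.apache.spark.SparkConf

// MapReduce: tolerate up to 5% failed map tasks and 5% failed reduce tasks
// without failing the whole job (property names as cited in the description).
val hadoopConf = new Configuration()
val job = Job.getInstance(hadoopConf)
job.getConfiguration.setInt("mapreduce.map.failures.maxpercent", 5)
job.getConfiguration.setInt("mapreduce.reduce.failures.maxpercent", 5)

// Hypothetical Spark equivalent (NOT an existing setting): a job-level
// threshold the scheduler would consult before aborting a stage whose
// tasks have exhausted their retries.
// val sparkConf = new SparkConf()
//   .set("spark.task.maxFailurePercent", "5") // hypothetical property name
```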



People

    Assignee: Unassigned
    Reporter: Thomas Graves (tgraves)
    Votes: 5
    Watchers: 9

