Spark / SPARK-6415

Spark Streaming fail-fast: Stop scheduling jobs when a batch fails, and kill the app


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Incomplete
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: DStreams

    Description

      Of course, this would have to be a configurable parameter, but such a fail-fast is useful; otherwise it is painful to figure out what is happening when there are cascading failures. In some cases the SparkContext shuts down while streaming keeps scheduling jobs.
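
      A minimal sketch of how an application could approximate this fail-fast behavior today, using the public StreamingListener hook rather than a built-in option. The FailFastListener name is hypothetical; the sketch assumes that treating any failed output operation in a completed batch as a fatal error is the desired policy, and it stops the context from a separate thread to avoid blocking the listener bus.

      {code:scala}
      import org.apache.spark.streaming.StreamingContext
      import org.apache.spark.streaming.scheduler.{StreamingListener, StreamingListenerBatchCompleted}

      // Hypothetical listener: stop the streaming app after the first failed batch.
      class FailFastListener(ssc: StreamingContext) extends StreamingListener {
        override def onBatchCompleted(batchCompleted: StreamingListenerBatchCompleted): Unit = {
          // A batch is considered failed if any of its output operations reported a failure.
          val batchFailed = batchCompleted.batchInfo.outputOperationInfos.values
            .exists(_.failureReason.isDefined)
          if (batchFailed) {
            // Stop in a separate thread: stopping the StreamingContext from inside a
            // listener callback can block the listener event thread.
            new Thread("fail-fast-stopper") {
              override def run(): Unit =
                ssc.stop(stopSparkContext = true, stopGracefully = false)
            }.start()
          }
        }
      }

      // Usage: register the listener before starting the context.
      // ssc.addStreamingListener(new FailFastListener(ssc))
      // ssc.start()
      {code}

      A built-in, configurable version of this would spare every application from wiring up such a listener and would also cover the case where the SparkContext has already shut down but batches keep getting scheduled.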


            People

              Assignee: Unassigned
              Reporter: Hari Shreedharan (hshreedharan)
              Votes: 2
              Watchers: 3

              Dates

                Created:
                Updated:
                Resolved: