SPARK-22655: Fail task instead of complete task silently in PythonRunner during shutdown


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.0.2, 2.1.0, 2.2.0
    • Fix Version/s: 2.3.0
    • Component/s: PySpark
    • Labels: None

Description

We have observed in our production environment that, during Spark shutdown, active tasks sometimes complete with incorrect results. We've tracked the issue down to PythonRunner, which returns a partial result instead of throwing an exception during Spark shutdown.

I think the better way to handle this is to have these tasks fail instead of completing with partial results (completing with a partial result is always bad, IMHO).
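The sketch below (Scala, since PythonRunner runs on the JVM side) illustrates the failure mode and the proposed fail-fast behavior. It is a minimal, hypothetical reconstruction, not the actual Spark patch: the `isStopped` flag and the `readBatch` callback stand in for SparkEnv's shutdown state and the socket reads from the Python worker.

{code:scala}
import java.io.EOFException
import scala.collection.mutable.ArrayBuffer

object ShutdownHandlingSketch {
  // Stand-in for SparkEnv's shutdown flag (hypothetical).
  @volatile var isStopped: Boolean = false

  // Failure mode: an EOF from the Python worker during shutdown is
  // swallowed, so the task "completes" with whatever was read so far.
  def readSilently(readBatch: () => Option[Array[Byte]]): Seq[Array[Byte]] = {
    val buf = ArrayBuffer[Array[Byte]]()
    try {
      Iterator.continually(readBatch()).takeWhile(_.isDefined).foreach(b => buf += b.get)
    } catch {
      case _: EOFException if isStopped =>
        // Treated as a normal end-of-stream: a partial result is returned
        // and the scheduler records the task as successful.
    }
    buf.toSeq
  }

  // Proposed behavior: the same condition fails the task instead, so a
  // partial result can never be mistaken for a complete one.
  def readOrFail(readBatch: () => Option[Array[Byte]]): Seq[Array[Byte]] = {
    val buf = ArrayBuffer[Array[Byte]]()
    try {
      Iterator.continually(readBatch()).takeWhile(_.isDefined).foreach(b => buf += b.get)
    } catch {
      case eof: EOFException if isStopped =>
        throw new RuntimeException(
          "Python worker stream ended during shutdown; failing task", eof)
    }
    buf.toSeq
  }
}
{code}

With the fail-fast variant, a task interrupted by shutdown fails and can be retried on a healthy executor rather than committing partial output; the actual change shipped in 2.3.0 may differ in detail.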

Attachments

Activity

People

    Assignee: Li Jin (icexelloss)
    Reporter: Li Jin (icexelloss)
    Votes: 0
    Watchers: 2

Dates

    Created:
    Updated:
    Resolved: