Spark / SPARK-30310

SparkUncaughtExceptionHandler halts running process unexpectedly


    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.4.0, 3.0.0
    • Fix Version/s: 2.4.5, 3.0.0
    • Component/s: Spark Core
    • Labels:
      None
    • Docs Text:
      Resolved SparkUncaughtExceptionHandler halting the process unexpectedly.

      Description

      During 2.4.x testing, we saw many occasions where the Worker process would die unexpectedly, with the Worker log ending with:


      ERROR SparkUncaughtExceptionHandler: scala.MatchError:  <...callstack...>


      We saw the same callstack during our 2.3.x testing, but there the Worker process stayed up.

      Comparing the 2.4.x SparkUncaughtExceptionHandler.scala with the 2.3.x version, we found that SPARK-24294 introduced the following change:

      exception match {
        case _: OutOfMemoryError =>
          System.exit(SparkExitCode.OOM)
        case e: SparkFatalException if e.throwable.isInstanceOf[OutOfMemoryError] =>
          // SPARK-24294: This is defensive code, in case that SparkFatalException is
          // misused and uncaught.
          System.exit(SparkExitCode.OOM)
        case _ if exitOnUncaughtException =>
          System.exit(SparkExitCode.UNCAUGHT_EXCEPTION)
      }


      This match expression has a _ if exitOnUncaughtException case but no unconditional _ case. As a result, when exitOnUncaughtException is false (as in the Master and Worker) and the exception matches none of the cases (e.g., an IllegalStateException), Scala throws MatchError(exception), a MatchError wrapping the original exception. The catch block further down then sees what looks like a second uncaught exception and halts the entire process with SparkExitCode.UNCAUGHT_EXCEPTION_TWICE.
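      The failure mode can be shown in isolation. The following is a minimal, self-contained sketch (hypothetical names, not Spark code): a match whose only wildcard case carries a guard is not exhaustive, so a value that matches no case throws scala.MatchError at runtime.

```scala
object MatchErrorDemo {
  // Mirrors the shape of the handler above: an unguarded OutOfMemoryError case
  // plus a guarded wildcard. Nothing matches when the guard is false.
  def classify(t: Throwable, exitOnUncaughtException: Boolean): String =
    t match {
      case _: OutOfMemoryError          => "oom"
      case _ if exitOnUncaughtException => "uncaught"
      // No plain `case _ =>` here, so any other throwable falls through
      // and the match itself throws scala.MatchError.
    }

  def main(args: Array[String]): Unit = {
    val result =
      try classify(new IllegalStateException("boom"), exitOnUncaughtException = false)
      catch { case _: MatchError => "scala.MatchError" }
    println(result) // prints "scala.MatchError"
  }
}
```

      The MatchError escapes the handler body, and that is exactly what the second-level catch block below then interprets as another uncaught exception.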


      catch {
        case oom: OutOfMemoryError => Runtime.getRuntime.halt(SparkExitCode.OOM)
        case t: Throwable => Runtime.getRuntime.halt(SparkExitCode.UNCAUGHT_EXCEPTION_TWICE)
      }


      Therefore, even when exitOnUncaughtException is false, the process will halt.
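      One straightforward way to avoid the MatchError (a sketch of a possible fix; it may not match the patch actually committed for this issue) is to add an unguarded wildcard case so the match is exhaustive even when exitOnUncaughtException is false:

```scala
exception match {
  case _: OutOfMemoryError =>
    System.exit(SparkExitCode.OOM)
  case e: SparkFatalException if e.throwable.isInstanceOf[OutOfMemoryError] =>
    System.exit(SparkExitCode.OOM)
  case _ if exitOnUncaughtException =>
    System.exit(SparkExitCode.UNCAUGHT_EXCEPTION)
  case _ =>
    // Exhaustive fallback: when exitOnUncaughtException is false
    // (Master and Worker), do nothing so the process stays up.
}
```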

            People

            • Assignee:
              tinhto-000 Tin Hang To
            • Reporter:
              tinhto-000 Tin Hang To
