SPARK-14454: Better exception handling while marking tasks as failed


    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.6.2, 2.0.0
    • Component/s: Spark Core
    • Labels: None

      Description

      Add support for better handling of exceptions inside catch blocks when the code within the block itself throws an exception. For instance, here is the code in a catch block in WriterContainer.scala before this change:

      logError("Aborting task.", cause)
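      // NOTE: if markTaskFailed(), currentWriter.close(), or abortTask() below
      // throws, that exception escapes this catch block and the original `cause` is lost.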
      // call failure callbacks first, so we could have a chance to cleanup the writer.
      TaskContext.get().asInstanceOf[TaskContextImpl].markTaskFailed(cause)
      if (currentWriter != null) {
        currentWriter.close()
      }
      abortTask()
      throw new SparkException("Task failed while writing rows.", cause)
      

      If markTaskFailed or currentWriter.close throws an exception, we currently lose the original cause. This PR fixes the problem by adding a utility function, Utils.tryWithSafeCatch, that suppresses (via Throwable.addSuppressed) any exceptions thrown within the catch block and rethrows the original exception.
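
      A minimal sketch of the idea in Scala (the object name, helper name, and signature here are illustrative assumptions; the actual Utils.tryWithSafeCatch in Spark may differ):

      object SafeCatchSketch {
        // Run `block`; if it fails, run `catchBlock` (failure callbacks / cleanup).
        // Any exception thrown by the cleanup is attached to the original cause via
        // addSuppressed instead of replacing it, and the original cause is rethrown.
        def tryWithSafeCatch[T](block: => T)(catchBlock: Throwable => Unit): T = {
          try {
            block
          } catch {
            case cause: Throwable =>
              try {
                catchBlock(cause)
              } catch {
                case t: Throwable => cause.addSuppressed(t)
              }
              throw cause
          }
        }
      }

      With a helper of this shape, the cleanup currently sitting in the WriterContainer catch block (markTaskFailed, currentWriter.close, abortTask) can be passed as catchBlock, so a failure during cleanup shows up as a suppressed exception on the original cause rather than masking it.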

            People

            • Assignee: sameerag (Sameer Agarwal)
            • Reporter: sameerag (Sameer Agarwal)
            • Votes: 0
            • Watchers: 3
