Spark / SPARK-22827

Avoid throwing OutOfMemoryError in case of exception in spill

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.2.0
    • Fix Version/s: 2.3.0
    • Component/s: Spark Core
    • Labels: None

    Description

    Currently, the task memory manager throws an OutOfMemoryError when an IO exception happens in spill() - https://github.com/apache/spark/blob/master/core/src/main/java/org/apache/spark/memory/TaskMemoryManager.java#L194. Similarly, there are many other places in the code where, if a task is not able to acquire memory due to an exception, we throw an OutOfMemoryError, which kills the entire executor and hence fails all the tasks running on that executor instead of just failing the single task.
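
    For illustration, here is a minimal Java sketch of the problematic pattern described above. It is not the actual Spark source; the class and interface names are made up. The point is that an IOException raised during spill() gets escalated to a java.lang.OutOfMemoryError, which the JVM treats as a fatal error for the whole executor process rather than a failure of the single task.

{code:java}
import java.io.IOException;

// Hypothetical stand-in for a spillable memory consumer.
interface SpillableConsumer {
  void spill(long size) throws IOException;
}

class TaskMemoryManagerSketch {
  long acquireExecutionMemory(long required, SpillableConsumer consumer) {
    try {
      // Try to free memory by asking a consumer to spill to disk.
      consumer.spill(required);
    } catch (IOException e) {
      // Problematic pattern: an IO failure in one task is rethrown as an
      // Error, which takes down the whole executor JVM and fails every
      // task running on it, not just the task that hit the exception.
      throw new OutOfMemoryError(
          "error while calling spill() on " + consumer + " : " + e.getMessage());
    }
    return required;
  }
}
{code}

    The fix that shipped in 2.3.0 introduced a dedicated org.apache.spark.memory.SparkOutOfMemoryError, thrown in place of the JVM-level error, so the executor fails only the offending task while the other tasks on that executor keep running.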

    People

    • Assignee:
      sitalkedia@gmail.com Sital Kedia
    • Reporter:
      sitalkedia@gmail.com Sital Kedia

    Dates

    • Created:
    • Updated:
    • Resolved:
