[SPARK-20904] Task failures during shutdown cause problems with preempted executors


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.6.0
    • Fix Version/s: 2.2.1, 2.3.0
    • Component/s: Spark Core, YARN
    • Labels: None

    Description

      In each executor, Spark runs tasks in a thread pool made of daemon threads. That means that when the JVM gets a signal to shut down, those tasks keep running.

      When YARN preempts an executor, it sends SIGTERM to the process, triggering JVM shutdown. The shutdown hooks then run concurrently with the still-running tasks, which may cause user code in those tasks to fail and report task failures to the driver. Those failures are then counted towards the maximum number of allowed task failures, even though in this case we don't want that, because the executor was preempted rather than failing on its own.
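      As a minimal, standalone sketch of the failure mode (not Spark code; the pool setup, thread names, and timings are illustrative), the following Scala program shows that a daemon worker thread keeps executing while a JVM shutdown hook runs, so anything the hook tears down can fail the worker mid-task:

      import java.util.concurrent.Executors

      object DaemonShutdownDemo {
        def main(args: Array[String]): Unit = {
          // A pool of daemon threads, like the executor's task thread pool.
          val pool = Executors.newFixedThreadPool(1, (r: Runnable) => {
            val t = new Thread(r, "task-thread")
            t.setDaemon(true)
            t
          })

          // A long-running "task" that keeps doing work.
          pool.submit(new Runnable {
            override def run(): Unit =
              while (true) {
                println("task: still running")
                Thread.sleep(200)
              }
          })

          // A shutdown hook, as triggered by SIGTERM. It runs concurrently
          // with the daemon task above; the JVM does not wait for daemon threads.
          Runtime.getRuntime.addShutdownHook(new Thread(() => {
            println("hook: shutting down while the task is still alive")
            Thread.sleep(1000) // the task keeps printing during this window
          }))

          Thread.sleep(500)
          sys.exit(0) // runs the shutdown hooks, same as receiving SIGTERM
        }
      }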

      So we need a better way to handle that situation.
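      One possible direction is sketched below (everything in it, including ExecutorShutdownGuard, the method names, and the reason types, is hypothetical and not Spark's actual API): the executor flips a flag from a shutdown hook, and task failures observed after that point are reported as "killed during shutdown" instead of as failures that count toward the limit.

      import java.util.concurrent.atomic.AtomicBoolean

      // Hypothetical executor-side guard; illustrative only.
      object ExecutorShutdownGuard {
        private val shuttingDown = new AtomicBoolean(false)

        // Installed once when the executor starts, before any task runs.
        def install(): Unit =
          Runtime.getRuntime.addShutdownHook(
            new Thread(() => shuttingDown.set(true)))

        sealed trait TaskEndReason
        case object TaskFailed extends TaskEndReason           // counts toward max failures
        case object TaskKilledByShutdown extends TaskEndReason // should not count

        // Classify an exception thrown by a task before reporting it to the driver.
        def classify(e: Throwable): TaskEndReason =
          if (shuttingDown.get()) TaskKilledByShutdown else TaskFailed
      }

      The driver would then treat TaskKilledByShutdown like any other non-counting kill (for example, a task killed because a speculative copy finished first) and re-queue the task without incrementing its failure count.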

          People

            Assignee: vanzin Marcelo Masiero Vanzin
            Reporter: vanzin Marcelo Masiero Vanzin
            Votes: 0
            Watchers: 4
