  Spark / SPARK-21503

Spark UI shows incorrect task status for a killed Executor Process


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 2.2.0
    • Fix Version/s: 2.2.1, 2.3.0
    • Component/s: Spark Core
    • Labels: None

    Description

      The Executors tab in the Spark UI shows a task as completed when the executor process running that task is killed with the kill command, even though the task did not actually complete successfully.

      Steps:
      1. Run a big Spark job so that tasks are still running when the executors are killed. As an example, I ran a pyspark job with the following command (a minimal sketch of the pi.py script it submits is included after these steps):
      $SPARK_HOME/bin/spark-submit --master yarn --deploy-mode cluster --queue default --num-executors 10 --driver-memory 2G --conf spark.pyspark.driver.python=./Python3/bin/python --conf spark.pyspark.python=./Python3/bin/python --archives hdfs:///user/USERNAME/Python3.zip#Python3 ~/pi.py

      2. Go to the UI to see which executors are running.

      3. SSH to each of the executor hosts and kill the Java process listening on the executor port shown in the UI, using either of the following commands:

      kill <pid> OR kill -9 <pid>
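
      For reference, the reporter's pi.py is not attached to this issue, so the script below is only a minimal sketch of a job long-running enough to have tasks in flight while executor processes are killed. It is modeled on Spark's bundled Monte Carlo pi example, and the default partition count is an assumption chosen just to keep the job busy.

      # pi.py -- minimal sketch (assumption), modeled on Spark's bundled pi example,
      # not the reporter's actual script.
      import sys
      from operator import add
      from random import random

      from pyspark.sql import SparkSession

      if __name__ == "__main__":
          spark = SparkSession.builder.appName("PythonPi").getOrCreate()

          # A large partition count keeps many tasks in flight while executors are killed.
          partitions = int(sys.argv[1]) if len(sys.argv) > 1 else 500
          n = 100000 * partitions

          def inside(_):
              x = random() * 2 - 1
              y = random() * 2 - 1
              return 1 if x ** 2 + y ** 2 <= 1 else 0

          count = spark.sparkContext.parallelize(range(1, n + 1), partitions) \
                       .map(inside) \
                       .reduce(add)
          print("Pi is roughly %f" % (4.0 * count / n))

          spark.stop()

      While such a job is running, the Executors tab lists each executor's address (host and port), which identifies the JVM process to locate and kill in step 3.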


          People

            Assignee: pgandhi Parth Gandhi
            Reporter: pgandhi Parth Gandhi
