
SPARK-31438: Support JobCleaned Status in SparkListener


    Details

    • Type: Improvement
    • Status: In Progress
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 3.1.0
    • Fix Version/s: None
    • Component/s: Spark Core
    • Labels: None

      Description

      In Spark, we need to run hooks after a job has been cleaned, such as cleaning up Hive external temporary paths. This has already been discussed in SPARK-31346 and GitHub Pull Request #28129.
      The JobEnd status is not suitable for this. JobEnd only signals that the job has finished, which happens as soon as all results have been generated. After that point, the scheduler leaves still-running tasks as zombie tasks and deletes abnormal tasks asynchronously.
      We therefore add a JobCleaned status so that users can run hooks only after all tasks of a job have been cleaned. The JobCleaned status can be derived from the TaskSetManagers, each of which corresponds to a stage: once all stages of the job have been cleaned, the job itself is cleaned.
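      A minimal sketch of how a listener might react to such an event, assuming the patch posts a SparkListenerJobCleaned event carrying the job id once every TaskSetManager of the job's stages has been cleaned. The event name, its fields, and the HiveTempPathCleaner class are illustrative assumptions, not the actual API added by the pull request.

      import org.apache.spark.scheduler.{SparkListener, SparkListenerEvent}

      // Hypothetical event that the proposal would post once all TaskSetManagers
      // of all stages belonging to a job have been cleaned up (names are
      // assumptions, not the API from the actual patch).
      case class SparkListenerJobCleaned(jobId: Int, cleanupTime: Long) extends SparkListenerEvent

      // A listener that removes Hive external temporary paths only after the job
      // is fully cleaned, rather than at JobEnd while zombie tasks may still run.
      class HiveTempPathCleaner(tempPaths: Map[Int, Seq[String]]) extends SparkListener {
        override def onOtherEvent(event: SparkListenerEvent): Unit = event match {
          case SparkListenerJobCleaned(jobId, _) =>
            tempPaths.getOrElse(jobId, Nil).foreach { path =>
              // A real implementation would delete the path via the Hadoop
              // FileSystem API; printing stands in for the side effect here.
              println(s"Job $jobId cleaned, removing temporary path $path")
            }
          case _ => // ignore all other listener events
        }
      }

      Registered through spark.extraListeners or SparkContext.addSparkListener, such a listener would fire only after the scheduler has finished cleaning every stage of the job, which is exactly the guarantee JobEnd cannot provide.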


              People

              • Assignee: Unassigned
              • Reporter: Jackey Lee
              • Votes: 0
              • Watchers: 3
