  Spark / SPARK-23307

Spark UI should sort jobs/stages by completion timestamp before cleaning them up


    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 2.3.0
    • Fix Version/s: 2.3.0
    • Component/s: Web UI
    • Labels: None
    • Target Version/s:

      Description

      When you have a long-running job, it may be deleted from the UI soon after it completes if you happen to run a small job after it. This is pretty annoying when you run lots of jobs concurrently in the same driver (e.g., running multiple Structured Streaming queries). We should sort jobs/stages by their completion timestamps before cleaning them up, as sketched below.

      In 2.2, Spark kept a separate buffer for completed jobs/stages, so it did not need to sort them.
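
      As a minimal sketch of the proposed cleanup (hypothetical types and names, not the actual Spark listener code): when the store exceeds the retained limit, evict the jobs that completed earliest instead of those with the lowest job id.

      // Hypothetical stand-in for the UI's per-job record.
      case class UIJob(jobId: Int, completionTime: Option[Long])

      def cleanupJobs(jobs: Seq[UIJob], retainedJobs: Int): Seq[UIJob] = {
        val toRemove = jobs.size - retainedJobs
        if (toRemove <= 0) {
          jobs
        } else {
          // Only completed jobs are eviction candidates; running jobs have
          // no completion time and are never evicted here.
          val (completed, _) = jobs.partition(_.completionTime.isDefined)
          val evicted = completed.sortBy(_.completionTime.get).take(toRemove)
          jobs.filterNot(evicted.contains)
        }
      }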

      The behavior I expect:

      Set "spark.ui.retainedJobs" to 10 and run the code below; job 0 should be kept in the Spark UI. Job 0 completes after the 20 short jobs, so trimming by completion timestamp should evict those short jobs first.
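
      For reference, a minimal way to pin that limit when building the context yourself (SparkConf and the spark.ui.retainedJobs key are standard Spark APIs; the local master and app name are just assumptions for this repro):

      import org.apache.spark.{SparkConf, SparkContext}

      // Retain at most 10 completed jobs in the UI before cleanup kicks in.
      val conf = new SparkConf()
        .setMaster("local[*]")  // assumption: local repro
        .setAppName("spark-ui-retained-jobs-repro")
        .set("spark.ui.retainedJobs", "10")
      val sc = new SparkContext(conf)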


      // Start job 0 in a background thread; its single task sleeps for
      // 10 seconds, so it finishes long after the short jobs below.
      new Thread() {
        override def run() {
          // job 0
          sc.makeRDD(1 to 1, 1).foreach { i =>
            Thread.sleep(10000)
          }
        }
      }.start()

      Thread.sleep(1000)

      // While job 0 is still running, launch 20 trivial jobs concurrently.
      for (_ <- 1 to 20) {
        new Thread() {
          override def run() {
            sc.makeRDD(1 to 1, 1).foreach { i =>
            }
          }
        }.start()
      }

      // After everything has finished, run one more job; its completion
      // triggers the cleanup that should evict the short jobs, not job 0.
      Thread.sleep(15000)
      sc.makeRDD(1 to 1, 1).foreach { i =>
      }
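
      To verify, one option besides eyeballing the Jobs page at http://localhost:4040 is Spark's monitoring REST API (the /api/v1 endpoints exist in Spark; the port and the by-hand app-id lookup below are assumptions for a default local run):

      import scala.io.Source

      // The driver UI serves the REST API on the same port as the web UI.
      val appsJson = Source.fromURL("http://localhost:4040/api/v1/applications").mkString
      println(appsJson)  // read the application id out of this JSON

      // With the app id filled in, this lists the retained jobs; after the
      // repro above, job 0 should still appear in the response.
      // Source.fromURL(s"http://localhost:4040/api/v1/applications/$appId/jobs")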
      
      

    Attachments

    Activity

    People

    • Assignee: zsxwing Shixiong Zhu
    • Reporter: zsxwing Shixiong Zhu
    • Votes: 0
    • Watchers: 5

    Dates

    • Created:
    • Updated:
    • Resolved: