[SPARK-698] Spark Standalone Mode is leaving a java process "spark.executor.StandaloneExecutorBackend" open on Windows


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 0.6.2
    • Fix Version/s: 0.8.0
    • Component/s: Deploy
    • Labels: None

    Description

      The java process running "spark.executor.StandaloneExecutorBackend" fails to exit after a task is finished.

      Under Mac OS X and Unix, a single shell script, "run", starts the Spark master, worker, and executor. Under Windows, there is a cascade: "run.cmd" calls "run2.cmd", which in turn calls java. So when spark.deploy.worker.ExecutorRunner (which runs in the worker process) tries to kill the executor process via process.destroy(), it actually kills only the "run.cmd" process, and the "run2.cmd" process (that is, the java process running the executor) stays alive.
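
      The leak is easy to reproduce outside of Spark. Below is a minimal sketch of the failure mode (illustrative only: the script name mirrors the Windows launcher described above, and the demo object is not part of Spark), showing that java.lang.Process.destroy() on Windows terminates only the immediate child process, not its descendants:

      {code}
      // Hypothetical demo, not Spark code. Spawns the Windows launcher
      // cascade and then tries to kill it the way ExecutorRunner does.
      object DestroyLeakDemo {
        def main(args: Array[String]): Unit = {
          // run.cmd spawns run2.cmd, which finally spawns the java process
          // running spark.executor.StandaloneExecutorBackend.
          val process = new ProcessBuilder("cmd", "/c", "run.cmd").start()

          Thread.sleep(5000)

          // destroy() signals only the immediate child: the cmd.exe that is
          // interpreting run.cmd. The cmd.exe running run2.cmd and the java
          // process it spawned are separate processes, so they keep running.
          // That surviving java process is the leaked executor backend.
          process.destroy()
        }
      }
      {code}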

      See this thread on spark-users for all details: https://groups.google.com/forum/#!topic/spark-users/NrdhVlrUDtU/discussion
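
      One possible workaround on Windows (a sketch under stated assumptions, not necessarily the fix that went into 0.8.0) is to kill the entire process tree via taskkill instead of relying on Process.destroy() alone. The helper below assumes Java 9+ for Process.pid():

      {code}
      // Hypothetical helper, not Spark code. /T makes taskkill terminate the
      // process and all of its children; /F forces termination.
      def destroyProcessTree(process: Process): Unit = {
        val isWindows =
          System.getProperty("os.name").toLowerCase.contains("windows")
        if (isWindows) {
          new ProcessBuilder("taskkill", "/F", "/T", "/PID", process.pid().toString)
            .start()
            .waitFor()
        } else {
          // On Mac OS X and Unix the single "run" script does not create the
          // extra cmd layer, so destroy() already reaches the executor.
          process.destroy()
        }
      }
      {code}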


    People

      Assignee: Christoph Grothaus (cgrothaus)
      Reporter: Christoph Grothaus (cgrothaus)
      Votes: 0
      Watchers: 2
