SPARK-34674

Spark app on k8s doesn't terminate without a call to the sparkContext.stop() method


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 3.1.1
    • Fix Version/s: 3.1.2, 3.2.0
    • Component/s: Kubernetes
    • Labels: None

    Description

      Hello!
      I have run into a problem: if I don't call sparkContext.stop() explicitly, the Spark driver process doesn't terminate even after its main method has completed. This behaviour differs from Spark on YARN, where stopping the SparkContext manually is not required.
      It looks like the problem is caused by non-daemon threads, which prevent the driver JVM process from terminating.
      If I don't call sparkContext.stop(), I see at least two such non-daemon threads:

      Thread[OkHttp kubernetes.default.svc,5,main]
      Thread[OkHttp kubernetes.default.svc Writer,5,main]
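
      The threads above can be listed with a small sketch like the following, run after the main application logic has finished (the object name is just for illustration):

      object NonDaemonThreads {
        // Print every live non-daemon thread; any such thread keeps the JVM
        // alive after main() returns.
        def main(args: Array[String]): Unit = {
          val threads = Thread.getAllStackTraces.keySet.iterator()
          while (threads.hasNext) {
            val t = threads.next()
            if (t.isAlive && !t.isDaemon) println(t)
          }
        }
      }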
      

      Could you please tell me whether it is possible to solve this problem?

      The Docker image from the official spark-3.1.1 hadoop3.2 release is used.
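
      For reference, a minimal sketch of an application that stops the context explicitly (the object name and the job inside the try block are illustrative):

      import org.apache.spark.sql.SparkSession

      object ExplicitStopApp {
        def main(args: Array[String]): Unit = {
          val spark = SparkSession.builder().appName("explicit-stop").getOrCreate()
          try {
            // Application logic goes here.
            spark.range(100).count()
          } finally {
            // Without this explicit stop, the non-daemon OkHttp threads listed
            // above keep the JVM alive and the driver pod never terminates.
            spark.stop()
          }
        }
      }

      Putting the stop() call in a finally block ensures the context is shut down even if the application logic throws.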


            People

              Assignee:
              Sergey Kotlov
              Reporter:
              Sergey Kotlov
              Votes:
              0
              Watchers:
              5

              Dates

                Created:
                Updated:
                Resolved: