Spark / SPARK-25922

[K8s] Spark Driver/Executor "spark-app-selector" label mismatch


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.4.0
    • Fix Version/s: 2.4.1, 3.0.0
    • Component/s: Kubernetes, Spark Core
    • Labels: None
    • Environment: Spark 2.4.0 RC4

    Description

      Hi,

      I have been testing Spark 2.4.0 RC4 on Kubernetes to run Python Spark applications and am running into an issue where the app ID labels on the driver and the executors do not match. I am using the spark-on-k8s-operator (https://github.com/GoogleCloudPlatform/spark-on-k8s-operator) to run these applications.

      I see a spark.app.id of the form spark-* as the "spark-app-selector" label on the driver, as well as in the Kubernetes ConfigMap which gets created for the driver via spark-submit. My guess is this is coming from https://github.com/apache/spark/blob/f6cc354d83c2c9a757f9b507aadd4dbdc5825cca/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/submit/KubernetesClientApplication.scala#L211
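
      My reading of that line, paraphrased as a sketch of the 2.4.0 behavior (not the exact source):

      // Paraphrase of KubernetesClientApplication.run() in Spark 2.4.0:
      // spark-submit generates a UUID-based ID, sets it as spark.app.id, and
      // stamps it on the driver pod as the "spark-app-selector" label.
      import java.util.UUID

      val kubernetesAppId = s"spark-${UUID.randomUUID().toString.replaceAll("-", "")}"
      // e.g. "spark-b78bb10feebf4e2d98c11d7b6320e18f", matching the driver label below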

      But when the driver actually comes up and brings up executors etc., I see that the "spark-app-selector" label on the executors, as well as the spark.app.id config within the user code on the driver, is something of the form spark-application-* (probably from https://github.com/apache/spark/blob/b19a28dea098c7d6188f8540429c50f42952d678/core/src/main/scala/org/apache/spark/SparkContext.scala#L511 and https://github.com/apache/spark/blob/bfb74394a5513134ea1da9fcf4a1783b77dd64e4/core/src/main/scala/org/apache/spark/scheduler/SchedulerBackend.scala#L26).
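
      Again paraphrasing rather than quoting the source, the SchedulerBackend default that appears to win here looks roughly like this:

      // Paraphrase of the SchedulerBackend trait's default in Spark 2.4.0.
      // SparkContext adopts this value via _taskScheduler.applicationId(),
      // and since the Kubernetes scheduler backend does not override
      // applicationId(), executor pods get labeled with it.
      trait SchedulerBackendLike {
        private val appId = "spark-application-" + System.currentTimeMillis
        // e.g. "spark-application-1541121829445", matching the executor label below
        def applicationId(): String = appId
      }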

      We were consuming this "spark-app-selector" label on the driver pod to get the app ID and using it to look up the app in the Spark History Server (among other use cases), but due to this mismatch that logic no longer works. This was working fine in the Spark 2.2 fork for Kubernetes which I was using earlier. Is this expected behavior, and if yes, what's the correct way to fetch the applicationId from outside the application?
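
      For concreteness, here is a minimal sketch of the kind of external lookup that breaks, assuming the fabric8 kubernetes-client; the pod name and namespace come from the example run below, and the History Server host is a placeholder:

      // Minimal sketch of the broken lookup, assuming io.fabric8:kubernetes-client.
      import io.fabric8.kubernetes.client.DefaultKubernetesClient

      val client = new DefaultKubernetesClient()
      try {
        // Read the app ID off the driver pod's "spark-app-selector" label...
        val driverPod = client.pods().inNamespace("default").withName("pyfiles-driver").get()
        val appId = driverPod.getMetadata.getLabels.get("spark-app-selector")
        // ...and query the History Server REST API with it. On 2.4.0 this 404s,
        // because the History Server knows the app under the spark-application-*
        // ID that the SparkContext generated instead.
        println(s"http://history-server:18080/api/v1/applications/$appId")
      } finally {
        client.close()
      }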

      Let me know if I can provide any more details or if I am doing something wrong. Here is an example run with different spark-app-selector labels on the driver and executor:

       

      Name: pyfiles-driver
      Namespace: default
      Priority: 0
      PriorityClassName: <none>
      Start Time: Thu, 01 Nov 2018 18:19:46 -0700
      Labels: spark-app-selector=spark-b78bb10feebf4e2d98c11d7b6320e18f
       spark-role=driver
       sparkoperator.k8s.io/app-name=pyfiles
       sparkoperator.k8s.io/launched-by-spark-operator=true
       version=2.4.0
      Status: Running
      
      
      
      Name: pyfiles-1541121585642-exec-1
      Namespace: default
      Priority: 0
      PriorityClassName: <none>
      Start Time: Thu, 01 Nov 2018 18:24:02 -0700
      Labels: spark-app-selector=spark-application-1541121829445
       spark-exec-id=1
       spark-role=executor
       sparkoperator.k8s.io/app-name=pyfiles
       sparkoperator.k8s.io/launched-by-spark-operator=true
       version=2.4.0
      Status: Pending
      



          People

            Assignee: Wang, Xinglong (suxingfate)
            Reporter: Anmol Khurana (akhurana)
            Votes: 0
            Watchers: 5
