  Spark / SPARK-40814

Exception in thread "main" java.lang.NoClassDefFoundError: io/fabric8/kubernetes/client/KubernetesClient


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Not A Problem
    • Affects Version/s: 2.4.0, 3.1.2
    • Fix Version/s: None
    • Component/s: Deploy, Spark Submit
    • Labels: None
    • Environment:

      k8s version: v1.18.9

      spark version: v2.4.0

      kubernetes-client: v6.1.1

    Description

      After I changed the user in the Spark image, the running program reports an error. What is the problem?

      ++ id -u
      + myuid=2023
      ++ id -g
      + mygid=2023
      + set +e
      ++ getent passwd 2023
      + uidentry=zndw:x:2023:2023::/home/zndw:/bin/sh
      + set -e
      + '[' -z zndw:x:2023:2023::/home/zndw:/bin/sh ']'
      + SPARK_K8S_CMD=driver
      + case "$SPARK_K8S_CMD" in
      + shift 1
      + SPARK_CLASSPATH=':/opt/spark/jars/*'
      + env
      + grep SPARK_JAVA_OPT_
      + sort -t_ -k4 -n
      + sed 's/[^=]*=\(.*\)/\1/g'
      + readarray -t SPARK_EXECUTOR_JAVA_OPTS
      + '[' -n '' ']'
      + '[' -n '' ']'
      + PYSPARK_ARGS=
      + '[' -n '' ']'
      + R_ARGS=
      + '[' -n '' ']'
      + '[' '' == 2 ']'
      + '[' '' == 3 ']'
      + case "$SPARK_K8S_CMD" in
      + CMD=("$SPARK_HOME/bin/spark-submit" --conf "spark.driver.bindAddress=$SPARK_DRIVER_BIND_ADDRESS" --deploy-mode client "$@")
      + exec /sbin/tini -s -- /opt/spark/bin/spark-submit --conf spark.driver.bindAddress=10.1.1.11 --deploy-mode client --properties-file /opt/spark/conf/spark.properties --class com.frontier.pueedas.computer.batchTool.etl.EtlScheduler 'http://26.47.128.120:18000/spark/spark/raw/master/computer-batch-etl-hadoop-basic.jar?inline=false' configMode=HDFS metaMode=HDFS platformConfigMode=NACOS storeConfigMode=NACOS startDate=2022-08-02 endDate=2022-08-03 _file=/user/config/YC2/TEST/config/computer/business/opc/EMpHpReadCurveHourData.xml runMode=TEST
      2022-10-14 06:52:21 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
      2022-10-14 06:52:29 INFO  SparkContext:54 - Running Spark version 2.4.0
      2022-10-14 06:52:29 INFO  SparkContext:54 - Submitted application: [TEST]ETL[2022-08-02 00:00:00,2022-08-03 00:00:00]{/user/config/YC2/TEST/config/computer/business/opc/EMpHpReadCurveHourData.xml}
      2022-10-14 06:52:29 INFO  SecurityManager:54 - Changing view acls to: zndw,root
      2022-10-14 06:52:29 INFO  SecurityManager:54 - Changing modify acls to: zndw,root
      2022-10-14 06:52:29 INFO  SecurityManager:54 - Changing view acls groups to: 
      2022-10-14 06:52:29 INFO  SecurityManager:54 - Changing modify acls groups to: 
      2022-10-14 06:52:29 INFO  SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(zndw, root); groups with view permissions: Set(); users  with modify permissions: Set(zndw, root); groups with modify permissions: Set()
      2022-10-14 06:52:29 INFO  Utils:54 - Successfully started service 'sparkDriver' on port 7078.
      2022-10-14 06:52:29 INFO  SparkEnv:54 - Registering MapOutputTracker
      2022-10-14 06:52:29 INFO  SparkEnv:54 - Registering BlockManagerMaster
      2022-10-14 06:52:29 INFO  BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
      2022-10-14 06:52:29 INFO  BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
      2022-10-14 06:52:29 INFO  DiskBlockManager:54 - Created local directory at /var/data/spark-9a270950-7527-4d08-a7bd-d6c1062e8522/blockmgr-79ab0f0d-6f9e-401e-aa90-91baa00a3ff3
      2022-10-14 06:52:29 INFO  MemoryStore:54 - MemoryStore started with capacity 912.3 MB
      2022-10-14 06:52:29 INFO  SparkEnv:54 - Registering OutputCommitCoordinator
      2022-10-14 06:52:30 INFO  log:192 - Logging initialized @9926ms
      2022-10-14 06:52:30 INFO  Server:351 - jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
      2022-10-14 06:52:30 INFO  Server:419 - Started @10035ms
      2022-10-14 06:52:30 INFO  AbstractConnector:278 - Started ServerConnector@66f0548d{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
      2022-10-14 06:52:30 INFO  Utils:54 - Successfully started service 'SparkUI' on port 4040.
      2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@59ed3e6c{/jobs,null,AVAILABLE,@Spark}
      2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@70c53dbe{/jobs/json,null,AVAILABLE,@Spark}
      2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1894e40d{/jobs/job,null,AVAILABLE,@Spark}
      2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7342e05d{/jobs/job/json,null,AVAILABLE,@Spark}
      2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2a331b46{/stages,null,AVAILABLE,@Spark}
      2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@15383681{/stages/json,null,AVAILABLE,@Spark}
      2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@743e66f7{/stages/stage,null,AVAILABLE,@Spark}
      2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@761956ac{/stages/stage/json,null,AVAILABLE,@Spark}
      2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@71978f46{/stages/pool,null,AVAILABLE,@Spark}
      2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@304d0259{/stages/pool/json,null,AVAILABLE,@Spark}
      2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1d23ff23{/storage,null,AVAILABLE,@Spark}
      2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2133661d{/storage/json,null,AVAILABLE,@Spark}
      2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6c9320c2{/storage/rdd,null,AVAILABLE,@Spark}
      2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3414a8c3{/storage/rdd/json,null,AVAILABLE,@Spark}
      2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@36cc9385{/environment,null,AVAILABLE,@Spark}
      2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@cf518cf{/environment/json,null,AVAILABLE,@Spark}
      2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7915bca3{/executors,null,AVAILABLE,@Spark}
      2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@68d651f2{/executors/json,null,AVAILABLE,@Spark}
      2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3ad4a7d6{/executors/threadDump,null,AVAILABLE,@Spark}
      2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1e43e323{/executors/threadDump/json,null,AVAILABLE,@Spark}
      2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4a67b4ec{/static,null,AVAILABLE,@Spark}
      2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2def7a7a{/,null,AVAILABLE,@Spark}
      2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@24e83d19{/api,null,AVAILABLE,@Spark}
      2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2b03d52f{/jobs/job/kill,null,AVAILABLE,@Spark}
      2022-10-14 06:52:30 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@37e0292a{/stages/stage/kill,null,AVAILABLE,@Spark}
      2022-10-14 06:52:30 INFO  SparkUI:54 - Bound SparkUI to 0.0.0.0, and started at http://computer-batch-etl-hadoop-basic-5d159383d5437799-driver-svc.spark.svc:4040
      2022-10-14 06:52:30 INFO  SparkContext:54 - Added JAR local:///temp/spark_files/computer-package-batch-frame-hadoop_2.11-basic-jar-with-dependencies.jar at file:/temp/spark_files/computer-package-batch-frame-hadoop_2.11-basic-jar-with-dependencies.jar with timestamp 1665730350295
      2022-10-14 06:52:30 INFO  SparkContext:54 - Added JAR local:///temp/spark_files/computer-spark-cdh-common_2.11-2.0.0-jar-with-dependencies.jar at file:/temp/spark_files/computer-spark-cdh-common_2.11-2.0.0-jar-with-dependencies.jar with timestamp 1665730350296
      2022-10-14 06:52:30 INFO  SparkContext:54 - Added JAR local:///temp/spark_files/computer-spark-cdh-frame_2.11-2.0.0-jar-with-dependencies.jar at file:/temp/spark_files/computer-spark-cdh-frame_2.11-2.0.0-jar-with-dependencies.jar with timestamp 1665730350296
      2022-10-14 06:52:30 INFO  SparkContext:54 - Added JAR local:///temp/spark_files/computer-batch-hp-curve-hadoop-2.0.0.jar at file:/temp/spark_files/computer-batch-hp-curve-hadoop-2.0.0.jar with timestamp 1665730350297
      2022-10-14 06:52:30 INFO  SparkContext:54 - Added JAR http://26.47.128.120:18000/spark/spark/raw/master/computer-batch-etl-hadoop-basic.jar?inline=false at http://26.47.128.120:18000/spark/spark/raw/master/computer-batch-etl-hadoop-basic.jar?inline=false with timestamp 1665730350297
      2022-10-14 06:52:30 WARN  SparkContext:66 - The jar http://26.47.128.120:18000/spark/spark/raw/master/computer-batch-etl-hadoop-basic.jar?inline=false has been added already. Overwriting of added jars is not supported in the current version.
      Exception in thread "main" java.lang.NoClassDefFoundError: io/fabric8/kubernetes/client/KubernetesClient
          at org.apache.spark.scheduler.cluster.k8s.KubernetesClusterManager.createSchedulerBackend(KubernetesClusterManager.scala:64)
          at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2788)
          at org.apache.spark.SparkContext.<init>(SparkContext.scala:493)
          at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
          at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
          at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:926)
          at scala.Option.getOrElse(Option.scala:121)
          at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
          at com.frontier.pueedas.computer.common.util.SparkSessionUtil$.getSparkSession(SparkSessionUtil.scala:41)
          at com.frontier.pueedas.computer.batchTool.etl.EtlComputer.getSparkSession(EtlComputer.scala:346)
          at com.frontier.pueedas.computer.batchTool.etl.EtlComputer.start(EtlComputer.scala:230)
          at com.frontier.pueedas.computer.batchTool.etl.EtlScheduler$.main(EtlScheduler.scala:30)
          at com.frontier.pueedas.computer.batchTool.etl.EtlScheduler.main(EtlScheduler.scala)
          at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
          at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
          at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
          at java.lang.reflect.Method.invoke(Method.java:498)
          at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
          at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
          at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
          at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
          at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
          at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
          at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
          at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
      Caused by: java.lang.ClassNotFoundException: io.fabric8.kubernetes.client.KubernetesClient
          at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
          at java.lang.ClassLoader.loadClass(ClassLoader.java:419)
          at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
          at java.lang.ClassLoader.loadClass(ClassLoader.java:352)
          ... 25 more
      2022-10-14 06:52:30 INFO  DiskBlockManager:54 - Shutdown hook called
      2022-10-14 06:52:30 INFO  ShutdownHookManager:54 - Shutdown hook called
      2022-10-14 06:52:30 INFO  ShutdownHookManager:54 - Deleting directory /tmp/spark-96c8562a-aa98-4f33-b73a-a09ec99705ac
      2022-10-14 06:52:30 INFO  ShutdownHookManager:54 - Deleting directory /var/data/spark-9a270950-7527-4d08-a7bd-d6c1062e8522/spark-828cbff3-6b49-48f0-ada9-854c9891fc86/userFiles-5d6483c6-a1e2-42a0-8aaf-3699aaeba8ff
      2022-10-14 06:52:30 INFO  ShutdownHookManager:54 - Deleting directory /var/data/spark-9a270950-7527-4d08-a7bd-d6c1062e8522/spark-828cbff3-6b49-48f0-ada9-854c9891fc86
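      The NoClassDefFoundError above means the driver JVM could not find the fabric8 kubernetes-client classes. In the stock Spark image those ship as a jar under /opt/spark/jars, which the entrypoint exposes via SPARK_CLASSPATH=':/opt/spark/jars/*' (visible in the trace), so a changed USER in the Dockerfile can break this if the jars directory ends up missing or unreadable for the new user. A minimal diagnostic sketch, assuming the standard image layout (/opt/spark/jars and the kubernetes-client-*.jar name pattern are assumptions, not from this report):

      ```shell
      #!/bin/sh
      # Sketch: check that a fabric8 kubernetes-client jar exists in a Spark jars
      # directory and is readable by the current user. The default path is an
      # assumption based on the stock Spark image layout.
      check_k8s_client_jar() {
        dir="${1:-/opt/spark/jars}"
        # Expand the glob; if nothing matches, "$1" keeps the literal pattern.
        set -- "$dir"/kubernetes-client-*.jar
        if [ ! -e "$1" ]; then
          echo "missing kubernetes-client jar in $dir" >&2
          return 1
        fi
        for jar in "$@"; do
          if [ ! -r "$jar" ]; then
            echo "jar not readable by $(id -un): $jar" >&2
            return 1
          fi
        done
        echo "kubernetes-client jars OK in $dir"
      }
      ```

      Running a check like this inside the customized image, as the new non-root user, would show whether the USER change left the jars missing or unreadable; if so, the fix belongs in the Dockerfile (COPY/chown/chmod), which is consistent with the issue being resolved as Not A Problem rather than a Spark bug.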

      Attachments

        1. Dockerfile (3 kB, jiangjian)
        2. Dockerfile-1 (3 kB, jiangjian)
        3. Dockerfile-2 (3 kB, jiangjian)
        4. spark-error.log (11 kB, jiangjian)


      People

        Assignee: Unassigned
        Reporter: jiangjian
        Votes: 0
        Watchers: 1
