ZEPPELIN-1735: Class not found exception when executing an RDD transform function

Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Cannot Reproduce
    • Affects Version/s: 0.7.0
    • Fix Version/s: 0.8.0
    • Component/s: Interpreters
    • Environment: Red Hat Enterprise Linux Server release 7.2
      Java HotSpot(TM) 64-Bit Server VM (build 24.79-b02, mixed mode)
      CDH-5.7.1
      Spark 1.6.0

    Description

      When executing this script:
      %spark
      val a = sc.parallelize(1 to 9, 3)
      val b = a.map(_*3)
      b.collect().foreach(println)

      Error Message:
      org.apache.spark.SparkException: Job aborted due to stage failure: Task 2 in stage 6.0 failed 4 times, most recent failure: Lost task 2.3 in stage 6.0 (TID 25, XXX): java.lang.ClassNotFoundException: $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$anonfun$1
      at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
      at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
      at java.security.AccessController.doPrivileged(Native Method)
      at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
      at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
      at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
      at java.lang.Class.forName0(Native Method)
      at java.lang.Class.forName(Class.java:274)
      at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:68)
      at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
      at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
      at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
      at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
      at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
      at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
      at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
      at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
      at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
      at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
      at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
      at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
      at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
      at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
      at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
      at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
      at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
      at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
      at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
      at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
      at org.apache.spark.scheduler.Task.run(Task.scala:89)
      at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
      at java.lang.Thread.run(Thread.java:745)
      Driver stacktrace:
      at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1431)
      at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1419)
      at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1418)
      at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
      at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
      at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1418)
      at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
      at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
      at scala.Option.foreach(Option.scala:236)
      at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:799)
      at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1640)
      at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1599)
      at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1588)
      at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
      at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:620)
      at org.apache.spark.SparkContext.runJob(SparkContext.scala:1843)
      at org.apache.spark.SparkContext.runJob(SparkContext.scala:1856)
      at org.apache.spark.SparkContext.runJob(SparkContext.scala:1869)
      at org.apache.spark.SparkContext.runJob(SparkContext.scala:1940)
      at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:927)
      at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
      at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
      at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
      at org.apache.spark.rdd.RDD.collect(RDD.scala:926)
      at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:34)
      at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:39)
      at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:41)
      at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:43)
      at $iwC$$iwC$$iwC$$iwC.<init>(<console>:45)
      at $iwC$$iwC$$iwC.<init>(<console>:47)
      at $iwC$$iwC.<init>(<console>:49)
      at $iwC.<init>(<console>:51)
      at <init>(<console>:53)
      at .<init>(<console>:57)
      at .<clinit>(<console>)
      at .<init>(<console>:7)
      at .<clinit>(<console>)
      at $print(<console>)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:606)
      at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1045)
      at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1326)
      at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:821)
      at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:852)
      at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:800)
      at sun.reflect.GeneratedMethodAccessor51.invoke(Unknown Source)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:606)
      at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:38)
      at org.apache.zeppelin.spark.SparkInterpreter.interpret(SparkInterpreter.java:891)
      at org.apache.zeppelin.spark.SparkInterpreter.interpretInput(SparkInterpreter.java:1104)
      at org.apache.zeppelin.spark.SparkInterpreter.interpret(SparkInterpreter.java:1050)
      at org.apache.zeppelin.spark.SparkInterpreter.interpret(SparkInterpreter.java:1042)
      at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:94)
      at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:383)
      at org.apache.zeppelin.scheduler.Job.run(Job.java:176)
      at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
      at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
      at java.util.concurrent.FutureTask.run(FutureTask.java:262)
      at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
      at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
      at java.lang.Thread.run(Thread.java:745)
      Caused by: java.lang.ClassNotFoundException: $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$anonfun$1
      at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
      at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
      at java.security.AccessController.doPrivileged(Native Method)
      at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
      at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
      at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
      at java.lang.Class.forName0(Native Method)
      at java.lang.Class.forName(Class.java:274)
      at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:68)
      at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
      at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
      at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
      at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
      at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
      at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
      at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
      at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
      at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
      at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
      at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
      at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
      at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
      at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
      at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
      at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
      at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
      at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
      at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
      at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
      at org.apache.spark.scheduler.Task.run(Task.scala:89)
      at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
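
      The failing class, $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$anonfun$1, is the anonymous function from map(_*3) as compiled by the Spark REPL. Executors have to fetch such REPL-generated classes from the driver before they can deserialize the task; when that fails, the job dies with exactly this ClassNotFoundException. For comparison (a minimal sketch, not part of the original report), the same transform compiled into a standalone application jar sidesteps REPL class loading entirely, assuming the Spark 1.6 API:

      import org.apache.spark.{SparkConf, SparkContext}

      // Hypothetical standalone job: the closure passed to map is compiled
      // into the application jar, so executors load it from the jar that
      // spark-submit ships, not from the REPL class server.
      object MapTimesThree {
        def main(args: Array[String]): Unit = {
          val conf = new SparkConf().setAppName("MapTimesThree")
          val sc = new SparkContext(conf)
          val a = sc.parallelize(1 to 9, 3)
          val b = a.map(_ * 3)
          b.collect().foreach(println)
          sc.stop()
        }
      }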

      This script works correctly; it collects the data to the driver first:
      %spark
      val a = sc.parallelize(1 to 9, 3)
      a.collect().map(_*3).foreach(println)
      3
      6
      9
      12
      15
      18
      21
      24
      27
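
      The working variant differs in where the function runs: collect() only executes Spark's own built-in code on the executors, and map(_*3) is then applied to a local array on the driver, so no REPL-generated class ever has to be loaded remotely. In Spark 1.x, executors fetch REPL classes from the URI advertised by spark.repl.class.uri; as a hedged diagnostic (an assumption about the likely cause, not a confirmed fix), one can check from the notebook that this property is set and points at an address reachable from the worker nodes:

      %spark
      // Diagnostic sketch: if spark.repl.class.uri is unset, or the address it
      // points at is unreachable from the executors, REPL-defined closures fail
      // to deserialize with a ClassNotFoundException like the one above.
      println(sc.getConf.getOption("spark.repl.class.uri"))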

      Attachments

        test.png (115 kB), attached by Jongyoul Lee

            People

              Assignee: Unassigned
              Reporter: zhenhuanli708 (1916038084@qq.com)
              Votes: 0
              Watchers: 9

                Time Tracking

                  Original Estimate: 96h
                  Remaining Estimate: 96h
                  Time Spent: Not Specified