MAHOUT-1607

spark-shell:scheduler.DAGScheduler: Failed to run fold at CheckpointedDrmSpark.scala:192


Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Not A Problem
    • Affects Version/s: 0.9, 1.0.0
    • Fix Version/s: 0.10.0
    • Component/s: classic
    • Environment: ubuntu 13.x x64, jdk1.7.0_65, scala 2.10.4, spark 1.0.2

    Description

      Following the steps at http://mahout.apache.org/users/sparkbindings/play-with-shell.html, the Mahout spark-shell starts up normally, but executing "val drmX = drmData(::, 0 until 4);" throws the exception traced below.
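      For context, the tutorial steps leading up to the failing statement look roughly like the sketch below, run inside the Mahout spark-shell (which pre-imports the Mahout Scala DSL, so no explicit imports are needed). The sample matrix values and partition count are recalled from the linked tutorial page and may differ from what the reporter actually ran.

      // Parallelize a small in-core dense matrix into a distributed
      // row matrix (DRM), as the tutorial does.
      val drmData = drmParallelize(dense(
        (2, 2, 10.5, 10, 29.509541),
        (1, 2, 12,   12, 18.042851),
        (1, 1, 12,   13, 22.736446),
        (2, 1, 11,   13, 32.207582)), numPartitions = 2)

      // Slicing out the first four columns is the statement that fails:
      val drmX = drmData(::, 0 until 4)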

      14/08/17 20:13:20 INFO scheduler.DAGScheduler: Failed to run fold at CheckpointedDrmSpark.scala:192
      14/08/17 20:13:20 INFO scheduler.TaskSetManager: Loss was due to java.lang.ArrayStoreException: scala.Tuple3 [duplicate 6]
      14/08/17 20:13:20 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
      org.apache.spark.SparkException: Job aborted due to stage failure: Task 0.0:1 failed 4 times, most recent failure: Exception failure in TID 6 on host iZ23qefud7nZ: java.lang.ArrayStoreException: scala.Tuple3
      com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.read(DefaultArraySerializers.java:338)
      com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.read(DefaultArraySerializers.java:293)
      com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:729)
      com.twitter.chill.WrappedArraySerializer.read(WrappedArraySerializer.scala:34)
      com.twitter.chill.WrappedArraySerializer.read(WrappedArraySerializer.scala:21)
      com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:729)
      org.apache.spark.serializer.KryoDeserializationStream.readObject(KryoSerializer.scala:118)
      org.apache.spark.rdd.ParallelCollectionPartition$$anonfun$readObject$1.apply(ParallelCollectionRDD.scala:80)
      org.apache.spark.rdd.ParallelCollectionPartition$$anonfun$readObject$1.apply(ParallelCollectionRDD.scala:80)
      org.apache.spark.util.Utils$.deserializeViaNestedStream(Utils.scala:120)
      org.apache.spark.rdd.ParallelCollectionPartition.readObject(ParallelCollectionRDD.scala:80)
      sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      java.lang.reflect.Method.invoke(Method.java:606)
      java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
      java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
      java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
      java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
      java.io.ObjectInputStream.skipCustomData(ObjectInputStream.java:1956)
      java.io.ObjectInputStream.readExternalData(ObjectInputStream.java:1850)
      java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1796)
      java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
      java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
      org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:63)
      org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:85)
      org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:165)
      java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
      java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
      java.lang.Thread.run(Thread.java:745)
      Driver stacktrace:
      at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1049)
      at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1033)
      at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1031)
      at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
      at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
      at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1031)
      at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:635)
      at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:635)
      at scala.Option.foreach(Option.scala:236)
      at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:635)
      at org.apache.spark.scheduler.DAGSchedulerEventProcessActor$$anonfun$receive$2.applyOrElse(DAGScheduler.scala:1234)
      at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)
      at akka.actor.ActorCell.invoke(ActorCell.scala:456)
      at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)
      at akka.dispatch.Mailbox.run(Mailbox.scala:219)
      at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
      at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
      at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
      at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
      at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)


    People

      Assignee: Andrew Palumbo
      Reporter: hhlin
      Votes: 0
      Watchers: 4
