
MAHOUT-2057: Example in README results in class not found


    Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 0.14.0
    • Fix Version/s: 0.14.1
    • Component/s: None
    • Labels: None

      Description

      Running the example in the README fails with a class-not-found error:
      "java.lang.NoClassDefFoundError: it/unimi/dsi/fastutil/ints/Int2DoubleOpenHashMap"
       
      If that's just us still using something that's been removed, it's not a deal-breaker for me as long as we fix it in a quick point release.
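 
      As a quick check (a diagnostic sketch I'm suggesting, not part of the session below), the failing spark-shell can confirm whether fastutil made it onto the classpath at all; the Class.forName call should throw the same ClassNotFoundException if the jar is genuinely absent:
 
      // diagnostic sketch, run from the Mahout spark-shell; in local[2]
      // mode the driver and executors share this classpath
      System.getProperty("java.class.path")
        .split(java.io.File.pathSeparator)
        .filter(_.toLowerCase.contains("fastutil"))
        .foreach(println)   // prints nothing if no fastutil jar is present
 
      // try to load the exact class the stack trace complains about
      Class.forName("it.unimi.dsi.fastutil.ints.Int2DoubleOpenHashMap")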
       
      Pending that being a simple fix, my vote is +1 binding; and if Andy's not back from vacation and his proxy works, that's +2 binding from me and Andy.
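 
      Until the jar is bundled again, a possible workaround (untested; the path and version below are placeholders) is to hand a fastutil jar to the running shell with the Scala 2.11 REPL's :require, or to drop one into the distribution's lib/ directory, since bin/mahout puts lib/ on the CLASSPATH (note the "Adding lib/ to CLASSPATH" line in the transcript below):
 
      // placeholder jar path -- use whichever fastutil version 0.14.0 was built against
      scala> :require /path/to/fastutil.jar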
       
       
      bob $ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
      bob $ export MAHOUT_HOME=/home/akm/a/src/test/repository.apache.org/content/repositories/orgapachemahout-1052/org/apache/mahout/mahout/0.14.0
      bob $ export SPARK_HOME=/home/akm/a/src/spark-2.1.0-bin-hadoop2.7
      bob $ MASTER=local[2] mahout-0.14.0/bin/mahout spark-shell
      Adding lib/ to CLASSPATH
      Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
      Setting default log level to "WARN".
      To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
      19/03/04 09:07:44 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
      19/03/04 09:07:44 WARN Utils: Your hostname, Bob resolves to a loopback address: 127.0.1.1; using 10.0.1.2 instead (on interface eno1)
      19/03/04 09:07:44 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
      19/03/04 09:07:53 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
      Spark context Web UI available at http://10.0.1.2:4040
      Spark context available as 'sc' (master = local[2], app id = local-1551719265339).
      Spark session available as 'spark'.
      Loading /home/akm/a/src/test/repository.apache.org/content/repositories/orgapachemahout-1052/org/apache/mahout/mahout/0.14.0/mahout-0.14.0/bin/load-shell.scala...
      import org.apache.mahout.math._
      import org.apache.mahout.math.scalabindings._
      import org.apache.mahout.math.drm._
      import org.apache.mahout.math.scalabindings.RLikeOps._
      import org.apache.mahout.math.drm.RLikeDrmOps._
      import org.apache.mahout.sparkbindings._
      sdc: org.apache.mahout.sparkbindings.SparkDistributedContext = org.apache.mahout.sparkbindings.SparkDistributedContext@749ffdc7

                       _                 _
       _ __ ___   __ _| |__   ___  _   _| |_
      | '_ ` _ \ / _` | '_ \ / _ \| | | | __|
      | | | | | | (_| | | | | (_) | |_| | |_
      |_| |_| |_|\__,_|_| |_|\___/ \__,_|\__|  version 0.14.0

      That file does not exist

      Welcome to
            ____              __
           / __/__  ___ _____/ /__
          _\ \/ _ \/ _ `/ __/  '_/
         /___/ .__/\_,_/_/ /_/\_\   version 2.1.0
            /_/
              
      Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_191)
      Type in expressions to have them evaluated.
      Type :help for more information.

      scala> :load /home/akm/a/src/test/repository.apache.org/content/repositories/orgapachemahout-1052/org/apache/mahout/mahout/0.14.0/mahout-0.14.0/examples/bin/SparseSparseDrmTimer.mscala
      Loading /home/akm/a/src/test/repository.apache.org/content/repositories/orgapachemahout-1052/org/apache/mahout/mahout/0.14.0/mahout-0.14.0/examples/bin/SparseSparseDrmTimer.mscala...
      timeSparseDRMMMul: (m: Int, n: Int, s: Int, para: Int, pctDense: Double, seed: Long)Long

      scala> timeSparseDRMMMul(1000,1000,1000,1,.02,1234L)
      19/03/04 09:13:13 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 1)
      java.lang.NoClassDefFoundError: it/unimi/dsi/fastutil/ints/Int2DoubleOpenHashMap
          at org.apache.mahout.math.RandomAccessSparseVector.<init>(RandomAccessSparseVector.java:49)
          at org.apache.mahout.math.RandomAccessSparseVector.<init>(RandomAccessSparseVector.java:44)
          at org.apache.mahout.sparkbindings.SparkEngine$$anonfun$11$$anonfun$apply$2.apply(SparkEngine.scala:200)
          at org.apache.mahout.sparkbindings.SparkEngine$$anonfun$11$$anonfun$apply$2.apply(SparkEngine.scala:200)
          at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
          at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
          at scala.collection.immutable.Range.foreach(Range.scala:160)
          at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
          at scala.collection.AbstractTraversable.map(Traversable.scala:104)
          at org.apache.mahout.sparkbindings.SparkEngine$$anonfun$11.apply(SparkEngine.scala:200)
          at org.apache.mahout.sparkbindings.SparkEngine$$anonfun$11.apply(SparkEngine.scala:195)
          at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434)
          at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
          at scala.collection.Iterator$class.isEmpty(Iterator.scala:330)
          at scala.collection.AbstractIterator.isEmpty(Iterator.scala:1336)
          at org.apache.mahout.sparkbindings.drm.package$$anonfun$blockify$1.apply(package.scala:55)
          at org.apache.mahout.sparkbindings.drm.package$$anonfun$blockify$1.apply(package.scala:53)
          at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796)
          at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796)
          at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
          at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
          at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
          at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
          at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
          at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
          at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
          at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
          at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
          at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
          at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
          at org.apache.spark.scheduler.Task.run(Task.scala:99)
          at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
          at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
          at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
          at java.lang.Thread.run(Thread.java:748)
      19/03/04 09:13:13 ERROR Executor: Exception in task 0.0 in stage 1.0 (TID 0)
      java.lang.NoClassDefFoundError: it/unimi/dsi/fastutil/ints/Int2DoubleOpenHashMap
          [stack trace identical to the one above]
      Caused by: java.lang.ClassNotFoundException: it.unimi.dsi.fastutil.ints.Int2DoubleOpenHashMap
          at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
          at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
          at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
          ... 35 more
      19/03/04 09:13:13 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 1, localhost, executor driver): java.lang.NoClassDefFoundError: it/unimi/dsi/fastutil/ints/Int2DoubleOpenHashMap
          [same stack trace as above]

      19/03/04 09:13:13 ERROR TaskSetManager: Task 0 in stage 0.0 failed 1 times; aborting job
      19/03/04 09:13:13 WARN TaskSetManager: Lost task 0.0 in stage 1.0 (TID 0, localhost, executor driver): java.lang.NoClassDefFoundError: it/unimi/dsi/fastutil/ints/Int2DoubleOpenHashMap
          [same stack trace and Caused by as above]

      19/03/04 09:13:13 ERROR TaskSetManager: Task 0 in stage 1.0 failed 1 times; aborting job
      org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 1, localhost, executor driver): java.lang.NoClassDefFoundError: it/unimi/dsi/fastutil/ints/Int2DoubleOpenHashMap
          [same stack trace as above]

      Driver stacktrace:
        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1435)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1423)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1422)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
        at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1422)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:802)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:802)
        at scala.Option.foreach(Option.scala:257)
        at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:802)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1650)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1605)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1594)
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
        at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:628)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1918)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1931)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1944)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1958)
        at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:935)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
        at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
        at org.apache.spark.rdd.RDD.collect(RDD.scala:934)
        at org.apache.mahout.sparkbindings.drm.CheckpointedDrmSpark.collect(CheckpointedDrmSpark.scala:128)
        at org.apache.mahout.math.drm.package$.drm2InCore(package.scala:98)
        at timeSparseDRMMMul(<console>:87)
        ... 60 elided
      Caused by: java.lang.NoClassDefFoundError: it/unimi/dsi/fastutil/ints/Int2DoubleOpenHashMap
        [same stack trace as above]

      scala>
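 
      For reference, the SparseSparseDrmTimer.mscala being loaded boils down to roughly the following (a paraphrased sketch in Mahout's Scala DSL, not the shipped source; the real script constructs sparse row matrices explicitly). The collect at the end forces the distributed multiply, and the sparse-vector allocation inside that job is where the missing fastutil class blows up:
 
      // rough sketch of the timer, assuming the DSL imports and the implicit
      // distributed context that load-shell.scala already put in scope
      def timeSparseDRMMMulSketch(m: Int, n: Int, s: Int,
                                  para: Int, pctDense: Double, seed: Long): Long = {
        def randomSparseDrm(rows: Int, cols: Int) =
          drmParallelizeEmpty(rows, cols, para).mapBlock() { case (keys, block) =>
            val r = new scala.util.Random(seed)
            // leave most cells at zero, fill roughly pctDense of them randomly
            block := ((_: Int, _: Int, v: Double) =>
              if (r.nextDouble < pctDense) r.nextDouble else v)
            keys -> block
          }
        val drmA = randomSparseDrm(m, s)
        val drmB = randomSparseDrm(s, n)
        val t0 = System.currentTimeMillis()
        (drmA %*% drmB).collect   // triggers the job that hits the NoClassDefFoundError
        System.currentTimeMillis() - t0
      }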


            People

            • Assignee: Unassigned
            • Reporter: andrew.musselman (Andrew Musselman)
            • Votes: 0
            • Watchers: 1
