
SPARK-690: Stack overflow when running PageRank for more than 10000 iterations


Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Abandoned
    • Affects Version/s: 0.6.1
    • Fix Version/s: None
    • Component/s: Spark Core
    • Labels: None

    Description

      When I run the PageRank example for more than 10000 iterations, the job client reports stack overflow errors.

      13/02/07 13:41:40 INFO CacheTracker: Registering RDD ID 57993 with cache
      Exception in thread "DAGScheduler" java.lang.StackOverflowError
      at java.util.concurrent.locks.ReentrantReadWriteLock$Sync.tryAcquireShared(ReentrantReadWriteLock.java:467)
      at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireShared(AbstractQueuedSynchronizer.java:1281)
      at java.util.concurrent.locks.ReentrantReadWriteLock$ReadLock.lock(ReentrantReadWriteLock.java:731)
      at org.jboss.netty.akka.util.HashedWheelTimer.scheduleTimeout(HashedWheelTimer.java:277)
      at org.jboss.netty.akka.util.HashedWheelTimer.newTimeout(HashedWheelTimer.java:264)
      at akka.actor.DefaultScheduler.scheduleOnce(Scheduler.scala:186)
      at akka.pattern.PromiseActorRef$.apply(AskSupport.scala:274)
      at akka.pattern.AskSupport$class.ask(AskSupport.scala:83)
      at akka.pattern.package$.ask(package.scala:43)
      at akka.pattern.AskSupport$AskableActorRef.ask(AskSupport.scala:123)
      at spark.CacheTracker.askTracker(CacheTracker.scala:121)
      at spark.CacheTracker.communicate(CacheTracker.scala:131)
      at spark.CacheTracker.registerRDD(CacheTracker.scala:142)
      at spark.scheduler.DAGScheduler.visit$1(DAGScheduler.scala:149)
      at spark.scheduler.DAGScheduler$$anonfun$visit$1$2.apply(DAGScheduler.scala:155)
      at spark.scheduler.DAGScheduler$$anonfun$visit$1$2.apply(DAGScheduler.scala:150)
      at scala.collection.LinearSeqOptimized$class.foreach(LinearSeqOptimized.scala:59)
      at scala.collection.immutable.List.foreach(List.scala:76)
      at spark.scheduler.DAGScheduler.visit$1(DAGScheduler.scala:150)
      at spark.scheduler.DAGScheduler.getParentStages(DAGScheduler.scala:160)
      at spark.scheduler.DAGScheduler.newStage(DAGScheduler.scala:131)
      at spark.scheduler.DAGScheduler.getShuffleMapStage(DAGScheduler.scala:111)
      at spark.scheduler.DAGScheduler$$anonfun$visit$1$2.apply(DAGScheduler.scala:153)
      at spark.scheduler.DAGScheduler$$anonfun$visit$1$2.apply(DAGScheduler.scala:150)
      at scala.collection.LinearSeqOptimized$class.foreach(LinearSeqOptimized.scala:59)
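
      The trace above comes from the DAGScheduler's recursive visit over the RDD lineage: each PageRank iteration derives a new RDD from the previous one, so after roughly 10000 iterations the scheduler must recurse through a dependency chain of that depth and overflows its thread stack. Below is a minimal sketch (not taken from this report) of an iterative job that builds such a lineage, together with a checkpoint-based workaround; it assumes a newer Spark release (the org.apache.spark package and setCheckpointDir/checkpoint, which 0.6.1 does not provide), a local master, an illustrative checkpoint directory, and a placeholder update step rather than real PageRank.

      import org.apache.spark.{SparkConf, SparkContext}

      object LongLineageSketch {
        def main(args: Array[String]): Unit = {
          val sc = new SparkContext(
            new SparkConf().setAppName("long-lineage-sketch").setMaster("local[*]"))

          // Each iteration derives a new RDD from the previous one, so after N
          // iterations the lineage is a chain of N dependencies that the
          // DAGScheduler walks recursively when a job is submitted.
          var ranks = sc.parallelize(1 to 1000).map(id => (id, 1.0))

          // Workaround sketch (an assumption, not part of this report): periodically
          // checkpoint to truncate the lineage. The directory is illustrative.
          sc.setCheckpointDir("/tmp/spark-checkpoints")

          for (i <- 1 to 10000) {
            ranks = ranks.map { case (id, r) => (id, 0.15 + 0.85 * r) } // placeholder update
            if (i % 500 == 0) {
              ranks.checkpoint() // mark this RDD to be checkpointed by the next job
              ranks.count()      // run a job so the checkpoint materializes and the lineage is cut
            }
          }

          println(ranks.count())
          sc.stop()
        }
      }

      Bounding the lineage this way keeps the recursive stage walk shallow no matter how many iterations run; enlarging the scheduler thread's stack (e.g. via -Xss) can also postpone the overflow, but it does not remove the dependence on lineage depth.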

    People

      Assignee: Unassigned
      Reporter: andrew xia (xiajunluan)
      Votes: 0
      Watchers: 2
