Spark / SPARK-21253

Cannot fetch big blocks to disk


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.2.0
    • Fix Version/s: 2.2.0
    • Component/s: Spark Core
    • Labels: None

    Description

      This reproduces on a Spark cluster but not in local mode (a standalone equivalent of the steps is sketched below):

      1. Start the Spark shell (which creates a SparkContext) with spark.reducer.maxReqSizeShuffleToMem=1K:

      $ spark-shell --conf spark.reducer.maxReqSizeShuffleToMem=1K --conf spark.serializer=org.apache.spark.serializer.KryoSerializer
      

      2. Run a job that triggers a shuffle:

      scala> sc.parallelize(0 until 3000000, 10).repartition(2001).count()
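
      The two steps can also be combined into a standalone application. This is a sketch assuming Spark 2.2.0 submitted to the same cluster; the object name and app name are illustrative, not part of the original report:

      import org.apache.spark.{SparkConf, SparkContext}

      // Standalone equivalent of the shell session above. The master URL is
      // supplied by spark-submit, as with the spark-shell invocation.
      object Spark21253Repro {
        def main(args: Array[String]): Unit = {
          val conf = new SparkConf()
            .setAppName("SPARK-21253-repro")
            // A 1 KB threshold forces essentially every shuffle fetch request
            // to be shuffled to disk instead of buffered in memory.
            .set("spark.reducer.maxReqSizeShuffleToMem", "1K")
            .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
          val sc = new SparkContext(conf)

          // Same shuffle as step 2: repartition 3,000,000 records from 10
          // partitions into 2001 and count them.
          sc.parallelize(0 until 3000000, 10).repartition(2001).count()

          sc.stop()
        }
      }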
      

      The job then fails with:

      org.apache.spark.shuffle.FetchFailedException: Failed to send request for 1649611690367_2 to yhd-jqhadoop166.int.yihaodian.com/10.17.28.166:7337: java.io.IOException: Connection reset by peer
              at org.apache.spark.storage.ShuffleBlockFetcherIterator.throwFetchFailedException(ShuffleBlockFetcherIterator.scala:442)
              at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:418)
              at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:59)
              at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434)
              at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
              at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
              at org.apache.spark.util.CompletionIterator.hasNext(CompletionIterator.scala:32)
              at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
              at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
              at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
              at scala.collection.Iterator$class.foreach(Iterator.scala:893)
              at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
              at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
              at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
              at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
              at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:310)
              at scala.collection.AbstractIterator.to(Iterator.scala:1336)
              at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:302)
              at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1336)
              at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:289)
              at scala.collection.AbstractIterator.toArray(Iterator.scala:1336)
              at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$13.apply(RDD.scala:936)
              at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$13.apply(RDD.scala:936)
              at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2062)
              at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2062)
              at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
              at org.apache.spark.scheduler.Task.run(Task.scala:108)
              at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:335)
              at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
              at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
              at java.lang.Thread.run(Thread.java:745)
      Caused by: java.io.IOException: Failed to send request for 1649611690367_2 to yhd-jqhadoop166.int.yihaodian.com/10.17.28.166:7337: java.io.IOException: Connection reset by peer
              at org.apache.spark.network.client.TransportClient.lambda$stream$1(TransportClient.java:196)
              at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:507)
              at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:481)
              at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:420)
              at io.netty.util.concurrent.DefaultPromise.addListener(DefaultPromise.java:163)
              at io.netty.channel.DefaultChannelPromise.addListener(DefaultChannelPromise.java:93)
              at io.netty.channel.DefaultChannelPromise.addListener(DefaultChannelPromise.java:28)
              at org.apache.spark.network.client.TransportClient.stream(TransportClient.java:183)
              at org.apache.spark.network.shuffle.OneForOneBlockFetcher$1.onSuccess(OneForOneBlockFetcher.java:123)
              at org.apache.spark.network.client.TransportResponseHandler.handle(TransportResponseHandler.java:176)
              at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:120)
              at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
              at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
              at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
              at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287)
              at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
              at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
              at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
              at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
              at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
              at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
              at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
              at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
              at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
              at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
              at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
              at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
              at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
              at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
              at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
              at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
              at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:643)
              at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:566)
              at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:480)
              at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:442)
              at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
              at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
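
      For context, the trace shows the fetch going through OneForOneBlockFetcher into TransportClient.stream, i.e. the stream-to-disk path that spark.reducer.maxReqSizeShuffleToMem gates. A minimal sketch of that gating (illustrative names, not Spark's actual source):

      // Illustrative sketch, not Spark's implementation: a fetch request whose
      // total size exceeds spark.reducer.maxReqSizeShuffleToMem is streamed to
      // disk (the TransportClient.stream path above) rather than buffered in
      // memory.
      def shouldStreamToDisk(requestSizeBytes: Long, maxReqSizeShuffleToMemBytes: Long): Boolean =
        requestSizeBytes > maxReqSizeShuffleToMemBytes

      // With the threshold at 1K (1024 bytes), virtually any real shuffle
      // request exceeds it, so every fetch exercises the disk path where the
      // connection reset occurs:
      assert(shouldStreamToDisk(requestSizeBytes = 64 * 1024, maxReqSizeShuffleToMemBytes = 1024))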
      

People

    Assignee: Shixiong Zhu (zsxwing)
    Reporter: Yuming Wang (yumwang)
    Votes: 0
    Watchers: 6
