CarbonData / CARBONDATA-2408

Worker may register with the master before the master has finished starting its service


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.4.0
    • Component/s: None
    • Labels: None

    Description

      Before the worker registers with the master, the master may not have finished starting its RPC service, so registration can fail with the errors below.

      Error1:

      java.lang.RuntimeException: javax.security.sasl.SaslException: DIGEST-MD5: digest response format violation. Mismatched response.
      	at org.spark_project.guava.base.Throwables.propagate(Throwables.java:160)
      	at org.apache.spark.network.sasl.SparkSaslServer.response(SparkSaslServer.java:122)
      	at org.apache.spark.network.sasl.SaslRpcHandler.receive(SaslRpcHandler.java:101)
      	at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:159)
      	at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:107)
      	at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:118)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
      	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
      	at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
      	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
      	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
      	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
      	at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
      	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
      	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
      	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
      	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
      	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:643)
      	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:566)
      	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:480)
      	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:442)
      	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
      	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
      	at java.lang.Thread.run(Thread.java:748)
      Caused by: javax.security.sasl.SaslException: DIGEST-MD5: digest response format violation. Mismatched response.
      	at com.sun.security.sasl.digest.DigestMD5Server.validateClientResponse(DigestMD5Server.java:627)
      	at com.sun.security.sasl.digest.DigestMD5Server.evaluateResponse(DigestMD5Server.java:244)
      	at org.apache.spark.network.sasl.SparkSaslServer.response(SparkSaslServer.java:120)
      	... 31 more
      2018-04-26 19:57:28,801 | WARN  | [task-result-getter-0] | Lost task 0.0 in stage 15.0 (TID 1046, BLR1000014269, executor 6): org.apache.spark.SparkException: Exception thrown in awaitResult
      	at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77)
      	at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75)
      	at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
      	at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
      	at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
      	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:167)
      	at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:83)
      	at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:100)
      	at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:108)
      	at org.apache.spark.rpc.Worker$.registerToMaster(Worker.scala:96)
      	at org.apache.spark.rpc.Worker$.init(Worker.scala:45)
      	at org.apache.carbondata.store.SparkCarbonStore$$anonfun$1.apply(SparkCarbonStore.scala:143)
      	at org.apache.carbondata.store.SparkCarbonStore$$anonfun$1.apply(SparkCarbonStore.scala:141)
      	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:798)
      	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:798)
      	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
      	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
      	at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
      	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
      	at org.apache.spark.scheduler.Task.run(Task.scala:99)
      	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:325)
      	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
      	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
      	at java.lang.Thread.run(Thread.java:748)
      Caused by: org.apache.spark.SparkException: Message is dropped because Outbox is stopped
      	at org.apache.spark.rpc.netty.Outbox.stop(Outbox.scala:271)
      	at org.apache.spark.rpc.netty.NettyRpcEnv.removeOutbox(NettyRpcEnv.scala:107)
      	at org.apache.spark.rpc.netty.NettyRpcHandler.channelInactive(NettyRpcEnv.scala:624)
      	at org.apache.spark.network.server.TransportRequestHandler.channelInactive(TransportRequestHandler.java:99)
      	at org.apache.spark.network.server.TransportChannelHandler.channelInactive(TransportChannelHandler.java:103)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:241)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:227)
      	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:220)
      	at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
      	at io.netty.handler.timeout.IdleStateHandler.channelInactive(IdleStateHandler.java:278)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:241)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:227)
      	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:220)
      	at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:241)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:227)
      	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:220)
      	at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
      	at org.apache.spark.network.util.TransportFrameDecoder.channelInactive(TransportFrameDecoder.java:182)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:241)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:227)
      	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:220)
      	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelInactive(DefaultChannelPipeline.java:1289)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:241)
      	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:227)
      	at io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:893)
      	at io.netty.channel.AbstractChannel$AbstractUnsafe$7.run(AbstractChannel.java:691)
      	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:399)
      	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:446)
      	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
      	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
      	... 1 more
       | org.apache.spark.internal.Logging$class.logWarning(Logging.scala:66)
      

      Error2:

      18/04/27 10:52:43 ERROR Executor: Exception in task 0.0 in stage 3.0 (TID 3)
      org.apache.spark.SparkException: Exception thrown in awaitResult: 
      	at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:205)
      	at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
      	at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:100)
      	at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:108)
      	at org.apache.spark.rpc.Worker$.registerToMaster(Worker.scala:96)
      	at org.apache.spark.rpc.Worker$.init(Worker.scala:45)
      	at org.apache.carbondata.store.SparkCarbonStore$$anonfun$1.apply(SparkCarbonStore.scala:143)
      	at org.apache.carbondata.store.SparkCarbonStore$$anonfun$1.apply(SparkCarbonStore.scala:141)
      	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:797)
      	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:797)
      	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
      	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
      	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
      	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
      	at org.apache.spark.scheduler.Task.run(Task.scala:108)
      	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:338)
      	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
      	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
      	at java.lang.Thread.run(Thread.java:748)
      Caused by: java.io.IOException: Failed to connect to /127.0.0.1:10020
      	at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:232)
      	at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:182)
      	at org.apache.spark.rpc.netty.NettyRpcEnv.createClient(NettyRpcEnv.scala:197)
      	at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:194)
      	at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:190)
      	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
      	... 3 more
      Caused by: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: /127.0.0.1:10020
      	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
      	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
      	at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:257)
      	at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:291)
      	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:631)
      	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:566)
      	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:480)
      	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:442)
      	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
      	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
      	... 1 more
      18/04/27 10:52:43 ERROR TaskSetManager: Task 0 in stage 3.0 failed 1 times; aborting job
      Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 3.0 failed 1 times, most recent failure: Lost task 0.0 in stage 3.0 (TID 3, localhost, executor driver): org.apache.spark.SparkException: Exception thrown in awaitResult: 
      	at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:205)
      	at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
      	at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:100)
      	at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:108)
      	at org.apache.spark.rpc.Worker$.registerToMaster(Worker.scala:96)
      	at org.apache.spark.rpc.Worker$.init(Worker.scala:45)
      	at org.apache.carbondata.store.SparkCarbonStore$$anonfun$1.apply(SparkCarbonStore.scala:143)
      	at org.apache.carbondata.store.SparkCarbonStore$$anonfun$1.apply(SparkCarbonStore.scala:141)
      	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:797)
      	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:797)
      	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
      	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
      	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
      	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
      	at org.apache.spark.scheduler.Task.run(Task.scala:108)
      	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:338)
      	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
      	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
      	at java.lang.Thread.run(Thread.java:748)
      Caused by: java.io.IOException: Failed to connect to /127.0.0.1:10020
      	at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:232)
      	at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:182)
      	at org.apache.spark.rpc.netty.NettyRpcEnv.createClient(NettyRpcEnv.scala:197)
      	at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:194)
      	at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:190)
      	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
      	... 3 more
      Caused by: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: /127.0.0.1:10020
      	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
      	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
      	at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:257)
      	at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:291)
      	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:631)
      	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:566)
      	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:480)
      	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:442)
      	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
      	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
      	... 1 more
      
      Driver stacktrace:
      	at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1517)
      	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1505)
      	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1504)
      	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
      	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
      	at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1504)
      	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:814)
      	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:814)
      	at scala.Option.foreach(Option.scala:257)
      	at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:814)
      	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1732)
      	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1687)
      	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1676)
      	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
      	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:630)
      	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2029)
      	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2050)
      	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2069)
      	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2094)
      	at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:936)
      	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
      	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
      	at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
      	at org.apache.spark.rdd.RDD.collect(RDD.scala:935)
      	at org.apache.carbondata.store.SparkCarbonStore.startAllWorkers(SparkCarbonStore.scala:148)
      	at org.apache.carbondata.store.SparkCarbonStore.startSearchMode(SparkCarbonStore.scala:113)
      	at org.apache.spark.sql.CarbonSession.startSearchMode(CarbonSession.scala:186)
      	at org.apache.carbondata.examples.SearchModeExample$.exampleBody(SearchModeExample.scala:79)
      	at org.apache.carbondata.examples.SearchModeExample$.main(SearchModeExample.scala:37)
      	at org.apache.carbondata.examples.SearchModeExample.main(SearchModeExample.scala)
      Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: 
      	at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:205)
      	at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
      	at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:100)
      	at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:108)
      	at org.apache.spark.rpc.Worker$.registerToMaster(Worker.scala:96)
      	at org.apache.spark.rpc.Worker$.init(Worker.scala:45)
      	at org.apache.carbondata.store.SparkCarbonStore$$anonfun$1.apply(SparkCarbonStore.scala:143)
      	at org.apache.carbondata.store.SparkCarbonStore$$anonfun$1.apply(SparkCarbonStore.scala:141)
      	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:797)
      	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:797)
      	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
      	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
      	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
      	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
      	at org.apache.spark.scheduler.Task.run(Task.scala:108)
      	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:338)
      	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
      	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
      	at java.lang.Thread.run(Thread.java:748)
      Caused by: java.io.IOException: Failed to connect to /127.0.0.1:10020
      	at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:232)
      	at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:182)
      	at org.apache.spark.rpc.netty.NettyRpcEnv.createClient(NettyRpcEnv.scala:197)
      	at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:194)
      	at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:190)
      	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
      	... 3 more
      Caused by: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: /127.0.0.1:10020
      	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
      	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
      	at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:257)
      	at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:291)
      	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:631)
      	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:566)
      	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:480)
      	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:442)
      	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
      	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
      	... 1 more
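      Both errors come from the same race: the worker's `registerToMaster` call reaches the master's port before the master's RPC service is listening, so the connection is refused (or the SASL handshake sees a half-initialized server). A standard way to tolerate this is to retry registration with backoff until the master is reachable. The sketch below is a minimal, hypothetical helper illustrating that pattern — it is not CarbonData's actual fix, and `retry`/`RetryRegister` are names invented for this example.

      ```java
      import java.io.IOException;
      import java.util.concurrent.Callable;

      /** Hypothetical retry-with-backoff sketch, not CarbonData's actual code. */
      public class RetryRegister {

          /** Retries the action until it succeeds or attempts are exhausted. */
          static <T> T retry(Callable<T> action, int maxAttempts, long backoffMs)
                  throws Exception {
              Exception last = null;
              for (int attempt = 1; attempt <= maxAttempts; attempt++) {
                  try {
                      return action.call();
                  } catch (Exception e) {
                      // e.g. "Connection refused: /127.0.0.1:10020" while the
                      // master is still starting its RPC service.
                      last = e;
                      Thread.sleep(backoffMs * attempt);  // linear backoff
                  }
              }
              throw last;  // master never came up within the retry budget
          }

          public static void main(String[] args) throws Exception {
              // Simulate a master that only accepts registration on the 3rd attempt.
              int[] calls = {0};
              String ref = retry(() -> {
                  if (++calls[0] < 3) {
                      throw new IOException("Connection refused: /127.0.0.1:10020");
                  }
                  return "registered";
              }, 5, 10);
              System.out.println(ref + " after " + calls[0] + " attempts");
              // prints "registered after 3 attempts"
          }
      }
      ```

      The same effect can also be achieved by having the master signal readiness before any worker task is launched; the retry loop is simply the more self-contained option on the worker side.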
      
      


            People

              Assignee: xubo245 Bo Xu
              Reporter: xubo245 Bo Xu
              Votes: 0
              Watchers: 1


                Time Tracking

                  Estimated: Not Specified
                  Remaining: 0h
                  Logged: 11.5h