Details
Type: Bug
Status: Closed
Priority: Minor
Resolution: Cannot Reproduce
Affects Version/s: 1.3.1
Fix Version/s: None
Component/s: None
Environment: CentOS 6.7, HDP 2.2
Description
When running Spark jobs, this error appears many times in the output, but the job keeps running and produces results.
I've found similar bug reports that show comparable error messages, but none of them mention "Broken pipe".
15/10/05 11:05:37 ERROR DAGScheduler: Failed to update accumulators for ShuffleMapTask(49, 29)
java.net.SocketException: Broken pipe
at java.net.SocketOutputStream.socketWrite0(Native Method)
at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:113)
at java.net.SocketOutputStream.write(SocketOutputStream.java:159)
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
at java.io.DataOutputStream.flush(DataOutputStream.java:123)
at org.apache.spark.api.python.PythonAccumulatorParam.addInPlace(PythonRDD.scala:827)
at org.apache.spark.api.python.PythonAccumulatorParam.addInPlace(PythonRDD.scala:789)
at org.apache.spark.Accumulable.$plus$plus$eq(Accumulators.scala:81)
at org.apache.spark.Accumulators$$anonfun$add$2.apply(Accumulators.scala:323)
at org.apache.spark.Accumulators$$anonfun$add$2.apply(Accumulators.scala:321)
at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:772)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:771)
at org.apache.spark.Accumulators$.add(Accumulators.scala:321)
at org.apache.spark.scheduler.DAGScheduler.updateAccumulators(DAGScheduler.scala:890)
at org.apache.spark.scheduler.DAGScheduler.handleTaskCompletion(DAGScheduler.scala:974)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1390)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1354)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
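The report does not include the job that triggers this. For reference, the trace goes through PythonAccumulatorParam.addInPlace, i.e. the driver forwarding accumulator updates back to the Python driver process over a local socket when a task completes. Below is a hypothetical minimal PySpark sketch of that code path; the input path, app name, and counting logic are placeholders, not taken from the report.

# Hypothetical minimal PySpark job exercising the failing code path.
# The input path and the counting logic are placeholders, not from the report.
from pyspark import SparkContext

sc = SparkContext(appName="accumulator-broken-pipe-sketch")

# A Python accumulator: when a task finishes, the driver forwards the update to the
# Python driver process over a local socket (PythonAccumulatorParam.addInPlace);
# that socket write is the one failing with "Broken pipe" in the trace above.
error_lines = sc.accumulator(0)

def tag(line):
    if "ERROR" in line:
        error_lines.add(1)   # accumulator update is shipped back with the task result
    return (line.split(" ")[0], 1)

# A shuffle (reduceByKey) makes the tasks ShuffleMapTasks, matching the trace.
counts = (sc.textFile("hdfs:///tmp/input")   # placeholder path
            .map(tag)
            .reduceByKey(lambda a, b: a + b))
counts.count()

print("lines containing ERROR: %d" % error_lines.value)
sc.stop()

This sketch only illustrates where the socket write happens in a job that uses a Python accumulator plus a shuffle; it is not a confirmed reproduction of the error.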