Details
- Type: Bug
- Status: Closed
- Priority: Major
- Resolution: Duplicate
- Affects Version/s: 1.6.0
- Fix Version/s: None
- Component/s: None
Description
In our cluster we set spark.speculation=true, but when a task throws an exception at SparkHadoopMapRedUtil.performCommit(), that task can retry infinitely.
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/mapred/SparkHadoopMapRedUtil.scala#L83
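For context, the configuration described above can be reproduced with settings like the following (a minimal sketch of a spark-defaults.conf; the values are illustrative, not taken from the reporter's cluster). Note that ordinary task failures are capped by spark.task.maxFailures, but a commit denied because a speculative copy already committed does not count toward that limit, which is how retries can continue indefinitely:

```
# spark-defaults.conf — minimal sketch of the setup described in this report
# Enables speculative re-launching of slow tasks (the trigger for this bug)
spark.speculation        true
# Caps ordinary task failures (default 4); commit-denied failures under
# speculation are not counted toward this limit
spark.task.maxFailures   4
```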
Attachments
Issue Links
- duplicates
  - SPARK-14915 Tasks that fail due to CommitDeniedException (a side-effect of speculation) can cause job to never complete (Resolved)
- links to