Type: Bug
Status: Resolved
Priority: Blocker
Resolution: Fixed
Affects Version/s: None
Fix Version/s: 1.2.0
Component/s: Spark Core
Labels: None
In previous versions of Spark, accumulator updates were applied only once for accumulators used solely in actions (i.e., result stages), letting you use them to deterministically compute a result. Unfortunately, this was broken by some recent refactorings.
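For reference, a minimal sketch of the semantics being described, using the Spark 1.x accumulator API; the object name and local master setting are just for illustration:

{code:scala}
import org.apache.spark.{SparkConf, SparkContext}

object AccumulatorResultStageExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("acc-example").setMaster("local[2]"))

    // Accumulator used only in an action, i.e. only in a result stage.
    val acc = sc.accumulator(0)
    sc.parallelize(1 to 100, 4).foreach(_ => acc += 1)

    // Expected to print exactly 100, even if a result-stage task is
    // resubmitted: each task's updates should be applied at most once.
    println(acc.value)
    sc.stop()
  }
}
{code}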
This is related to https://issues.apache.org/jira/browse/SPARK-732, but that issue is about applying the same semantics to intermediate stages too, which is more work and may not be what we want for debugging.
Issue Links:
- relates to: SPARK-732 Recomputation of RDDs may result in duplicated accumulator updates (Closed)