Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Affects Version/s: 1.0.2, 1.1.0, 1.2.0
- Component/s: None
Description
Spark does not currently provide any mechanism to delete accumulators after they are no longer used. This can lead to OOMs for long-lived SparkContexts that create many large accumulators.
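To make the failure mode concrete, here is a hypothetical repro sketch (the class name AccumulatorLeakRepro, the local master, and the loop/collection sizes are illustrative only): each loop iteration registers a collection accumulator that the driver-side global registry keeps alive even after the local reference goes out of scope.
{code:scala}
import scala.collection.mutable

import org.apache.spark.{SparkConf, SparkContext}

object AccumulatorLeakRepro {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("accumulator-leak").setMaster("local[2]"))

    for (_ <- 1 to 10000) {
      // Each iteration registers a new collection accumulator. Even after
      // `acc` goes out of scope, the global Accumulators registry on the
      // driver keeps a strong reference to it, so driver memory grows
      // until an OOM.
      val acc = sc.accumulableCollection(mutable.ArrayBuffer[Long]())
      sc.parallelize(1L to 100000L).foreach(n => acc += n)
    }

    sc.stop()
  }
}
{code}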
Part of the problem is that accumulators are registered in a global Accumulators registry. Maybe the fix would be as simple as using weak references in that registry, so that accumulators can be GC'd once user code no longer references them.
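A rough sketch of the weak-reference idea (this is not Spark's actual registry; WeakAccumulatorRegistry and its methods are hypothetical names): entries hold accumulators through scala.ref.WeakReference, so the registry no longer keeps otherwise-unreferenced accumulators alive.
{code:scala}
import scala.collection.mutable
import scala.ref.WeakReference

// Hypothetical sketch of a registry that only weakly references the
// accumulators registered with it, so they can be GC'd once user code
// drops its last strong reference.
object WeakAccumulatorRegistry {
  private val originals = mutable.Map[Long, WeakReference[AnyRef]]()

  def register(id: Long, acc: AnyRef): Unit = synchronized {
    originals(id) = new WeakReference(acc)
  }

  // Returns None if the accumulator has already been collected.
  def get(id: Long): Option[AnyRef] = synchronized {
    originals.get(id).flatMap(_.get)
  }

  // Drop registry entries whose accumulators have been collected.
  def purge(): Unit = synchronized {
    originals.retain((_, ref) => ref.get.isDefined)
  }
}
{code}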
In the meantime, here's a workaround that users can try:
Accumulators have a public setValue() method that can be called (only by the driver) to change an accumulator's value. You might be able to use this to reset accumulators' values to smaller objects (e.g. the "zero" object of whatever your accumulator type is, or null if you're sure that the accumulator will never be accessed again).
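A minimal sketch of that workaround, assuming a simple Long accumulator (the class name and job are illustrative; setValue() is the public method mentioned above):
{code:scala}
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.SparkContext._  // implicit AccumulatorParams in Spark 1.x

object AccumulatorResetExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("accumulator-reset").setMaster("local[2]"))

    val acc = sc.accumulator(0L)
    sc.parallelize(1L to 1000L).foreach(n => acc += n)
    println(s"sum = ${acc.value}")

    // Workaround: once the value has been consumed on the driver, reset
    // the accumulator to its "zero" so the old value can be GC'd.
    acc.setValue(0L)

    sc.stop()
  }
}
{code}
Note that this only shrinks the value the registry retains; the (now-small) accumulator object itself stays registered until the SparkContext is stopped.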
This issue was originally reported by nkronenfeld on the dev mailing list: http://apache-spark-developers-list.1001551.n3.nabble.com/Fwd-Accumulator-question-td8709.html
Issue Links
- is related to: SPARK-4030 `destroy` method in Broadcast should be public (Resolved)
- links to