SPARK-3885: Provide mechanism to remove accumulators once they are no longer used


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.0.2, 1.1.0, 1.2.0
    • Fix Version/s: 1.4.0
    • Component/s: Spark Core
    • Labels: None

    Description

      Spark does not currently provide any mechanism to delete accumulators after they are no longer used. This can lead to OOMs for long-lived SparkContexts that create many large accumulators.

      Part of the problem is that accumulators are registered in a global Accumulators registry. Maybe the fix would be as simple as using weak references in that registry, so that accumulators can be GC'd once user code no longer holds a strong reference to them.
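
      A rough sketch of the weak-reference idea (not Spark's actual implementation; the object and method names below are hypothetical): the registry would hold WeakReferences keyed by accumulator ID, so an accumulator becomes eligible for garbage collection once the driver program drops its last strong reference to it.

      {code:scala}
      import java.lang.ref.WeakReference
      import scala.collection.mutable

      // Hypothetical registry sketch: entries do not keep accumulators alive.
      object WeakAccumulatorRegistry {
        private val refs = mutable.Map[Long, WeakReference[AnyRef]]()

        def register(id: Long, acc: AnyRef): Unit = synchronized {
          refs(id) = new WeakReference(acc)
        }

        // Returns None (and drops the stale entry) once the accumulator has
        // been garbage collected.
        def lookup(id: Long): Option[AnyRef] = synchronized {
          refs.get(id) match {
            case Some(ref) =>
              val acc = Option(ref.get())
              if (acc.isEmpty) refs.remove(id)
              acc
            case None => None
          }
        }
      }
      {code}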

      In the meantime, here's a workaround that users can try:

      Accumulators have a public setValue() method that can be called (only by the driver) to change an accumulator’s value. You might be able to use this to reset accumulators’ values to smaller objects (e.g. the “zero” object of whatever your accumulator type is, or ‘null’ if you’re sure that the accumulator will never be accessed again).
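
      As a rough illustration only (a minimal sketch assuming a Spark 1.x driver program with an existing SparkContext; the accumulable and names below are hypothetical), the workaround might look like this:

      {code:scala}
      import scala.collection.mutable
      import org.apache.spark.SparkContext

      // Driver-side sketch of the setValue() workaround.
      def trackAndReset(sc: SparkContext): Unit = {
        // An accumulable whose value can grow large: a set of observed keys.
        val seenKeys = sc.accumulableCollection(mutable.HashSet[String]())

        sc.parallelize(Seq("a", "b", "a", "c")).foreach(k => seenKeys += k)
        println(s"Distinct keys: ${seenKeys.value.size}")

        // Once the result has been consumed, reset the value (driver only) to
        // the "zero" of the type so the large collection can be GC'd.
        seenKeys.setValue(mutable.HashSet[String]())
      }
      {code}

      Resetting to the zero value (rather than null) keeps the accumulator safe to touch if it is accidentally referenced again later.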

      This issue was originally reported by nkronenfeld on the dev mailing list: http://apache-spark-developers-list.1001551.n3.nabble.com/Fwd-Accumulator-question-td8709.html

            People

              Assignee:
              ilganeli Ilya Ganelin
              Reporter:
              joshrosen Josh Rosen
              Votes:
              0
              Watchers:
              7

              Dates

                Created:
                Updated:
                Resolved: