SPARK-7689: Remove TTL-based metadata cleaning (spark.cleaner.ttl)

Parent: SPARK-11806 (Spark 2.0 deprecations and removals)


    Details

    • Type: Sub-task
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 2.0.0
    • Component/s: Spark Core

      Description

      With the introduction of ContextCleaner, I think there's no longer any reason for most users to enable the MetadataCleaner / spark.cleaner.ttl. The main exception might be super-long-lived Spark REPLs, where you're worried about orphaning RDDs or broadcast variables in your REPL history and having them never get cleaned up, but I think that's an uncommon use case. This property used to be relevant for Spark Streaming jobs, but that no longer seems to be the case: the latest Streaming docs have removed all mentions of spark.cleaner.ttl (see https://github.com/apache/spark/pull/4956/files#diff-dbee746abf610b52d8a7cb65bf9ea765L1817, for example).
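
      As a minimal sketch (the job and object name below are hypothetical, not taken from this issue), this is roughly what cleanup looks like without spark.cleaner.ttl: ContextCleaner reclaims RDD, shuffle, and broadcast state asynchronously once the corresponding driver-side references are garbage-collected, and explicit unpersist()/destroy() calls release the resources eagerly.

      {code:scala}
      import org.apache.spark.{SparkConf, SparkContext}

      object ContextCleanerSketch {
        def main(args: Array[String]): Unit = {
          val conf = new SparkConf()
            .setAppName("context-cleaner-sketch")
            .setMaster("local[2]")
            // Old TTL-based approach (to be removed in 2.0):
            // .set("spark.cleaner.ttl", "3600")
          val sc = new SparkContext(conf)

          val lookup = sc.broadcast(Map("a" -> 1, "b" -> 2))
          val cached = sc.parallelize(1 to 1000).map(i => i * lookup.value.size).cache()
          cached.count()

          // Eager cleanup; without these calls, ContextCleaner would still reclaim
          // the cached blocks and broadcast data once `cached` and `lookup` become
          // unreachable and are garbage-collected on the driver.
          cached.unpersist()
          lookup.destroy()

          sc.stop()
        }
      }
      {code}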

      See http://apache-spark-user-list.1001560.n3.nabble.com/is-spark-cleaner-ttl-safe-td2557.html for an old, related discussion. Also, see https://github.com/apache/spark/pull/126, the PR that introduced the new ContextCleaner mechanism.

      For Spark 2.0, I think we should remove spark.cleaner.ttl and the associated TTL-based metadata cleaning code.

             People

             • Assignee: Josh Rosen (joshrosen)
             • Reporter: Josh Rosen (joshrosen)
             • Votes: 0
             • Watchers: 5
