With the introduction of ContextCleaner, I don't think there's any reason for most users to enable the MetadataCleaner / spark.cleaner.ttl (except perhaps for super-long-lived Spark REPLs, where you might worry about orphaned RDDs or broadcast variables in your REPL history never getting cleaned up, though that seems like an uncommon use case). This property used to be relevant for Spark Streaming jobs, but that no longer appears to be the case: the latest Streaming docs have removed all mentions of spark.cleaner.ttl (see https://github.com/apache/spark/pull/4956/files#diff-dbee746abf610b52d8a7cb65bf9ea765L1817, for example).
See http://apache-spark-user-list.1001560.n3.nabble.com/is-spark-cleaner-ttl-safe-td2557.html for an old, related discussion. Also, see https://github.com/apache/spark/pull/126, the PR that introduced the new ContextCleaner mechanism.
For Spark 2.0, I think we should remove spark.cleaner.ttl and the associated TTL-based metadata cleaning code.
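For context, here's a sketch of the difference in behavior. The snippet below shows how the TTL-based cleaner was typically enabled (the app name and TTL value are just illustrative); with ContextCleaner, no configuration is needed, since cleanup is driven by reference tracking on the driver:

```scala
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("ttl-cleaner-example") // illustrative name
  // TTL-based MetadataCleaner: periodically drops metadata (and persisted
  // data) older than the given number of seconds, regardless of whether
  // the RDD is still referenced -- which is why it can be unsafe.
  .set("spark.cleaner.ttl", "3600")

val sc = new SparkContext(conf)

// With spark.cleaner.ttl unset (the default), ContextCleaner instead
// cleans up RDDs, shuffles, and broadcast variables only once they
// become unreachable in the driver program.
```

This also illustrates the safety argument: the TTL cleaner can delete state that a still-live RDD depends on, whereas ContextCleaner cannot.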