Spark / SPARK-18979

ShutdownHookManager: Exception while deleting Spark temp dir


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: 1.5.2
    • Fix Version/s: None
    • Component/s: Spark Core
    • Labels: None

    Description

      When I stop the worker process, the Spark temp directory under SPARK_LOCAL_DIRS should be deleted recursively, but the deletion fails (a minimal sketch of the shutdown-hook deletion pattern follows the stack trace below). Exception info:
      2016-12-15 20:12:59,930 ERROR ShutdownHookManager: Exception while deleting Spark temp dir: /data2/zdh/spark/tmp/spark-67cf188b-5978-42d9-8ce6-e181f0ba4d0d
      java.io.IOException: Failed to delete: /data2/zdh/spark/tmp/spark-67cf188b-5978-42d9-8ce6-e181f0ba4d0d
      at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:884)
      at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:63)
      at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:60)
      at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
      at org.apache.spark.util.ShutdownHookManager$$anonfun$1.apply$mcV$sp(ShutdownHookManager.scala:60)
      at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:264)
      at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:234)
      at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:234)
      at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:234)
      at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1699)
      at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:234)
      at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:234)
      at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:234)
      at scala.util.Try$.apply(Try.scala:161)
      at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:234)
      at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:216)
      at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
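
      For context, the trace shows a JVM shutdown hook invoking Utils.deleteRecursively on a registered temp dir and surfacing an IOException when a path cannot be removed. The following is a minimal, self-contained Scala sketch of that pattern, not the actual Spark source; the object name and temp dir path are hypothetical, chosen only for illustration.

      import java.io.{File, IOException}

      object TempDirCleanupSketch {

        // Delete a directory tree; throw IOException as soon as an entry
        // cannot be removed, which is the failure reported in the log above.
        def deleteRecursively(file: File): Unit = {
          if (file.isDirectory) {
            val children = Option(file.listFiles()).getOrElse(Array.empty[File])
            children.foreach(deleteRecursively)
          }
          if (file.exists() && !file.delete()) {
            throw new IOException("Failed to delete: " + file.getAbsolutePath)
          }
        }

        def main(args: Array[String]): Unit = {
          // Hypothetical temp dir path, for illustration only.
          val tmpDir = new File("/tmp/spark-example-dir")

          // Register a JVM shutdown hook that attempts the cleanup on exit
          // and logs the IOException instead of rethrowing it, mirroring the
          // ERROR line emitted by ShutdownHookManager in the log above.
          Runtime.getRuntime.addShutdownHook(new Thread(new Runnable {
            override def run(): Unit = {
              try {
                deleteRecursively(tmpDir)
              } catch {
                case e: IOException =>
                  System.err.println("Exception while deleting Spark temp dir: " + e.getMessage)
              }
            }
          }))
        }
      }

      In this sketch, any file that cannot be deleted (for example because another process still holds it open or permissions changed) aborts the walk with an IOException, which matches the behavior seen when the worker's temp dir under SPARK_LOCAL_DIRS is not removed on shutdown.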


            People

    Assignee: Unassigned
    Reporter: zuotingbing (zuo.tingbing9)
    Votes: 0
    Watchers: 1

Dates

    Created:
    Updated:
    Resolved: