SPARK-12216: Spark failed to delete temp directory

Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Invalid
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: Spark Shell
    • Labels: None

    Description

      The mailing list archives have no obvious solution to this:

      scala> :q
      Stopping spark context.
      15/12/08 16:24:22 ERROR ShutdownHookManager: Exception while deleting Spark temp dir: C:\Users\Stefan\AppData\Local\Temp\spark-18f2a418-e02f-458b-8325-60642868fdff
      java.io.IOException: Failed to delete: C:\Users\Stefan\AppData\Local\Temp\spark-18f2a418-e02f-458b-8325-60642868fdff
      at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:884)
      at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:63)
      at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:60)
      at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
      at org.apache.spark.util.ShutdownHookManager$$anonfun$1.apply$mcV$sp(ShutdownHookManager.scala:60)
      at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:264)
      at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:234)
      at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:234)
      at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:234)
      at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1699)
      at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:234)
      at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:234)
      at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:234)
      at scala.util.Try$.apply(Try.scala:161)
      at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:234)
      at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:216)
      at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
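
      The trace shows the shutdown hook walking the registered temp directories and calling Utils.deleteRecursively on each one. Below is a minimal standalone sketch (not Spark's actual code; the object and file names are made up for illustration) of why that call can fail on Windows: if any file under the directory still has an open handle, File.delete() returns false, and the recursive delete surfaces the IOException that the hook logs above.

      import java.io.{File, FileOutputStream, IOException}

      // Standalone sketch: recursive deletion in the style of Utils.deleteRecursively,
      // failing when the OS still holds a handle on a file inside the directory.
      object DeleteLockedDirDemo {

        // Delete a directory tree; throw if any entry cannot be removed.
        def deleteRecursively(f: File): Unit = {
          if (f.isDirectory) {
            Option(f.listFiles()).getOrElse(Array.empty[File]).foreach(deleteRecursively)
          }
          if (f.exists() && !f.delete()) {
            throw new IOException(s"Failed to delete: ${f.getAbsolutePath}")
          }
        }

        def main(args: Array[String]): Unit = {
          val dir = new File(System.getProperty("java.io.tmpdir"), s"spark-demo-${System.nanoTime()}")
          dir.mkdirs()
          // Keep a stream open on a file inside the directory. On Windows the open
          // handle blocks deletion; on Linux/macOS unlinking succeeds regardless.
          val locked = new FileOutputStream(new File(dir, "locked.bin"))
          try {
            locked.write(1)
            deleteRecursively(dir)
            println("Deleted cleanly (typical on Linux/macOS)")
          } catch {
            case e: IOException => println(s"Windows-style failure: ${e.getMessage}")
          } finally {
            locked.close()
            deleteRecursively(dir) // no-op if already gone, cleanup otherwise
          }
        }
      }

      On Linux and macOS the same sequence usually succeeds because an open handle does not prevent unlinking, which is consistent with this only being reported from a Windows temp path.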

People

    Assignee: Unassigned
    Reporter: skypickle (stefan)
    Votes: 1
    Watchers: 17
