Details
- Type: Bug
- Status: Resolved
- Priority: Minor
- Resolution: Duplicate
- Affects Version/s: 1.3.1
- Fix Version/s: None
- Component/s: None
- Environment: Windows 7, CentOS 6.6
Description
Spark does not delete temporary local directories.
After a Spark program completes, three temporary directories remain in the temp directory. The directory names look like this: spark-2e389487-40cc-4a82-a5c7-353c0feefbb7
The directories are empty.
They are created every time the Spark program runs, so the number of files and directories keeps growing.
I've traced the Spark source code.
The methods that create the three temp directories are as follows:
- DiskBlockManager.createLocalDirs
- HttpFileServer.initialize
- SparkEnv.sparkFilesDir
They (eventually) call Utils.getOrCreateLocalRootDirs and then Utils.createDirectory, which intentionally does NOT mark the directory for automatic deletion.
The comment on the createDirectory method says: "The directory is guaranteed to be newly created, and is not marked for automatic deletion."
But since the directories do not hold useful data after the program completes, they should be deleted if possible.
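For illustration, here is a minimal Scala sketch of the proposed behavior. This is not Spark's actual implementation, and the object and method names are hypothetical; it only shows the idea: create a uniquely named spark-<UUID> directory (mirroring the naming pattern above) and register a JVM shutdown hook that deletes it recursively on exit, which is the step Utils.createDirectory deliberately skips.

{code:scala}
import java.io.{File, IOException}
import java.util.UUID

object TempDirCleanupSketch {

  // Recursively delete a directory tree (children first, then the dir itself).
  private def deleteRecursively(f: File): Unit = {
    if (f.isDirectory) {
      Option(f.listFiles()).getOrElse(Array.empty[File]).foreach(deleteRecursively)
    }
    f.delete()
  }

  // Create a uniquely named "spark-<UUID>" directory under `root` and
  // register a JVM shutdown hook that removes it when the program exits.
  def createLocalDir(root: String, namePrefix: String = "spark"): File = {
    val dir = new File(root, s"$namePrefix-${UUID.randomUUID()}")
    if (!dir.mkdirs()) {
      throw new IOException(s"Failed to create directory $dir")
    }
    Runtime.getRuntime.addShutdownHook(new Thread {
      override def run(): Unit = deleteRecursively(dir)
    })
    dir
  }

  def main(args: Array[String]): Unit = {
    val dir = createLocalDir(System.getProperty("java.io.tmpdir"))
    println(s"Created $dir; it will be deleted when the JVM exits.")
  }
}
{code}

A shutdown-hook approach along these lines is what later Spark releases use for directories created via Utils.createTempDir, which are registered for deletion on JVM exit.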
Issue Links
- duplicates: SPARK-7503 Resources in .sparkStaging directory can't be cleaned up on error (Resolved)
- is duplicated by: SPARK-2572 Can't delete local dir on executor automatically when running spark over Mesos (Resolved)