Description
In SPARK-22356, the `sparkTestingDir` used by HiveExternalCatalogVersionsSuite was hard-coded so that the downloaded Spark tarball could be reused between test executions:
```scala
// For local test, you can set `sparkTestingDir` to a static value like `/tmp/test-spark`, to
// avoid downloading Spark of different versions in each run.
private val sparkTestingDir = new File("/tmp/test-spark")
```
However, this doesn't work, since the directory is deleted after every run:
```scala
override def afterAll(): Unit = {
  try {
    Utils.deleteRecursively(wareHousePath)
    Utils.deleteRecursively(tmpDataDir)
    Utils.deleteRecursively(sparkTestingDir)
  } finally {
    super.afterAll()
  }
}
```
Hard-coding a /tmp directory is also problematic in its own right: on some systems /tmp is not the proper place to store files of this size, and since the directory is deleted after every run anyway, the hard-coded path currently provides no caching benefit at all.
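One possible shape for a fix is sketched below: derive the directory from a user-supplied system property when present (and then never delete it, so the tarball cache survives between runs), otherwise fall back to a fresh per-run temporary directory that is safe to remove in `afterAll`. The property name `spark.test.cache-dir`, the object name, and the helpers here are hypothetical illustrations, not the actual suite code; `deleteRecursively` stands in for Spark's `Utils.deleteRecursively`.

```scala
import java.io.File

object SparkTestingDirSketch {
  // Hypothetical property name, used only for this sketch.
  private val CacheDirProp = "spark.test.cache-dir"

  // Returns the directory to use plus a flag saying whether afterAll may delete it.
  def resolve(): (File, Boolean) = sys.props.get(CacheDirProp) match {
    case Some(path) =>
      val dir = new File(path)
      dir.mkdirs()
      (dir, false) // user-managed cache: keep it between runs
    case None =>
      // Fresh, run-local directory under java.io.tmpdir; safe to delete afterwards.
      val dir = File.createTempFile("test-spark", "")
      dir.delete()
      dir.mkdirs()
      (dir, true)
  }

  // Stand-in for Utils.deleteRecursively; only removes run-local directories.
  def cleanup(dir: File, deletable: Boolean): Unit = {
    def deleteRecursively(f: File): Unit = {
      Option(f.listFiles()).foreach(_.foreach(deleteRecursively))
      f.delete()
    }
    if (deletable) deleteRecursively(dir)
  }
}
```

With this split, `afterAll` would call `cleanup` instead of deleting `sparkTestingDir` unconditionally, so a locally configured cache directory survives while CI runs still clean up after themselves.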