SPARK-33214: HiveExternalCatalogVersionsSuite shouldn't use or delete hard-coded /tmp directory


    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 3.0.1
    • Fix Version/s: 3.1.0
    • Component/s: SQL, Tests
    • Labels: None

      Description

      In SPARK-22356, the sparkTestingDir used by HiveExternalCatalogVersionsSuite became hard-coded to enable re-use of the downloaded Spark tarball between test executions:

        // For local test, you can set `sparkTestingDir` to a static value like `/tmp/test-spark`, to
        // avoid downloading Spark of different versions in each run.
        private val sparkTestingDir = new File("/tmp/test-spark")
      

      However, this doesn't work, since the directory is deleted at the end of every run in afterAll:

        override def afterAll(): Unit = {
          try {
            Utils.deleteRecursively(wareHousePath)
            Utils.deleteRecursively(tmpDataDir)
            Utils.deleteRecursively(sparkTestingDir)
          } finally {
            super.afterAll()
          }
        }
      

      Hard-coding a /tmp directory is bad practice, since in some environments /tmp is not the proper place to store temporary files. And because the directory is deleted unconditionally in afterAll, we never actually get the intended reuse between test executions.
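      A minimal sketch of one possible fix, meant as a drop-in within HiveExternalCatalogVersionsSuite (so File, Utils, wareHousePath, and tmpDataDir are already in scope). The spark.test.cache-dir property name and the isCacheDirProvided helper are illustrative assumptions, not the actual patch: reuse a caller-supplied cache directory when one is set, fall back to a managed temp directory otherwise, and only delete the directory in afterAll when the suite created it itself:

        // Hypothetical property name, used here only for illustration.
        private val cacheDirSystemProperty = "spark.test.cache-dir"

        // True when the caller supplied a cache directory to reuse across runs.
        private val isCacheDirProvided =
          Option(System.getProperty(cacheDirSystemProperty)).isDefined

        // Reuse the supplied directory, or create a unique temp dir instead of
        // hard-coding /tmp/test-spark.
        private val sparkTestingDir: File =
          Option(System.getProperty(cacheDirSystemProperty))
            .map(new File(_))
            .getOrElse(Utils.createTempDir(namePrefix = "test-spark"))

        override def afterAll(): Unit = {
          try {
            Utils.deleteRecursively(wareHousePath)
            Utils.deleteRecursively(tmpDataDir)
            // Only delete the tarball directory when this suite created it;
            // a user-provided cache dir is left in place for the next run.
            if (!isCacheDirProvided) {
              Utils.deleteRecursively(sparkTestingDir)
            }
          } finally {
            super.afterAll()
          }
        }

      With this shape, a local developer can opt in to caching by setting the system property, while CI runs that set nothing still get a clean, automatically removed temp directory.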

    People

    • Assignee: Erik Krogen (xkrogen)
    • Reporter: Erik Krogen (xkrogen)
    • Votes: 0
    • Watchers: 3
