SPARK-33214: HiveExternalCatalogVersionsSuite shouldn't use or delete hard-coded /tmp directory


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 3.0.1
    • Fix Version/s: 3.1.0
    • Component/s: SQL, Tests
    • Labels: None

    Description

      In SPARK-22356, the sparkTestingDir used by HiveExternalCatalogVersionsSuite became hard-coded to enable re-use of the downloaded Spark tarball between test executions:

        // For local test, you can set `sparkTestingDir` to a static value like `/tmp/test-spark`, to
        // avoid downloading Spark of different versions in each run.
        private val sparkTestingDir = new File("/tmp/test-spark")
      

      However, this doesn't work, since the directory is deleted after every run:

        override def afterAll(): Unit = {
          try {
            Utils.deleteRecursively(wareHousePath)
            Utils.deleteRecursively(tmpDataDir)
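            // Deletes sparkTestingDir unconditionally, defeating the intended re-use: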
            Utils.deleteRecursively(sparkTestingDir)
          } finally {
            super.afterAll()
          }
        }
      

      Hard-coding a /tmp path is bad practice, since in some environments /tmp is not the right place for temporary files, and because `afterAll()` deletes the directory unconditionally, the intended re-use never actually happens.
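
      One possible shape for a fix (a minimal sketch; the `spark.test.cache-dir` property name and the `isTestingDirShared` flag are hypothetical illustrations, not the actual patch) is to re-use a caller-provided directory when one is set, and otherwise fall back to a per-run temporary directory that is deleted only when this run created it:

        // Sketch only: `spark.test.cache-dir` is a hypothetical property name.
        // When the caller provides a cache directory, re-use it across runs;
        // otherwise create a fresh per-run temp dir via Spark's Utils.createTempDir.
        private val isTestingDirShared = sys.props.contains("spark.test.cache-dir")
        private val sparkTestingDir: File = sys.props.get("spark.test.cache-dir")
          .map(new File(_))
          .getOrElse(Utils.createTempDir(namePrefix = "test-spark"))

        override def afterAll(): Unit = {
          try {
            Utils.deleteRecursively(wareHousePath)
            Utils.deleteRecursively(tmpDataDir)
            // Only delete the tarball cache when this run created it, so a
            // shared directory survives for the next execution.
            if (!isTestingDirShared) {
              Utils.deleteRecursively(sparkTestingDir)
            }
          } finally {
            super.afterAll()
          }
        }

      This keeps the convenience of caching the downloaded tarball between local runs without ever touching a hard-coded /tmp path.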

          People

            Assignee: xkrogen Erik Krogen
            Reporter: xkrogen Erik Krogen