Spark / SPARK-19836

Customizable remote repository url for hive versions unit test


    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: 2.1.0
    • Fix Version/s: None
    • Component/s: Tests
    • Labels:

      Description

      When the VersionsSuite test runs from sql/hive, it downloads several different Hive versions.

      Unfortunately, the IsolatedClientLoader (which is used by VersionsSuite) uses a hardcoded, fixed set of repositories:

          val classpath = quietly {
            SparkSubmitUtils.resolveMavenCoordinates(
              hiveArtifacts.mkString(","),
              SparkSubmitUtils.buildIvySettings(
                Some("http://www.datanucleus.org/downloads/maven2"),
                ivyPath),
              exclusions = version.exclusions)
          }
      

      The problem with the hard-coded repositories:

      1. It is hard to run the unit tests in an environment where only an internal Maven repository is available (and Maven Central / datanucleus.org are not reachable).

      2. It is impossible to run the unit tests against custom-built Hive/Hadoop artifacts, which are not available from the central repository.

      VersionsSuite already honors a dedicated SPARK_VERSIONS_SUITE_IVY_PATH environment variable to define a custom local repository used as the Ivy cache.

      I suggest adding an additional environment variable (SPARK_VERSIONS_SUITE_IVY_REPOSITORIES, handled in HiveClientBuilder.scala) to make it possible to add extra remote repositories for testing the different Hive versions.
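      A minimal sketch of what this could look like: the proposed variable is read from the environment and its comma-separated URLs are appended to the existing default repository, producing the string passed to SparkSubmitUtils.buildIvySettings. The object name RemoteRepoConfig and the comma-separated format are assumptions for illustration, not the actual patch.

          // Hypothetical helper showing how the proposed
          // SPARK_VERSIONS_SUITE_IVY_REPOSITORIES variable could be wired in.
          object RemoteRepoConfig {
            // The repository currently hard-coded in IsolatedClientLoader.
            val DefaultRepo = "http://www.datanucleus.org/downloads/maven2"

            /** Returns the default repository plus any extra comma-separated
             *  repository URLs taken from the given environment. */
            def remoteRepos(env: Map[String, String] = sys.env): Option[String] = {
              val extra = env.get("SPARK_VERSIONS_SUITE_IVY_REPOSITORIES")
                .map(_.trim)
                .filter(_.nonEmpty)
              Some((Seq(DefaultRepo) ++ extra).mkString(","))
            }
          }

      The resulting Option[String] could then replace the hard-coded Some("http://www.datanucleus.org/downloads/maven2") argument in the snippet above, so an internal repository is picked up when the variable is set and the behavior is unchanged when it is not.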

        People

        • Assignee: Unassigned
        • Reporter: Marton Elek (elek)
        • Votes: 0
        • Watchers: 2
