HIVE-7371

Identify a minimum set of JARs needed to ship to Spark cluster [Spark Branch]


Details

    • Type: Task
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.1.0
    • Component/s: Spark
    • Labels: None

    Description

      Currently, the Spark client ships all Hive JARs, including Hive's own dependencies, to the Spark cluster when a query is executed by Spark. This is inefficient and can cause library conflicts. Ideally, only a minimum set of JARs should be shipped. This task is to identify that set.

      We should learn from the current MR setup, where I assume only the hive-exec JAR is shipped to the cluster.

      We also need to ensure that user-supplied JARs are shipped to the Spark cluster, in a similar fashion to what MR does.

      NO PRECOMMIT TESTS. This is for spark-branch only.
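      The selection policy described above can be sketched as a simple filter over a Hive lib listing: keep hive-exec (mirroring what MR ships) plus any user-supplied JARs. A minimal sketch; the JAR names and the hive-exec-only policy are illustrative assumptions drawn from the MR analogy, not the actual set chosen in the patch:

      ```python
      # Sketch of the "minimum set of JARs" selection. All JAR names here are
      # hypothetical examples, not the set the patch settles on.

      def minimal_jars(all_jars, user_jars=()):
          """Keep only hive-exec JARs (as MR is assumed to ship) plus user JARs."""
          needed = [j for j in all_jars if j.startswith("hive-exec")]
          return needed + list(user_jars)

      lib = ["hive-exec-1.1.0.jar", "hive-metastore-1.1.0.jar", "guava-11.0.2.jar"]
      print(minimal_jars(lib, user_jars=["my-udf.jar"]))
      # -> ['hive-exec-1.1.0.jar', 'my-udf.jar']
      ```

      The user-supplied half of the list would come from Hive's existing ADD JAR mechanism, which on MR places such JARs in the distributed cache.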

      Attachments

        1. HIVE-7371-Spark.1.patch
          7 kB
          Chengxiang Li
        2. HIVE-7371-Spark.2.patch
          7 kB
          Chengxiang Li
        3. HIVE-7371-Spark.3.patch
          7 kB
          Xuefu Zhang


      People

        Assignee: Chengxiang Li (chengxiang li)
        Reporter: Xuefu Zhang (xuefuz)
        Votes: 0
        Watchers: 3
