Hive / HIVE-7371

Identify a minimum set of JARs needed to ship to Spark cluster [Spark Branch]


    Details

    • Type: Task
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.1.0
    • Component/s: Spark
    • Labels: None

      Description

      Currently, the Spark client ships all Hive JARs, including Hive's transitive dependencies, to the Spark cluster when a query is executed by Spark. This is inefficient and can cause library conflicts on the cluster. Ideally, only a minimal set of JARs should be shipped. This task is to identify that set.

      We should learn from the current MR path, for which I assume only the hive-exec JAR is shipped to the MR cluster.

      We also need to ensure that user-supplied JARs are shipped to the Spark cluster, in a similar fashion to what MR does.
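      One common way for a client to find the single JAR it should ship (e.g. the JAR containing the execution engine classes, rather than the whole dependency tree) is to resolve the code source of a class known to live in that JAR. The sketch below is generic and illustrative, not Hive's actual code; the class and method names are assumptions:

      ```java
      import java.security.CodeSource;

      public class JarLocator {
          // Return the location (JAR file or class directory) that a class was
          // loaded from, or null for bootstrap classes with no code source.
          public static String locationOf(Class<?> clazz) {
              CodeSource src = clazz.getProtectionDomain().getCodeSource();
              return src == null ? null : src.getLocation().toString();
          }

          public static void main(String[] args) {
              // In a deployed client, pass a class from hive-exec here; the
              // returned path is the one JAR to ship to the cluster.
              System.out.println(locationOf(JarLocator.class));
          }
      }
      ```

      Shipping that one located JAR, plus any JARs the user registered explicitly (via ADD JAR), would mirror what the MR path does today.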

      NO PRECOMMIT TESTS. This is for spark-branch only.

        Attachments

        1. HIVE-7371-Spark.1.patch
          7 kB
          Chengxiang Li
        2. HIVE-7371-Spark.2.patch
          7 kB
          Chengxiang Li
        3. HIVE-7371-Spark.3.patch
          7 kB
          Xuefu Zhang


              People

              • Assignee: Chengxiang Li (chengxiang li)
              • Reporter: Xuefu Zhang (xuefuz)
              • Votes: 0
              • Watchers: 3
