HIVE-10291: Hive on Spark job configuration needs to be logged [Spark Branch]
(Sub-task of HIVE-7292: Hive on Spark)


    Details

    • Type: Sub-task
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.1.0
    • Fix Version/s: 1.2.0
    • Component/s: Spark
    • Labels: None

      Description

      In a Hive on MR job, all the job properties are put into the JobConf, which can then be viewed via the MR2 HistoryServer's Job UI.

      However, in Hive on Spark, the submitted application is long-lived. Hence, we put into the SparkConf only the properties relevant to application submission (Spark and YARN properties), and only these are viewable through the Spark HistoryServer's Application UI.
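      For illustration, a minimal sketch (not Hive's actual submission code) of the kind of filtering this implies: only spark.* and yarn.* keys from the Hive configuration would reach the SparkConf, so per-query Hive settings never appear in the Spark HistoryServer.

{code:java}
import java.util.Map;

import org.apache.hadoop.conf.Configuration;
import org.apache.spark.SparkConf;

public class SubmissionConfSketch {
  /**
   * Illustrative only: copy just the submission-relevant properties
   * (spark.* and yarn.*) from the Hive configuration into the SparkConf.
   * Everything else stays behind and is invisible to the Spark
   * HistoryServer's Application UI.
   */
  public static SparkConf toSparkConf(Configuration hiveConf) {
    SparkConf sparkConf = new SparkConf();
    for (Map.Entry<String, String> entry : hiveConf) {
      String key = entry.getKey();
      if (key.startsWith("spark.") || key.startsWith("yarn.")) {
        sparkConf.set(key, entry.getValue());
      }
    }
    return sparkConf;
  }
}
{code}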

      It is the Hive application code (RemoteDriver, a.k.a. RemoteSparkContext) that is responsible for serializing and deserializing the job.xml for each job (i.e., query) within the application. Thus, for supportability, we also need to provide an equivalent mechanism to print the job.xml per job, as sketched below.
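      A minimal sketch of what such a mechanism might look like (class and method names here are hypothetical, not the API added by the attached patch): dump the deserialized per-job configuration to the application log in sorted order, so it can be recovered when debugging a query after the fact.

{code:java}
import java.util.Map;
import java.util.TreeMap;

import org.apache.hadoop.conf.Configuration;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class JobConfLogger {
  private static final Logger LOG = LoggerFactory.getLogger(JobConfLogger.class);

  /**
   * Hypothetical helper: log every property of the per-job configuration,
   * sorted by key, tagged with the job id so entries from concurrent
   * queries in the long-lived application can be told apart.
   */
  public static void logJobConf(String jobId, Configuration jobConf) {
    Map<String, String> sorted = new TreeMap<>();
    for (Map.Entry<String, String> entry : jobConf) {
      sorted.put(entry.getKey(), entry.getValue());
    }
    StringBuilder sb = new StringBuilder("Job configuration for ").append(jobId).append(":\n");
    for (Map.Entry<String, String> entry : sorted.entrySet()) {
      sb.append("  ").append(entry.getKey()).append('=').append(entry.getValue()).append('\n');
    }
    LOG.info(sb.toString());
  }
}
{code}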

        Attachments

        1. HIVE-10291-spark.patch (3 kB, Szehon Ho)
        2. HIVE-10291.2-spark.patch (3 kB, Szehon Ho)
        3. HIVE-10291.3-spark.patch (2 kB, Szehon Ho)


            People

            • Assignee: Szehon Ho (szehon)
            • Reporter: Szehon Ho (szehon)
            • Votes: 0
            • Watchers: 5
