Hive > HIVE-7292 Hive on Spark > HIVE-10434

Cancel connection when remote Spark driver process has failed [Spark Branch]


Details

    • Type: Sub-task
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.2.0
    • Fix Version/s: 1.3.0, 2.0.0
    • Component/s: Spark
    • Labels: None

    Description

      Currently in HoS, SparkClientImpl first launches a remote driver process and then waits for it to connect back to HS2. However, in certain situations (for instance, a permission issue), the remote process may fail and exit with an error code. In that case, HS2 still waits for the process to connect, and only throws an exception after the full timeout period has elapsed.

      What makes it worse, the user may need to sit through two timeout periods: one for SparkSetReducerParallelism, and another for the actual Spark job. This can be very annoying.

      We should cancel the timeout task as soon as we detect that the process has failed, and mark the promise as failed.
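
      The proposed behavior can be sketched roughly as follows. This is only an illustrative sketch built on java.util.concurrent types; the real SparkClientImpl uses its own RPC and promise machinery, and the class and method names here (DriverConnectionMonitor, launchAndAwaitConnection) are hypothetical.

      // Minimal sketch (hypothetical names): fail the connection promise and cancel the
      // pending timeout task as soon as the child driver process exits abnormally,
      // instead of waiting for the full timeout to elapse.
      import java.util.concurrent.CompletableFuture;
      import java.util.concurrent.Executors;
      import java.util.concurrent.ScheduledExecutorService;
      import java.util.concurrent.ScheduledFuture;
      import java.util.concurrent.TimeUnit;
      import java.util.concurrent.TimeoutException;

      public class DriverConnectionMonitor {
        private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();

        // Launches the remote driver and returns a promise that completes when the driver
        // connects back, or fails early if the driver process dies first.
        public CompletableFuture<Void> launchAndAwaitConnection(ProcessBuilder driverCmd,
            long timeoutSecs) throws Exception {
          CompletableFuture<Void> connected = new CompletableFuture<>();
          Process driver = driverCmd.start();

          // Timeout task: fail the promise if the driver never connects back in time.
          ScheduledFuture<?> timeoutTask = scheduler.schedule(() -> {
            connected.completeExceptionally(
                new TimeoutException("Timed out waiting for remote driver to connect"));
          }, timeoutSecs, TimeUnit.SECONDS);

          // Exit monitor: if the child process dies with a non-zero code before connecting,
          // cancel the timeout task and fail the promise immediately.
          new Thread(() -> {
            try {
              int exitCode = driver.waitFor();
              if (exitCode != 0 && !connected.isDone()) {
                timeoutTask.cancel(true);
                connected.completeExceptionally(
                    new RuntimeException("Remote driver exited with code " + exitCode));
              }
            } catch (InterruptedException e) {
              Thread.currentThread().interrupt();
            }
          }, "driver-exit-monitor").start();

          // Elsewhere, the RPC server side would call connected.complete(null) once the
          // driver connects back; completing the promise makes the timeout task a no-op.
          return connected;
        }
      }

      With this arrangement the caller fails fast on a bad launch (for example, the permission problem mentioned above) rather than waiting for both timeouts.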

      Attachments

        1. HIVE-10434.4-spark.patch
          2 kB
          Chao Sun
        2. HIVE-10434.3-spark.patch
          2 kB
          Chao Sun
        3. HIVE-10434.1-spark.patch
          2 kB
          Chao Sun

            People

              Assignee: Chao Sun (csun)
              Reporter: Chao Sun (csun)
              Votes: 0
              Watchers: 3
