  Hive / HIVE-17718 Hive on Spark Debugging Improvements / HIVE-19053

RemoteSparkJobStatus#getSparkJobInfo treats all exceptions as timeout errors

    Details

    • Type: Sub-task
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 4.0.0
    • Component/s: Spark
    • Labels: None
    • Target Version/s:

      Description

          Future<SparkJobInfo> getJobInfo = sparkClient.run(
              new GetJobInfoJob(jobHandle.getClientJobId(), sparkJobId));
          try {
            return getJobInfo.get(sparkClientTimeoutInSeconds, TimeUnit.SECONDS);
          } catch (Exception e) {
            LOG.warn("Failed to get job info.", e);
            throw new HiveException(e, ErrorMsg.SPARK_GET_JOB_INFO_TIMEOUT,
                Long.toString(sparkClientTimeoutInSeconds));
          }
      

      It should only throw ErrorMsg.SPARK_GET_JOB_INFO_TIMEOUT if a TimeoutException is thrown. Other exceptions should be handled independently.
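      The suggested fix can be sketched as below: catch TimeoutException separately from execution failures and interrupts, so the timeout error is raised only for genuine timeouts. The class name, the HiveException shape, and the non-timeout ErrorMsg constants here are illustrative placeholders, not the actual HIVE-19053 patch:

      ```java
      import java.util.concurrent.CompletableFuture;
      import java.util.concurrent.ExecutionException;
      import java.util.concurrent.Future;
      import java.util.concurrent.TimeUnit;
      import java.util.concurrent.TimeoutException;

      public class GetJobInfoSketch {

        // Stand-ins for Hive's ErrorMsg constants; only
        // SPARK_GET_JOB_INFO_TIMEOUT appears in the snippet above,
        // the other two are hypothetical.
        enum ErrorMsg {
          SPARK_GET_JOB_INFO_TIMEOUT,
          SPARK_GET_JOB_INFO_INTERRUPTED,
          SPARK_GET_JOB_INFO_ERROR
        }

        // Simplified stand-in for org.apache.hadoop.hive.ql.metadata.HiveException.
        static class HiveException extends Exception {
          final ErrorMsg msg;
          HiveException(Throwable cause, ErrorMsg msg) {
            super(cause);
            this.msg = msg;
          }
        }

        static String getJobInfo(Future<String> getJobInfo, long timeoutSeconds)
            throws HiveException {
          try {
            return getJobInfo.get(timeoutSeconds, TimeUnit.SECONDS);
          } catch (TimeoutException e) {
            // Only an actual timeout maps to the timeout error.
            throw new HiveException(e, ErrorMsg.SPARK_GET_JOB_INFO_TIMEOUT);
          } catch (InterruptedException e) {
            // Restore the interrupt flag and report separately.
            Thread.currentThread().interrupt();
            throw new HiveException(e, ErrorMsg.SPARK_GET_JOB_INFO_INTERRUPTED);
          } catch (ExecutionException e) {
            // The remote job itself failed: surface a distinct error.
            throw new HiveException(e, ErrorMsg.SPARK_GET_JOB_INFO_ERROR);
          }
        }

        public static void main(String[] args) {
          // Timeout case: a future that never completes.
          try {
            getJobInfo(new CompletableFuture<>(), 1);
          } catch (HiveException e) {
            System.out.println(e.msg);  // SPARK_GET_JOB_INFO_TIMEOUT
          }
          // Failure case: a future that completed exceptionally.
          CompletableFuture<String> failed = new CompletableFuture<>();
          failed.completeExceptionally(new RuntimeException("job failed"));
          try {
            getJobInfo(failed, 1);
          } catch (HiveException e) {
            System.out.println(e.msg);  // SPARK_GET_JOB_INFO_ERROR
          }
        }
      }
      ```

      Splitting the catch clauses this way keeps the error message accurate: a timeout tells the user to consider raising the client timeout, while an ExecutionException points at a real job failure.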

        Attachments

        1. HIVE-19053.2.patch
          3 kB
          Aihua Xu
        2. HIVE-19053.1.patch
          2 kB
          Aihua Xu


            People

            • Assignee: Aihua Xu (aihuaxu)
            • Reporter: Sahil Takiar (stakiar)
            • Votes: 0
            • Watchers: 2
