  Hive / HIVE-17718 Hive on Spark Debugging Improvements / HIVE-19053

RemoteSparkJobStatus#getSparkJobInfo treats all exceptions as timeout errors


Details

    • Type: Sub-task
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 4.0.0-alpha-1
    • Component/s: Spark
    • Labels: None

    Description

          Future<SparkJobInfo> getJobInfo = sparkClient.run(
              new GetJobInfoJob(jobHandle.getClientJobId(), sparkJobId));
          try {
            return getJobInfo.get(sparkClientTimeoutInSeconds, TimeUnit.SECONDS);
          } catch (Exception e) {
            LOG.warn("Failed to get job info.", e);
            throw new HiveException(e, ErrorMsg.SPARK_GET_JOB_INFO_TIMEOUT,
                Long.toString(sparkClientTimeoutInSeconds));
          }
      

      It should only throw ErrorMsg.SPARK_GET_JOB_INFO_TIMEOUT if a TimeoutException is thrown. Other exceptions should be handled independently.
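
      A minimal sketch of that separation, mirroring the snippet above: a TimeoutException keeps the timeout error, while the other checked exceptions Future#get can throw (InterruptedException and ExecutionException) are reported separately. ErrorMsg.SPARK_GET_JOB_INFO_ERROR is a hypothetical name used for illustration, not necessarily what the attached patches introduce.

          Future<SparkJobInfo> getJobInfo = sparkClient.run(
              new GetJobInfoJob(jobHandle.getClientJobId(), sparkJobId));
          try {
            return getJobInfo.get(sparkClientTimeoutInSeconds, TimeUnit.SECONDS);
          } catch (TimeoutException e) {
            // Only a genuine timeout maps to the timeout error message.
            LOG.warn("Timed out getting job info.", e);
            throw new HiveException(e, ErrorMsg.SPARK_GET_JOB_INFO_TIMEOUT,
                Long.toString(sparkClientTimeoutInSeconds));
          } catch (InterruptedException | ExecutionException e) {
            // Other failures surface under a separate (hypothetical) error code.
            LOG.warn("Failed to get job info.", e);
            throw new HiveException(e, ErrorMsg.SPARK_GET_JOB_INFO_ERROR);
          }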

      Attachments

        1. HIVE-19053.1.patch (2 kB, Aihua Xu)
        2. HIVE-19053.2.patch (3 kB, Aihua Xu)


          People

            Assignee: Aihua Xu (aihuaxu)
            Reporter: Sahil Takiar (stakiar)
            Votes: 0
            Watchers: 2
