Hadoop Common / HADOOP-16208

Do Not Log InterruptedException in Client


    Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 3.2.0
    • Fix Version/s: 3.3.0, 3.2.1, 3.1.3, 2.10.2
    • Component/s: common
    • Labels:
      None
    • Flags:
      Patch

      Description

             } catch (InterruptedException e) {
               Thread.currentThread().interrupt();
               LOG.warn("interrupted waiting to send rpc request to server", e);
               throw new IOException(e);
             }

      https://github.com/apache/hadoop/blob/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java#L1450

      I'm working on a project that uses an ExecutorService to launch a number of threads. Each thread spins up an HDFS client connection. At any point in time, the program can terminate and call ExecutorService#shutdownNow() to forcibly close the threads via Thread#interrupt(). At that point, I get a cascade of log messages from the above code and there is no easy way to turn it off.
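The scenario above can be reproduced without HDFS at all. The sketch below is a hypothetical stand-in: the workers block in Thread.sleep() rather than in a real RPC wait, but shutdownNow() delivers the same Thread#interrupt() to each of them, which is exactly the path that triggers the logging in Client.java.

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ShutdownDemo {
    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        CountDownLatch started = new CountDownLatch(4);
        for (int i = 0; i < 4; i++) {
            pool.submit(() -> {
                started.countDown();
                try {
                    Thread.sleep(60_000); // simulates a blocking RPC wait
                } catch (InterruptedException e) {
                    // Each worker sees an InterruptedException here; in
                    // Client.java this is the point where LOG.warn fires,
                    // once per interrupted thread.
                    Thread.currentThread().interrupt();
                }
            });
        }
        started.await();
        pool.shutdownNow(); // delivers Thread#interrupt() to every worker
        boolean done = pool.awaitTermination(5, TimeUnit.SECONDS);
        System.out.println("terminated=" + done);
    }
}
```

With four workers this produces four interrupts at once, which is why the logging shows up as a cascade rather than a single line.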

      "Log and throw" is generally considered an anti-pattern; just throw the exception and move on.

      https://community.oracle.com/docs/DOC-983543#logAndThrow
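One way the catch block could avoid log-and-throw is sketched below. This is an illustration of the pattern, not necessarily the patch as committed: the interrupt flag is restored and the exception is propagated without logging, leaving it to the caller to decide whether the interrupt is worth a log line (a caller that just invoked shutdownNow() expects it and needs no noise). The InterruptedIOException wrapper and the waitForWork() method are assumptions for the demo.

```java
import java.io.IOException;
import java.io.InterruptedIOException;

public class InterruptPropagationDemo {
    // Sketch of the suggested pattern: restore the interrupt flag and
    // rethrow without logging at this layer.
    static void waitForWork() throws IOException {
        try {
            Thread.sleep(10_000); // stand-in for waiting to send the RPC request
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            // No LOG.warn here; the exception already carries the context.
            throw (IOException) new InterruptedIOException(
                "interrupted waiting to send rpc request to server").initCause(e);
        }
    }

    public static void main(String[] args) throws Exception {
        Thread t = new Thread(() -> {
            try {
                waitForWork();
            } catch (IOException e) {
                // The caller still gets the full picture: the exception type
                // and the preserved interrupt status of the thread.
                System.out.println("caught=" + e.getClass().getSimpleName()
                    + " interrupted=" + Thread.currentThread().isInterrupted());
            }
        });
        t.start();
        t.interrupt();
        t.join();
    }
}
```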

        Attachments

        1. HADOOP-16208.1.patch
          0.8 kB
          David Mollitor
        2. HADOOP-16208.2.patch
          1 kB
          David Mollitor
        3. HADOOP-16208.3.patch
          1 kB
          David Mollitor

        Issue Links

          Activity

            People

            • Assignee:
              David Mollitor (belugabehr)
            • Reporter:
              David Mollitor (belugabehr)

              Dates

              • Created:
                Updated:
                Resolved:
