Hadoop Common / HADOOP-16208

Do Not Log InterruptedException in Client


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 3.2.0
    • Fix Version/s: 3.3.0, 3.2.1, 3.1.3, 2.10.2
    • Component/s: common
    • Labels: None
    • Flags: Patch

    Description

      } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
        LOG.warn("interrupted waiting to send rpc request to server", e);
        throw new IOException(e);
      }

      https://github.com/apache/hadoop/blob/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java#L1450

      I'm working on a project that uses an ExecutorService to launch a bunch of threads. Each thread spins up an HDFS client connection. At any point in time, the program can terminate and call ExecutorService#shutdownNow() to forcibly shut down via Thread#interrupt(). At that point, I get a cascade of logging from the above code and there's no easy way to turn it off.
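The interrupt cascade described above can be reproduced without HDFS at all. The sketch below (class and method names are hypothetical, not from the Hadoop code base) shows that ExecutorService#shutdownNow() delivers Thread#interrupt() to every running worker, so any blocking call inside a task throws InterruptedException:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class ShutdownNowDemo {

    // Counts how many tasks observe the interrupt delivered by shutdownNow().
    static int runAndInterrupt(int nTasks) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(nTasks);
        CountDownLatch started = new CountDownLatch(nTasks);
        AtomicInteger interrupted = new AtomicInteger();
        for (int i = 0; i < nTasks; i++) {
            pool.submit(() -> {
                started.countDown();
                try {
                    Thread.sleep(60_000); // stands in for a blocking RPC wait
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt(); // restore the flag
                    interrupted.incrementAndGet();
                }
            });
        }
        started.await();                          // all tasks are now blocking
        pool.shutdownNow();                       // interrupts every worker
        pool.awaitTermination(5, TimeUnit.SECONDS);
        return interrupted.get();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(runAndInterrupt(2) + " tasks interrupted");
    }
}
```

If each task logged its InterruptedException the way Client.java does, a pool of N threads would emit N warnings on every shutdown, which is exactly the cascade this issue complains about.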

      "Log and throw" is generally considered an anti-pattern: the exception should simply be thrown, and the caller can decide whether it is worth logging.

      https://community.oracle.com/docs/DOC-983543#logAndThrow
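A minimal sketch of the pattern this issue advocates, assuming a hypothetical `awaitResponse` method standing in for the wait inside Client#sendRpcRequest: restore the interrupt flag and rethrow without logging. Wrapping in InterruptedIOException (rather than plain IOException) is one idiomatic choice, not necessarily what the committed patch does:

```java
import java.io.IOException;
import java.io.InterruptedIOException;

public class InterruptExample {

    // Hypothetical stand-in for the wait inside Client#sendRpcRequest.
    static void awaitResponse(Object lock) throws IOException {
        synchronized (lock) {
            try {
                lock.wait(50); // stands in for waiting to send the RPC request
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt(); // preserve interrupt status
                // No LOG.warn here: "log and throw" produces noise that the
                // caller cannot suppress; the caller logs if it cares.
                throw (IOException) new InterruptedIOException(
                        "interrupted waiting to send rpc request to server")
                        .initCause(e);
            }
        }
    }

    public static void main(String[] args) {
        Thread.currentThread().interrupt(); // simulate shutdownNow()
        try {
            awaitResponse(new Object());
        } catch (IOException e) {
            System.out.println("caught " + e.getClass().getSimpleName()
                    + ", interrupt flag = "
                    + Thread.currentThread().isInterrupted());
        }
    }
}
```

Because the interrupt flag is restored before the throw, a caller such as the ExecutorService scenario above can still observe that the thread was interrupted and shut down cleanly, with no forced log line in between.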

      Attachments

        1. HADOOP-16208.1.patch
          0.8 kB
          David Mollitor
        2. HADOOP-16208.2.patch
          1 kB
          David Mollitor
        3. HADOOP-16208.3.patch
          1 kB
          David Mollitor


People

    Assignee: belugabehr (David Mollitor)
    Reporter: belugabehr (David Mollitor)
    Votes: 0
    Watchers: 6
