Hadoop Common / HADOOP-15962

The buffer size is small when unpacking tar archives


    Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 3.3.0
    • Component/s: common, util
    • Labels: None
    • Hadoop Flags: Reviewed
    • Flags: Patch

      Description

      Not sure if this code is even being used, but it implements a copy routine using a 2 KB buffer.  Modern JVMs default to 8 KB; 4 KB should be the minimum.  Also, there are existing libraries for this kind of stream copy.

      FileUtil.java
          int count;
          byte data[] = new byte[2048];
          try (BufferedOutputStream outputStream = new BufferedOutputStream(
              new FileOutputStream(outputFile));) {
      
            while ((count = tis.read(data)) != -1) {
              outputStream.write(data, 0, count);
            }
      
            outputStream.flush();
          }
      

      I also fixed a couple of checkstyle warnings.
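
      For illustration, a minimal sketch of the kind of change being proposed (not the actual patch; the class and method names below are made up for the example, and only the buffer size and the dropped flush() differ from the quoted code):

          import java.io.BufferedOutputStream;
          import java.io.File;
          import java.io.FileOutputStream;
          import java.io.IOException;
          import java.io.InputStream;
          import java.io.OutputStream;

          public class TarEntryCopy {

            /** Illustrative copy loop with a larger buffer. */
            static void copyEntry(InputStream tis, File outputFile) throws IOException {
              // 8 KB matches the default buffer size of BufferedInputStream /
              // BufferedOutputStream in current JDKs; 4 KB would be the floor.
              byte[] data = new byte[8192];
              try (OutputStream outputStream = new BufferedOutputStream(
                  new FileOutputStream(outputFile))) {
                int count;
                while ((count = tis.read(data)) != -1) {
                  outputStream.write(data, 0, count);
                }
                // try-with-resources closes the stream, which flushes the buffer,
                // so the explicit flush() in the current code is not needed.
              }
            }
          }

      Alternatively, the hand-rolled loop could be replaced with a library call such as org.apache.hadoop.io.IOUtils.copyBytes(tis, outputStream, 8192) or Commons IO, which is the "there are libraries for this" point above.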

        Attachments

        1. HADOOP-15962.2.patch
          2 kB
          David Mollitor
        2. HADOOP-15962.1.patch
          3 kB
          David Mollitor


            People

            • Assignee: belugabehr (David Mollitor)
            • Reporter: belugabehr (David Mollitor)
            • Votes: 0
            • Watchers: 4

              Dates

              • Created:
              • Updated:
              • Resolved: