Hadoop Common · HADOOP-15962

The buffer size is small when unpacking tar archives


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 3.3.0
    • Component/s: common, util
    • Labels: None
    • Hadoop Flags: Reviewed
    • Flags: Patch

    Description

      Not sure if this code is even being used, but it implements a copy routine utilizing a 2K buffer.  Modern JVMs use 8K, but 4K should be the minimum.  Also, there are libraries for this stuff.

      FileUtil.java
          int count;
          byte data[] = new byte[2048];
          try (BufferedOutputStream outputStream = new BufferedOutputStream(
              new FileOutputStream(outputFile));) {
      
            while ((count = tis.read(data)) != -1) {
              outputStream.write(data, 0, count);
            }
      
            outputStream.flush();
          }
      

      I also fixed a couple of check-style warnings.
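As a sketch of the change the description suggests, the copy loop below uses an 8 KiB buffer (the size modern JDK copy routines default to). This is an illustrative standalone example, not the actual patch; the class and method names (`CopyDemo`, `copyStream`) are invented for the demo.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Arrays;

public class CopyDemo {
  // 8 KiB matches the default buffer size of modern JDK copy routines.
  private static final int BUFFER_SIZE = 8192;

  // Copies all bytes from in to out; returns the number of bytes copied.
  static long copyStream(InputStream in, OutputStream out) throws IOException {
    byte[] data = new byte[BUFFER_SIZE];
    long total = 0;
    int count;
    while ((count = in.read(data)) != -1) {
      out.write(data, 0, count);
      total += count;
    }
    return total;
  }

  public static void main(String[] args) throws IOException {
    // Round-trip a payload larger than one buffer to exercise the loop.
    byte[] payload = new byte[20000];
    for (int i = 0; i < payload.length; i++) {
      payload[i] = (byte) (i % 251);
    }
    ByteArrayOutputStream sink = new ByteArrayOutputStream();
    long copied = copyStream(new ByteArrayInputStream(payload), sink);
    System.out.println(copied == payload.length
        && Arrays.equals(payload, sink.toByteArray()));
  }
}
```

On Java 9+, `InputStream.transferTo(OutputStream)` does the same job with a JDK-maintained buffer size, which is one of the "libraries for this stuff" options the description alludes to.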

      Attachments

        1. HADOOP-15962.1.patch
          3 kB
          David Mollitor
        2. HADOOP-15962.2.patch
          2 kB
          David Mollitor

        Activity


          People

            Assignee: David Mollitor (belugabehr)
            Reporter: David Mollitor (belugabehr)
            Votes: 0
            Watchers: 4

            Dates

              Created:
              Updated:
              Resolved:
