Commons Compress / COMPRESS-16

unable to extract a TAR file that contains an entry which is 10 GB in size


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.4
    • Component/s: Archivers
    • Labels: None
    • Environment: I am using win xp sp3, but this should be platform independent.

    Description

      I made a TAR file that contains a single file entry which is 10 GB in size.

      When I attempt to extract the file using TarInputStream, it fails with the following stack trace:

      java.io.IOException: unexpected EOF with 24064 bytes unread
      at org.apache.commons.compress.archivers.tar.TarInputStream.read(TarInputStream.java:348)
      at org.apache.commons.compress.archivers.tar.TarInputStream.copyEntryContents(TarInputStream.java:388)

      So, TarInputStream does not seem to support large (> 8 GB?) files.
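      For reference, the classic ustar header stores an entry's size as 11 octal digits in a 12-byte field, so a plain header tops out just under 8 GiB; anything larger needs a PAX or GNU extension. A minimal, self-contained check of that limit (plain JDK arithmetic, nothing Compress-specific; the class name is just for illustration):

      public class UstarSizeLimit {
          public static void main(String[] args) {
              // The ustar "size" field holds at most 11 octal digits plus a terminator,
              // so the largest size a plain header can express is 8^11 - 1 bytes.
              long maxSize = Long.parseLong("77777777777", 8); // eleven octal 7s
              System.out.printf("%d bytes = %.3f GiB%n", maxSize, maxSize / (double) (1L << 30));
              // Prints 8589934591 bytes, i.e. just under 8 GiB, so a 10 GB entry
              // cannot be described by a plain ustar header.
          }
      }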

      Here is something else to note: I created that TAR file using TarOutputStream, which did not complain when asked to write a 10 GB file into the archive. So I assume TarOutputStream has no file size limits? Or does it silently create a corrupted TAR file (which would be the worst situation of all...)?
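      The issue is marked fixed for 1.4. A minimal sketch of how an entry larger than 8 GiB can be written and read back with the 1.4+ classes, assuming TarArchiveOutputStream's POSIX big-number mode (setBigNumberMode, available since 1.4); the file names "big.tar" and "big.bin" and the sizes are illustrative, and actually running it needs roughly 20 GB of free disk space:

      import java.io.FileInputStream;
      import java.io.FileOutputStream;
      import java.io.IOException;

      import org.apache.commons.compress.archivers.tar.TarArchiveEntry;
      import org.apache.commons.compress.archivers.tar.TarArchiveInputStream;
      import org.apache.commons.compress.archivers.tar.TarArchiveOutputStream;

      public class BigTarEntrySketch {
          public static void main(String[] args) throws IOException {
              final long tenGb = 10L * 1024 * 1024 * 1024;
              final byte[] chunk = new byte[8192];

              // Write a single 10 GB entry. BIGNUMBER_POSIX stores the oversized
              // size in a PAX extended header instead of the 11-digit octal field.
              try (TarArchiveOutputStream out =
                       new TarArchiveOutputStream(new FileOutputStream("big.tar"))) {
                  out.setBigNumberMode(TarArchiveOutputStream.BIGNUMBER_POSIX);
                  TarArchiveEntry entry = new TarArchiveEntry("big.bin");
                  entry.setSize(tenGb);
                  out.putArchiveEntry(entry);
                  for (long written = 0; written < tenGb; written += chunk.length) {
                      out.write(chunk, 0, (int) Math.min(chunk.length, tenGb - written));
                  }
                  out.closeArchiveEntry();
              }

              // Read the entry back and count the bytes; a reader that understands
              // the PAX size header should not hit the "unexpected EOF" above.
              try (TarArchiveInputStream in =
                       new TarArchiveInputStream(new FileInputStream("big.tar"))) {
                  TarArchiveEntry entry = in.getNextTarEntry();
                  long total = 0;
                  int n;
                  while ((n = in.read(chunk)) != -1) {
                      total += n;
                  }
                  System.out.println(entry.getName() + ": " + total + " bytes");
              }
          }
      }

      With the default BIGNUMBER_ERROR mode, the 1.4 writer should refuse such an entry up front rather than silently emit an undersized header, which would answer the corruption question above.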

      Attachments

        Issue Links

        Activity


          People

            Assignee: Unassigned
            Reporter: Sam Smith (bbatman)
            Votes: 2
            Watchers: 3

            Dates

              Created:
              Updated:
              Resolved:
