Hadoop HDFS / HDFS-4732

TestDFSUpgradeFromImage fails on Windows due to failure to unpack old image tarball that contains hard links


Details

    • Type: Bug
    • Status: Closed
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 3.0.0-alpha1
    • Fix Version/s: 2.1.0-beta
    • Component/s: test
    • Labels: None
    • Hadoop Flags: Reviewed

    Description

      On non-Windows, FileUtil#unTar is implemented using external Unix shell commands. On Windows, FileUtil#unTar is implemented in Java using commons-compress. TestDFSUpgradeFromImage uses a testing tarball image of an old HDFS layout version, hadoop-22-dfs-dir.tgz. This file contains hard links. It appears that commons-compress cannot handle the hard links correctly. When it unpacks the file, each hard link ends up as a 0-length file. This causes the test to fail during cluster startup, because the 0-length block files are considered corrupt.
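The failure mode can be reproduced outside Hadoop: in the tar format, a hard link is stored as a link-type entry that carries no file data, only the name of its link target, so an extractor that does not special-case link entries writes a 0-length file. A minimal sketch using Python's standard `tarfile` module (not the Hadoop code; file names are hypothetical stand-ins, not the actual contents of hadoop-22-dfs-dir.tgz):

```python
import os
import tarfile
import tempfile

# Build a tarball containing a regular file and a hard link to it.
with tempfile.TemporaryDirectory() as d:
    blk = os.path.join(d, "blk_1")
    with open(blk, "wb") as f:
        f.write(b"block data")
    # Create a hard link to the block file, then add both to the archive.
    os.link(blk, os.path.join(d, "blk_1.link"))
    tgz = os.path.join(d, "image.tgz")
    with tarfile.open(tgz, "w:gz") as tar:
        tar.add(blk, arcname="blk_1")
        # tarfile notices the shared inode and stores a hard-link entry.
        tar.add(os.path.join(d, "blk_1.link"), arcname="blk_1.link")
    with tarfile.open(tgz) as tar:
        entry = tar.getmember("blk_1.link")
        # The link entry has size 0 and a link target instead of data; an
        # extractor that ignores entry.islnk() emits a 0-length file.
        print(entry.islnk(), entry.size, entry.linkname)  # → True 0 blk_1
```

This is exactly the pattern in the block files of the test image: an extractor that materializes every entry from its stored data, without resolving link entries against their targets, produces the 0-length block files that the NameNode then flags as corrupt.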

      Attachments

        1. HDFS-4732.1.patch
          1 kB
          Chris Nauroth
        2. hadoop-22-dfs-dir.tgz
          311 kB
          Chris Nauroth



People

    Assignee: Chris Nauroth (cnauroth)
    Reporter: Chris Nauroth (cnauroth)
    Votes: 0
    Watchers: 5

