Hadoop HDFS / HDFS-4732

TestDFSUpgradeFromImage fails on Windows due to failure to unpack old image tarball that contains hard links


Details

• Type: Bug
• Status: Closed
• Priority: Minor
• Resolution: Fixed
• Affects Version/s: 3.0.0-alpha1
• Fix Version/s: 2.1.0-beta
• Component/s: test
• Labels: None
• Hadoop Flags: Reviewed

Description

On non-Windows platforms, FileUtil#unTar is implemented by shelling out to external Unix commands. On Windows, it is implemented in Java using commons-compress. TestDFSUpgradeFromImage uses a testing tarball image of an old HDFS layout version, hadoop-22-dfs-dir.tgz, which contains hard links. The commons-compress-based code path does not handle the hard links correctly: when it unpacks the file, each hard link ends up as a 0-length file. This causes the test to fail during cluster startup, because the 0-length block files are considered corrupt.
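The underlying tar semantics explain the 0-length files: a hard-link entry in a tar archive carries no file data, only the name of a previously stored entry, so an unpacker that treats every entry as a regular file writes an empty file. A minimal sketch in Python's tarfile module (not the Hadoop code; the block file names here are made up) shows what a correct unpacker must do when it meets a hard-link entry:

```python
import io
import os
import tarfile
import tempfile

# Build a tarball containing a regular file and a hard link to it,
# the same shape of content as hadoop-22-dfs-dir.tgz (names invented).
with tempfile.TemporaryDirectory() as src:
    blk = os.path.join(src, "blk_1234")
    with open(blk, "wb") as f:
        f.write(b"block data")          # 10 bytes
    os.link(blk, os.path.join(src, "blk_1234_link"))

    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w:gz") as tar:
        tar.add(blk, arcname="blk_1234")
        # Same inode as blk_1234, so tarfile stores this entry as a
        # hard link (LNKTYPE) with no data, only a link name.
        tar.add(os.path.join(src, "blk_1234_link"), arcname="blk_1234_link")

buf.seek(0)
with tempfile.TemporaryDirectory() as dst, \
        tarfile.open(fileobj=buf, mode="r:gz") as tar:
    for entry in tar:
        if entry.islnk():
            # Hard-link entry: there is no data in the archive to
            # extract. Re-create the link (or copy) from the already
            # extracted target. Skipping this step and writing the
            # (empty) entry body is exactly the bug in this issue.
            os.link(os.path.join(dst, entry.linkname),
                    os.path.join(dst, entry.name))
        else:
            tar.extract(entry, dst)
    sizes = {e.name: os.path.getsize(os.path.join(dst, e.name))
             for e in tar.getmembers()}

print(sizes)  # both entries keep the full 10-byte content
```

A Java unpacker built on commons-compress has to make the equivalent check on each TarArchiveEntry before writing the entry body, which is the kind of gap the attached patch works around by regenerating the test image.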

Attachments

1. HDFS-4732.1.patch (1 kB, Chris Nauroth)
2. hadoop-22-dfs-dir.tgz (311 kB, Chris Nauroth)


People

• Assignee: cnauroth (Chris Nauroth)
• Reporter: cnauroth (Chris Nauroth)
• Votes: 0
• Watchers: 5
