Hadoop Map/Reduce
MAPREDUCE-1029

TestCopyFiles fails on testHftpAccessControl()

    Details

    • Type: Bug
    • Status: Closed
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 0.21.0
    • Fix Version/s: 0.21.0
    • Component/s: build
    • Labels: None
    • Hadoop Flags: Reviewed

      Description

      Log:
      Testcase: testHftpAccessControl took 2.692 sec
      FAILED
      expected:<-3> but was:<-999>
      junit.framework.AssertionFailedError: expected:<-3> but was:<-999>
      at org.apache.hadoop.tools.TestCopyFiles.testHftpAccessControl(TestCopyFiles.java:853)

      Attachments

      1. mr-1029.patch
        0.5 kB
        Jothi Padmanabhan
      2. MAPREDUCE-1029.2.patch
        0.6 kB
        Aaron Kimball

        Activity

        Hudson added a comment -

        Integrated in Hadoop-Mapreduce-trunk #112 (See http://hudson.zones.apache.org/hudson/job/Hadoop-Mapreduce-trunk/112/)
        MAPREDUCE-1029. Fix failing TestCopyFiles by restoring the unzipping of
        HDFS webapps from the hdfs jar. Contributed by Aaron Kimball and Jothi Padmanabhan

        Hudson added a comment -

        Integrated in Hadoop-Mapreduce-trunk-Commit #74 (See http://hudson.zones.apache.org/hudson/job/Hadoop-Mapreduce-trunk-Commit/74/)
        MAPREDUCE-1029. Fix failing TestCopyFiles by restoring the unzipping of
        HDFS webapps from the hdfs jar. Contributed by Aaron Kimball and Jothi Padmanabhan

        Chris Douglas added a comment -

        I committed this. Thanks Jothi and Aaron!

        Chris Douglas added a comment -

        I got the following failure on the 0.21 branch:

        Testcase: testHftpAccessControl took 1.414 sec
                Caused an ERROR
        null
        java.lang.StackOverflowError
                at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:598)
                at java.lang.StringBuffer.append(StringBuffer.java:329)
                at java.net.URI.appendAuthority(URI.java:1812)
                at java.net.URI.appendSchemeSpecificPart(URI.java:1871)
                at java.net.URI.toString(URI.java:1903)
                at java.net.URI.<init>(URI.java:659)
                at org.apache.hadoop.hdfs.HftpFileSystem.getUri(HftpFileSystem.java:96)
                at org.apache.hadoop.fs.Path.makeQualified(Path.java:297)
                at org.apache.hadoop.hdfs.HftpFileSystem.getWorkingDirectory(HftpFileSystem.java:289)
                at org.apache.hadoop.fs.Path.makeQualified(Path.java:297)
                at org.apache.hadoop.hdfs.HftpFileSystem.getWorkingDirectory(HftpFileSystem.java:289)
                at org.apache.hadoop.fs.Path.makeQualified(Path.java:297)
                at org.apache.hadoop.hdfs.HftpFileSystem.getWorkingDirectory(HftpFileSystem.java:289)
                at org.apache.hadoop.fs.Path.makeQualified(Path.java:297)
                at org.apache.hadoop.hdfs.HftpFileSystem.getWorkingDirectory(HftpFileSystem.java:289)
        [...]
        

        Updating the hdfs jar to the most recent version of the HDFS 0.21 branch resolved this, however.
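
        The cycle in that trace (Path.makeQualified calling HftpFileSystem.getWorkingDirectory, which calls makeQualified again) is an unbounded mutual recursion. A minimal, self-contained sketch of the same failure mode (illustrative only; the method names merely echo the Hadoop ones, this is not Hadoop code):

        ```java
        // Two methods that each define themselves in terms of the other,
        // mirroring the makeQualified <-> getWorkingDirectory cycle above.
        public class MutualRecursionDemo {
            // makeQualified asks the "filesystem" for its working directory...
            static String makeQualified(String path) {
                return getWorkingDirectory() + "/" + path;
            }

            // ...and getWorkingDirectory qualifies its answer, recursing forever.
            static String getWorkingDirectory() {
                return makeQualified("user");
            }

            public static void main(String[] args) {
                try {
                    System.out.println(makeQualified("data"));
                } catch (StackOverflowError e) {
                    // No base case exists, so the call chain grows until the
                    // stack is exhausted, as in the JUnit error above.
                    System.out.println("StackOverflowError: unbounded mutual recursion");
                }
            }
        }
        ```

        In this sketch nothing terminates the cycle; per Chris's note above, the real fix on the 0.21 branch was simply updating the stale hdfs jar.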

        Aaron Kimball added a comment -

        Those release-audit warnings apply to the autogenerated files unzipped from hdfs.jar. Ok to increase warning count.

        Hadoop QA added a comment -

        -1 overall. Here are the results of testing the latest attachment
        http://issues.apache.org/jira/secure/attachment/12421883/MAPREDUCE-1029.2.patch
        against trunk revision 824273.

        +1 @author. The patch does not contain any @author tags.

        -1 tests included. The patch doesn't appear to include any new or modified tests.
        Please justify why no new tests are needed for this patch.
        Also please list what manual steps were performed to verify this patch.

        +1 javadoc. The javadoc tool did not generate any warning messages.

        +1 javac. The applied patch does not increase the total number of javac compiler warnings.

        +1 findbugs. The patch does not introduce any new Findbugs warnings.

        -1 release audit. The applied patch generated 175 release audit warnings (more than the trunk's current 172 warnings).

        +1 core tests. The patch passed core unit tests.

        +1 contrib tests. The patch passed contrib unit tests.

        Test results: http://hudson.zones.apache.org/hudson/job/Mapreduce-Patch-h6.grid.sp2.yahoo.net/161/testReport/
        Release audit warnings: http://hudson.zones.apache.org/hudson/job/Mapreduce-Patch-h6.grid.sp2.yahoo.net/161/artifact/trunk/patchprocess/releaseAuditDiffWarnings.txt
        Findbugs warnings: http://hudson.zones.apache.org/hudson/job/Mapreduce-Patch-h6.grid.sp2.yahoo.net/161/artifact/trunk/build/test/findbugs/newPatchFindbugsWarnings.html
        Checkstyle results: http://hudson.zones.apache.org/hudson/job/Mapreduce-Patch-h6.grid.sp2.yahoo.net/161/artifact/trunk/build/test/checkstyle-errors.html
        Console output: http://hudson.zones.apache.org/hudson/job/Mapreduce-Patch-h6.grid.sp2.yahoo.net/161/console

        This message is automatically generated.

        Jothi Padmanabhan added a comment -

        Running Aaron's patch through Hudson

        Aaron Kimball added a comment -

        More specifically, the build/webapps/ directory contains WEB-INF/web.xml files that define the sitemap that jetty uses. The ones from the hdfs jar will block the inclusion of new servlets via the web.xml generated by the compilation process.

        This patch unzips only the HDFS-specific webapps subdirectories. TestCopyFiles passes locally.
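
        A rough sketch of the selective extraction the patch performs in build.xml, shown here in plain Java with java.util.zip rather than Ant. The entry names are hypothetical examples, assuming the HDFS-specific webapps live under webapps/hdfs/ and webapps/datanode/ while MapReduce's own (e.g. webapps/task/) must not be overwritten:

        ```java
        import java.io.IOException;
        import java.nio.file.Files;
        import java.nio.file.Path;
        import java.util.zip.ZipEntry;
        import java.util.zip.ZipInputStream;
        import java.util.zip.ZipOutputStream;

        public class SelectiveUnzipDemo {
            public static void main(String[] args) throws IOException {
                Path tmp = Files.createTempDirectory("webapps-demo");
                Path jar = tmp.resolve("hdfs-demo.jar");

                // Build a toy jar bundling webapps for several daemons,
                // like the pre-packaged hdfs jar described above.
                try (ZipOutputStream out =
                        new ZipOutputStream(Files.newOutputStream(jar))) {
                    for (String name : new String[] {
                            "webapps/hdfs/WEB-INF/web.xml",
                            "webapps/datanode/WEB-INF/web.xml",
                            "webapps/task/WEB-INF/web.xml" }) {
                        out.putNextEntry(new ZipEntry(name));
                        out.write("<web-app/>".getBytes("UTF-8"));
                        out.closeEntry();
                    }
                }

                // Extract only the HDFS-specific subdirectories, leaving the
                // MapReduce webapps (and their generated web.xml) untouched.
                try (ZipInputStream in =
                        new ZipInputStream(Files.newInputStream(jar))) {
                    ZipEntry e;
                    while ((e = in.getNextEntry()) != null) {
                        if (e.getName().startsWith("webapps/hdfs/")
                                || e.getName().startsWith("webapps/datanode/")) {
                            Path dest = tmp.resolve(e.getName());
                            Files.createDirectories(dest.getParent());
                            Files.copy(in, dest);
                            System.out.println("extracted " + e.getName());
                        }
                    }
                }
            }
        }
        ```

        The point of the filter is the one Aaron makes above: a blanket unzip would also drop stale WEB-INF/web.xml files over the freshly generated MapReduce ones.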

        Aaron Kimball added a comment -

        The build process doesn't build webapps from their .jsp sources if the unzip is in there. It takes the copies out of the pre-packaged jars (which each bundle the webapps for all daemons) and uses them on top of the source-built copies.

        Vinod Kumar Vavilapalli added a comment -

        The unzip was removed after this comment by Aaron Kimball on MAPREDUCE-679.

        # updates build.xml to fix bug in webapp compilation.

        Not sure of its meaning, but I think we should confirm the reason before adding it back, just to be sure. Aaron?

        Chris Douglas added a comment -

        It looks like this was introduced by MAPREDUCE-679. And it was caught by Hudson. sigh

        Yes, the release audit warnings are from the unzipped files. If the files require license headers, they should be added in HDFS.

        Tsz Wo Nicholas Sze added a comment -

        The patch works! TestCopyFiles did not fail. However, there are release audit warnings. Is it because of the unzip?

        Hadoop QA added a comment -

        -1 overall. Here are the results of testing the latest attachment
        http://issues.apache.org/jira/secure/attachment/12421729/mr-1029.patch
        against trunk revision 823227.

        +1 @author. The patch does not contain any @author tags.

        -1 tests included. The patch doesn't appear to include any new or modified tests.
        Please justify why no new tests are needed for this patch.
        Also please list what manual steps were performed to verify this patch.

        +1 javadoc. The javadoc tool did not generate any warning messages.

        +1 javac. The applied patch does not increase the total number of javac compiler warnings.

        +1 findbugs. The patch does not introduce any new Findbugs warnings.

        -1 release audit. The applied patch generated 175 release audit warnings (more than the trunk's current 172 warnings).

        +1 core tests. The patch passed core unit tests.

        +1 contrib tests. The patch passed contrib unit tests.

        Test results: http://hudson.zones.apache.org/hudson/job/Mapreduce-Patch-h6.grid.sp2.yahoo.net/155/testReport/
        Release audit warnings: http://hudson.zones.apache.org/hudson/job/Mapreduce-Patch-h6.grid.sp2.yahoo.net/155/artifact/trunk/patchprocess/releaseAuditDiffWarnings.txt
        Findbugs warnings: http://hudson.zones.apache.org/hudson/job/Mapreduce-Patch-h6.grid.sp2.yahoo.net/155/artifact/trunk/build/test/findbugs/newPatchFindbugsWarnings.html
        Checkstyle results: http://hudson.zones.apache.org/hudson/job/Mapreduce-Patch-h6.grid.sp2.yahoo.net/155/artifact/trunk/build/test/checkstyle-errors.html
        Console output: http://hudson.zones.apache.org/hudson/job/Mapreduce-Patch-h6.grid.sp2.yahoo.net/155/console

        This message is automatically generated.

        Jothi Padmanabhan added a comment -

        Patch that restores the unzipping of hdfs/webapps during build.

        Chris Douglas added a comment -

        do you know which issue caused this? This is the same reason for which HDFS NameNode UI doesn't come up when I start HDFS from inside mapreduce project

        No, the preceding was just a guess. I haven't looked into the regression, but noticed the webapp error messages around the same time the rest of the HDFS web UIs stopped working (and it would explain why hftp fails).

        Vinod Kumar Vavilapalli added a comment -

        This fails on 0.21 branch too. Marking it as a blocker.

        This may be because the hdfs webapps aren't on the classpath, so MiniDFSCluster isn't starting Jetty, so HftpFileSystem (which uses http) fails to work.

        Chris, do you know which issue caused this? This is the same reason for which HDFS NameNode UI doesn't come up when I start HDFS from inside mapreduce project. This used to work well until very recently.

        Chris Douglas added a comment -

        This may be because the hdfs webapps aren't on the classpath, so MiniDFSCluster isn't starting Jetty, so HftpFileSystem (which uses http) fails to work.


          People

          • Assignee: Jothi Padmanabhan
          • Reporter: Amar Kamat
          • Votes: 0
          • Watchers: 6
