Hadoop Map/Reduce
MAPREDUCE-1029

TestCopyFiles fails on testHftpAccessControl()

    Details

    • Type: Bug
    • Status: Closed
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 0.21.0
    • Fix Version/s: 0.21.0
    • Component/s: build
    • Labels:
      None
    • Hadoop Flags:
      Reviewed

      Description

      Log:
      Testcase: testHftpAccessControl took 2.692 sec
      FAILED
      expected:<-3> but was:<-999>
      junit.framework.AssertionFailedError: expected:<-3> but was:<-999>
      at org.apache.hadoop.tools.TestCopyFiles.testHftpAccessControl(TestCopyFiles.java:853)

      1. MAPREDUCE-1029.2.patch
        0.6 kB
        Aaron Kimball
      2. mr-1029.patch
        0.5 kB
        Jothi Padmanabhan

        Activity

        Amar Kamat created issue -
        Chris Douglas added a comment -

        This may be because the hdfs webapps aren't on the classpath, so MiniDFSCluster isn't starting Jetty, so HftpFileSystem (which uses http) fails to work.

        Vinod Kumar Vavilapalli added a comment -

        This fails on 0.21 branch too. Marking it as a blocker.

        This may be because the hdfs webapps aren't on the classpath, so MiniDFSCluster isn't starting Jetty, so HftpFileSystem (which uses http) fails to work.

        Chris, do you know which issue caused this? This is the same reason for which HDFS NameNode UI doesn't come up when I start HDFS from inside mapreduce project. This used to work well until very recently.

        Vinod Kumar Vavilapalli made changes -
        Fix Version/s: 0.21.0 [ 12314045 ]
        Affects Version/s: 0.21.0 [ 12314045 ]
        Priority: Major [ 3 ] → Blocker [ 1 ]
        Chris Douglas added a comment -

        do you know which issue caused this? This is the same reason for which HDFS NameNode UI doesn't come up when I start HDFS from inside mapreduce project

        No, the preceding was just a guess. I haven't looked into the regression, but noticed the webapp error messages around the same time the rest of the HDFS web UIs stopped working (and it would explain why hftp fails).

        Jothi Padmanabhan added a comment -

        Patch that restores the unzipping of hdfs/webapps during build.

        Jothi Padmanabhan made changes -
        Attachment mr-1029.patch [ 12421729 ]
        Jothi Padmanabhan made changes -
        Status: Open [ 1 ] → Patch Available [ 10002 ]
        Assignee: Jothi Padmanabhan [ jothipn ]
        Hadoop QA added a comment -

        -1 overall. Here are the results of testing the latest attachment
        http://issues.apache.org/jira/secure/attachment/12421729/mr-1029.patch
        against trunk revision 823227.

        +1 @author. The patch does not contain any @author tags.

        -1 tests included. The patch doesn't appear to include any new or modified tests.
        Please justify why no new tests are needed for this patch.
        Also please list what manual steps were performed to verify this patch.

        +1 javadoc. The javadoc tool did not generate any warning messages.

        +1 javac. The applied patch does not increase the total number of javac compiler warnings.

        +1 findbugs. The patch does not introduce any new Findbugs warnings.

        -1 release audit. The applied patch generated 175 release audit warnings (more than the trunk's current 172 warnings).

        +1 core tests. The patch passed core unit tests.

        +1 contrib tests. The patch passed contrib unit tests.

        Test results: http://hudson.zones.apache.org/hudson/job/Mapreduce-Patch-h6.grid.sp2.yahoo.net/155/testReport/
        Release audit warnings: http://hudson.zones.apache.org/hudson/job/Mapreduce-Patch-h6.grid.sp2.yahoo.net/155/artifact/trunk/patchprocess/releaseAuditDiffWarnings.txt
        Findbugs warnings: http://hudson.zones.apache.org/hudson/job/Mapreduce-Patch-h6.grid.sp2.yahoo.net/155/artifact/trunk/build/test/findbugs/newPatchFindbugsWarnings.html
        Checkstyle results: http://hudson.zones.apache.org/hudson/job/Mapreduce-Patch-h6.grid.sp2.yahoo.net/155/artifact/trunk/build/test/checkstyle-errors.html
        Console output: http://hudson.zones.apache.org/hudson/job/Mapreduce-Patch-h6.grid.sp2.yahoo.net/155/console

        This message is automatically generated.

        Tsz Wo Nicholas Sze added a comment -

        The patch works! TestCopyFiles did not fail. However, there are release audit warnings. Is it because of the unzip?

        Tsz Wo Nicholas Sze made changes -
        Component/s build [ 12312909 ]
        Component/s distcp [ 12312902 ]
        Chris Douglas added a comment -

        It looks like this was introduced by MAPREDUCE-679. And it was caught by Hudson. sigh

        Yes, the release audit warnings are from the unzipped files. If the files require license headers, they should be added in HDFS.

        Vinod Kumar Vavilapalli added a comment -

        The unzip was removed after this comment by Aaron Kimball on MAPREDUCE-679.

        # updates build.xml to fix bug in webapp compilation.

        Not sure of its meaning, but I think we should confirm the reason before adding it back, just to be sure. Aaron?

        Aaron Kimball added a comment -

        The build process doesn't build webapps from their .jsp sources if the unzip is in there. It takes the copies out of the pre-packaged jars (which all bundle all of the webapps for all daemons) and uses them on top of the source-based copies.

        Aaron Kimball added a comment -

        More specifically, the build/webapps/ directory contains WEB-INF/web.xml files that define the sitemap that jetty uses. The ones from the hdfs jar will block the inclusion of new servlets via the web.xml generated by the compilation process.

        This patch unzips only the HDFS-specific webapps subdirectories. TestCopyFiles passes locally.
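A rough sketch of what such a selective unzip could look like as an Ant step. The property names, jar pattern, and webapp directory names here are illustrative assumptions, not the actual contents of the patch or of build.xml; the point is extracting only the HDFS-specific `webapps/` subtrees so the mapreduce build's own generated WEB-INF/web.xml files are not overwritten:

```xml
<!-- Illustrative sketch only: property names and include patterns are
     assumptions. Extract just the HDFS-specific webapps subtrees from
     the hdfs jar, leaving the generated WEB-INF/web.xml files intact. -->
<unzip dest="${build.dir}">
  <fileset dir="${lib.dir}" includes="hadoop-hdfs-*.jar"/>
  <patternset>
    <include name="webapps/hdfs/**"/>
    <include name="webapps/datanode/**"/>
    <include name="webapps/secondary/**"/>
  </patternset>
</unzip>
```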

        Aaron Kimball made changes -
        Attachment MAPREDUCE-1029.2.patch [ 12421883 ]
        Jothi Padmanabhan made changes -
        Status: Patch Available [ 10002 ] → Open [ 1 ]
        Jothi Padmanabhan added a comment -

        Running Aaron's patch through Hudson

        Jothi Padmanabhan made changes -
        Status: Open [ 1 ] → Patch Available [ 10002 ]
        Hadoop QA added a comment -

        -1 overall. Here are the results of testing the latest attachment
        http://issues.apache.org/jira/secure/attachment/12421883/MAPREDUCE-1029.2.patch
        against trunk revision 824273.

        +1 @author. The patch does not contain any @author tags.

        -1 tests included. The patch doesn't appear to include any new or modified tests.
        Please justify why no new tests are needed for this patch.
        Also please list what manual steps were performed to verify this patch.

        +1 javadoc. The javadoc tool did not generate any warning messages.

        +1 javac. The applied patch does not increase the total number of javac compiler warnings.

        +1 findbugs. The patch does not introduce any new Findbugs warnings.

        -1 release audit. The applied patch generated 175 release audit warnings (more than the trunk's current 172 warnings).

        +1 core tests. The patch passed core unit tests.

        +1 contrib tests. The patch passed contrib unit tests.

        Test results: http://hudson.zones.apache.org/hudson/job/Mapreduce-Patch-h6.grid.sp2.yahoo.net/161/testReport/
        Release audit warnings: http://hudson.zones.apache.org/hudson/job/Mapreduce-Patch-h6.grid.sp2.yahoo.net/161/artifact/trunk/patchprocess/releaseAuditDiffWarnings.txt
        Findbugs warnings: http://hudson.zones.apache.org/hudson/job/Mapreduce-Patch-h6.grid.sp2.yahoo.net/161/artifact/trunk/build/test/findbugs/newPatchFindbugsWarnings.html
        Checkstyle results: http://hudson.zones.apache.org/hudson/job/Mapreduce-Patch-h6.grid.sp2.yahoo.net/161/artifact/trunk/build/test/checkstyle-errors.html
        Console output: http://hudson.zones.apache.org/hudson/job/Mapreduce-Patch-h6.grid.sp2.yahoo.net/161/console

        This message is automatically generated.

        Aaron Kimball added a comment -

        Those release-audit warnings apply to the autogenerated files unzipped from hdfs.jar. Ok to increase warning count.

        Chris Douglas added a comment -

        I got the following failure on the 0.21 branch:

        Testcase: testHftpAccessControl took 1.414 sec
                Caused an ERROR
        null
        java.lang.StackOverflowError
                at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:598)
                at java.lang.StringBuffer.append(StringBuffer.java:329)
                at java.net.URI.appendAuthority(URI.java:1812)
                at java.net.URI.appendSchemeSpecificPart(URI.java:1871)
                at java.net.URI.toString(URI.java:1903)
                at java.net.URI.<init>(URI.java:659)
                at org.apache.hadoop.hdfs.HftpFileSystem.getUri(HftpFileSystem.java:96)
                at org.apache.hadoop.fs.Path.makeQualified(Path.java:297)
                at org.apache.hadoop.hdfs.HftpFileSystem.getWorkingDirectory(HftpFileSystem.java:289)
                at org.apache.hadoop.fs.Path.makeQualified(Path.java:297)
                at org.apache.hadoop.hdfs.HftpFileSystem.getWorkingDirectory(HftpFileSystem.java:289)
                at org.apache.hadoop.fs.Path.makeQualified(Path.java:297)
                at org.apache.hadoop.hdfs.HftpFileSystem.getWorkingDirectory(HftpFileSystem.java:289)
                at org.apache.hadoop.fs.Path.makeQualified(Path.java:297)
                at org.apache.hadoop.hdfs.HftpFileSystem.getWorkingDirectory(HftpFileSystem.java:289)
        [...]
        

        Updating the hdfs jar to the most recent version of the HDFS 0.21 branch resolved this, however.
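The trace above shows Path.makeQualified and HftpFileSystem.getWorkingDirectory invoking each other until the stack is exhausted. A minimal Java sketch of that mutual-recursion pattern (the names mirror the trace, but this is a hypothetical stand-in, not the Hadoop implementation):

```java
public class RecursionSketch {
    static int depth;

    // Stand-in for Path.makeQualified: resolves a path against the
    // filesystem's working directory.
    static String makeQualified(String path) {
        return getWorkingDirectory() + "/" + path;
    }

    // Stand-in for HftpFileSystem.getWorkingDirectory: builds its result
    // by qualifying a path again, so there is no base case and the two
    // methods recurse into each other until the stack overflows.
    static String getWorkingDirectory() {
        depth++;
        return makeQualified("user");
    }

    // Returns the recursion depth reached before the overflow.
    static int run() {
        depth = 0;
        try {
            makeQualified("file.txt");
        } catch (StackOverflowError e) {
            // expected: mutual recursion exhausts the stack
        }
        return depth;
    }

    public static void main(String[] args) {
        System.out.println("overflowed after depth " + run());
    }
}
```

The fix on the HDFS side breaks this cycle; updating the hdfs jar picks it up.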

        Chris Douglas added a comment -

        I committed this. Thanks Jothi and Aaron!

        Chris Douglas made changes -
        Status: Patch Available [ 10002 ] → Resolved [ 5 ]
        Hadoop Flags: [Reviewed]
        Resolution: Fixed [ 1 ]
        Hudson added a comment -

        Integrated in Hadoop-Mapreduce-trunk-Commit #74 (See http://hudson.zones.apache.org/hudson/job/Hadoop-Mapreduce-trunk-Commit/74/)
        Fix failing TestCopyFiles by restoring the unzipping of
        HDFS webapps from the hdfs jar. Contributed by Aaron Kimball and Jothi Padmanabhan

        Hudson added a comment -

        Integrated in Hadoop-Mapreduce-trunk #112 (See http://hudson.zones.apache.org/hudson/job/Hadoop-Mapreduce-trunk/112/)
        Fix failing TestCopyFiles by restoring the unzipping of
        HDFS webapps from the hdfs jar. Contributed by Aaron Kimball and Jothi Padmanabhan

        Tom White made changes -
        Status: Resolved [ 5 ] → Closed [ 6 ]
        Transition | Time In Source Status | Executions | Last Executer | Last Execution Date
        Patch Available → Open | 3d 18h 29m | 1 | Jothi Padmanabhan | 13/Oct/09 07:05
        Open → Patch Available | 16d 18m | 2 | Jothi Padmanabhan | 13/Oct/09 07:05
        Patch Available → Resolved | 1d 2h 19m | 1 | Chris Douglas | 14/Oct/09 09:24
        Resolved → Closed | 314d 12h 53m | 1 | Tom White | 24/Aug/10 22:18

  People

  • Assignee: Jothi Padmanabhan
  • Reporter: Amar Kamat
  • Votes: 0
  • Watchers: 6
