Details

    • Type: Improvement
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 0.22.0
    • Component/s: None
    • Labels:
      None
    • Hadoop Flags:
      Reviewed

      Description

      If the file is not accessible, HftpFileSystem returns only an HTTP response code.

      2010-08-27 20:57:48,091 INFO org.apache.hadoop.tools.DistCp: FAIL README.txt : java.io.IOException:
       Server returned HTTP response code: 400 for URL: http://namenode:50070/data/user/tsz/README.txt?ugi=tsz,users
              at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1290)
              at org.apache.hadoop.hdfs.HftpFileSystem.open(HftpFileSystem.java:143)
              at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:356)
              ...
       
      1. h1383_20100913_y20.patch
        4 kB
        Tsz Wo Nicholas Sze
      2. h1383_20100915_y20.patch
        18 kB
        Tsz Wo Nicholas Sze
      3. h1383_20100915b_y20.patch
        18 kB
        Tsz Wo Nicholas Sze
      4. h1383_20100915b.patch
        18 kB
        Tsz Wo Nicholas Sze

        Activity

        Tsz Wo Nicholas Sze added a comment -

        h1383_20100913_y20.patch: a patch for y20.

        Tsz Wo Nicholas Sze added a comment -

        Here is a description of the patch.

        • In HftpFileSystem, call getResponseMessage() and show the message if there is an IOException.
        • Fix the error messages in FileDataServlet.
        • In ListPathsServlet, catch IOException and write it as XML.

        I tested it manually and will see if I can add some unit tests.
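
        A minimal sketch of the idea in the first bullet, assuming the connection is a plain java.net.HttpURLConnection (the class and method names here are illustrative, not the actual patch):

        import java.io.IOException;
        import java.io.InputStream;
        import java.net.HttpURLConnection;
        import java.net.URL;

        // Illustrative only: surface the HTTP response message instead of just the status code.
        class HftpOpenSketch {
          static InputStream openStream(URL url) throws IOException {
            final HttpURLConnection connection = (HttpURLConnection) url.openConnection();
            connection.connect();
            try {
              return connection.getInputStream();
            } catch (IOException ioe) {
              // getResponseMessage() may be null, hence the s != null guard discussed in the review below.
              final String s = connection.getResponseMessage();
              if (s == null) {
                throw ioe;
              }
              throw new IOException(s + " (error code=" + connection.getResponseCode() + ")", ioe);
            }
          }
        }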

        Suresh Srinivas added a comment -

        Comment:

        1. HftpFileSystem#open() - should you just check s != null? Otherwise s might be null and the exception message could look weird. Code = -1 should be fine, right?
        2. Can you please attach an example of the printed exception?
        Tsz Wo Nicholas Sze added a comment -

        > 1. HftpFileSystem#open() - should you just check s != null? ...

        It sounds good.

        > 2. Can you please attach an example of printed exception?

        The existing problem is shown below. I will post the fixed output later.

        [root@host yahoo-hadoop-0.20.1xx]# ./bin/hadoop fs -put README.txt foo/s.txt
        [root@host yahoo-hadoop-0.20.1xx]# ./bin/hadoop fs -chmod 000 foo           
        [root@host yahoo-hadoop-0.20.1xx]# ./bin/hadoop fs -cat hftp://host.xx.yy:50070/user/root/foo/s.txt .
        [Fatal Error] :1:171: XML document structures must start and end within the same entity.
        cat: invalid xml directory content
        [root@host yahoo-hadoop-0.20.1xx]# ./bin/hadoop fs -ls hftp://host.xx.yy:50070/user/root/foo/s.txt
        [Fatal Error] :1:171: XML document structures must start and end within the same entity.
        ls: invalid xml directory content
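
        The "invalid xml directory content" errors above happen because the servlet fails partway through writing the directory listing, so the client-side XML parser sees a truncated document. A minimal sketch of the ListPathsServlet idea from the patch description, using the plain servlet API (ListPathsSketchServlet and writeListing are hypothetical names, and a real implementation would also escape the message):

        import java.io.IOException;
        import java.io.PrintWriter;
        import javax.servlet.http.HttpServlet;
        import javax.servlet.http.HttpServletRequest;
        import javax.servlet.http.HttpServletResponse;

        // Illustrative only: report the failure as an XML element so the document stays well-formed.
        public class ListPathsSketchServlet extends HttpServlet {
          @Override
          protected void doGet(HttpServletRequest request, HttpServletResponse response)
              throws IOException {
            response.setContentType("application/xml");
            final PrintWriter out = response.getWriter();
            out.println("<?xml version=\"1.0\" encoding=\"UTF-8\"?>");
            out.println("<listing>");
            try {
              writeListing(out, request.getPathInfo());  // may throw, e.g. on a permission error
            } catch (IOException ioe) {
              out.println("  <IOException message=\"" + ioe.getMessage() + "\"/>");
            }
            out.println("</listing>");
          }

          private void writeListing(PrintWriter out, String path) throws IOException {
            // write one element per file or directory under 'path'; elided here
          }
        }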
        
        Tsz Wo Nicholas Sze added a comment -

        h1383_20100915_y20.patch: changed to check s != null and added a few unit tests.

        Tsz Wo Nicholas Sze added a comment -

        Below are the outputs after the patch.

        [root@host yahoo-hadoop-0.20.1xx]# ./bin/hadoop fs -cat hftp://host.xx.yy:50070/user/root/foo/s.txt
        cat: user=root, access=EXECUTE, inode="foo":root:supergroup:---------
        
        [root@host yahoo-hadoop-0.20.1xx]# ./bin/hadoop fs -cat hftp://host.xx.yy:50070/user/root/foo
        cat: user=root, access=READ_EXECUTE, inode="foo":root:supergroup:---------
        
        [root@host yahoo-hadoop-0.20.1xx]# ./bin/hadoop fs -cat hftp://host.xx.yy:50070/user/root/bar
        cat: /user/root/bar is a directory (error code=400)
        
        Suresh Srinivas added a comment -

        +1 for the patch.

        One minor comment - there is a change to UGI in this patch. Was that intentional?

        Tsz Wo Nicholas Sze added a comment -

        No, I don't want to change UGI. Thanks for reviewing it.

        Tsz Wo Nicholas Sze added a comment -

        h1383_20100915b_y20.patch: reverted the change in UserGroupInformation

        Tsz Wo Nicholas Sze added a comment -

        h1383_20100915b.patch: for trunk

        Tsz Wo Nicholas Sze added a comment -

        Ran unit tests. TestFiHFlush failed. See HDFS-1206.

        Tsz Wo Nicholas Sze added a comment -

        ant test-patch

             [exec] +1 overall.  
             [exec] 
             [exec]     +1 @author.  The patch does not contain any @author tags.
             [exec] 
             [exec]     +1 tests included.  The patch appears to include 17 new or modified tests.
             [exec] 
             [exec]     +1 javadoc.  The javadoc tool did not generate any warning messages.
             [exec] 
             [exec]     +1 javac.  The applied patch does not increase the total number of javac compiler warnings.
             [exec] 
             [exec]     +1 findbugs.  The patch does not introduce any new Findbugs warnings.
             [exec] 
             [exec]     +1 release audit.  The applied patch does not increase the total number of release audit warnings.
             [exec] 
             [exec]     +1 system tests framework.  The patch passed system tests framework compile.
        
        Tsz Wo Nicholas Sze added a comment -

        Tested manually again. It works fine.

        [root@yahoo-hadoop-0.20.1xx]# ./bin/hadoop fs -cat /user/tsz/r.txt
        cat: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=READ, inode="r.txt":tsz:supergroup:---------
        
        Tsz Wo Nicholas Sze added a comment -

        > Tested manually again. ...

        I posted the wrong output last time. It should be the following.

        [root@host yahoo-hadoop-0.20.1xx]# ./bin/hadoop fs -put README.txt foo/s.txt
        [root@host yahoo-hadoop-0.20.1xx]# ./bin/hadoop fs -mkdir bar                
        [root@host yahoo-hadoop-0.20.1xx]# ./bin/hadoop fs -cat hftp://xxx.yyy.zzz:50070/user/root/bar
        cat: /user/root/bar is a directory (error code=400)
        [root@host yahoo-hadoop-0.20.1xx]# ./bin/hadoop fs -put README.txt r.txt
        [root@host yahoo-hadoop-0.20.1xx]# ./bin/hadoop fs -chmod 000 foo r.txt 
        [root@host yahoo-hadoop-0.20.1xx]# ./bin/hadoop fs -cat hftp://xxx.yyy.zzz:50070/user/root/foo/s.txt
        cat: user=root, access=EXECUTE, inode="foo":root:supergroup:---------
        [root@host yahoo-hadoop-0.20.1xx]# ./bin/hadoop fs -cat hftp://xxx.yyy.zzz:50070/user/root/r.txt
        cat: Permission denied: user=root, access=READ, inode="r.txt":root:supergroup:--------- (error code=400)
        [root@host yahoo-hadoop-0.20.1xx]# ./bin/hadoop fs -ls hftp://xxx.yyy.zzz:50070/user/root/foo/s.txt
        ls: user=root, access=EXECUTE, inode="foo":root:supergroup:---------
        [root@host yahoo-hadoop-0.20.1xx]# ./bin/hadoop fs -get hftp://xxx.yyy.zzz:50070/user/root/r.txt .
        get: Permission denied
        [root@host yahoo-hadoop-0.20.1xx]# ./bin/hadoop fs -get hftp://xxx.yyy.zzz:50070/user/root/foo/s.txt .
        get: user=root, access=EXECUTE, inode="foo":root:supergroup:---------
        
        Suresh Srinivas added a comment -

        +1 for both the trunk and y20 versions of the patch.

        Tsz Wo Nicholas Sze added a comment -

        I have committed this.

        Hudson added a comment -

        Integrated in Hadoop-Hdfs-trunk-Commit #390 (See https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/390/)
        Fix a typo for my last commit: HDFS-1320 should be HDFS-1383 in CHANGES.txt


          People

          • Assignee:
            Tsz Wo Nicholas Sze
            Reporter:
            Tsz Wo Nicholas Sze
          • Votes:
            0
            Watchers:
            2
