Hadoop Common / HADOOP-7430

Improve error message when moving to trash fails due to quota issue

    Details

    • Type: Improvement
    • Status: Closed
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 0.23.0
    • Component/s: fs
    • Labels: None
    • Hadoop Flags: Reviewed

      Description

      The -rm command doesn't suggest -skipTrash on failure.

      Attachments

      1. HADOOP-7430.1.patch (3 kB, Ravi Prakash)
      2. HADOOP-7430.2.patch (3 kB, Ravi Prakash)
      3. HADOOP-7430.3.patch (3 kB, Ravi Prakash)

        Activity

        Ravi Prakash added a comment -

        Courtesy Rajit Saha

        https://issues.apache.org/jira/browse/HADOOP-6203 surfaces again.

        Now that -rmr is deprecated and we are using -rm -R, we are not seeing the following messages in the output:

        Consider using -skipTrash option

        and

        Problem with Trash.The NameSpace quota (directories and files) of directory /user/someUser is exceeded: quota=5 file count=6

        The only message coming out is:
        rm: Failed to move to trash: hdfs://localhost:8020/user/someUser/a/b/c

        Also from Rajit,

        found another scenario where "use -skipTrash" is not suggested:

        $hdfs dfs -rm -R .
        rm: Cannot move "hdfs://<namenode>:8020/user/<user>" to the trash, as it contains the trash

        Previously the same command used to print the following:

        Problem with Trash.. Consider using -skipTrash option
        rm: Cannot move "hdfs://<namenode>:8020/user/<user>" to the trash, as it contains the trash

        Ravi Prakash added a comment -

        Attaching a patch to fix this regression. This patch applies to trunk revision 1141162.

        Could someone please answer two of my questions?

        1. In Trash.java I'm having to do a

        cause.toString().indexOf("QuotaExceededException") != -1 

        to find out whether the exception was a QuotaExceededException (and even then it is not deterministic: what if the message merely contained that string?). I would love to use instanceof, but that would add HDFS as a dependency of Common (QuotaExceededException is defined in HDFS). How can I circumvent this, and should I? Can you please point me to a pre-existing discussion if one exists? (A name-based alternative is sketched after this list.)
        2. Similarly, in TestTrash.java, to test that the message is also displayed when the quota is exceeded, I would have to call setQuota() in hdfs/src/java/org/apache/hadoop/hdfs/server/namenode/FSNamesystem.java. Again, that would add HDFS as a test dependency, so I haven't done it in this patch.
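
        For illustration only (not from the attached patch), a name-based check along the cause chain would at least avoid matching on message text while still keeping HDFS out of Common's compile-time dependencies; a minimal sketch, assuming the HDFS exception class keeps the name QuotaExceededException:

        import java.io.IOException;

        /**
         * Illustrative sketch, not the attached patch: treat an exception as a
         * quota failure by inspecting class names in the cause chain instead of
         * searching toString(), so a message that merely mentions
         * "QuotaExceededException" cannot cause a false positive.
         */
        public final class QuotaCheckSketch {

          static boolean isQuotaExceeded(Throwable t) {
            for (Throwable cur = t; cur != null; cur = cur.getCause()) {
              // Walk the class hierarchy so HDFS subclasses of the generic
              // quota exception are recognised by name as well.
              for (Class<?> c = cur.getClass(); c != null; c = c.getSuperclass()) {
                if ("QuotaExceededException".equals(c.getSimpleName())) {
                  return true;
                }
              }
            }
            return false;
          }

          /** Example use: add the -skipTrash hint only for quota failures. */
          static void rethrowWithHint(IOException ioe) throws IOException {
            if (isQuotaExceeded(ioe)) {
              throw new IOException(
                  ioe.getLocalizedMessage() + ". Consider using -skipTrash option", ioe);
            }
            throw ioe;
          }
        }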

        Also, when I ran test-patch on this, I got

         [exec]     -1 javadoc.  The javadoc tool appears to have generated 1 warning messages.
        

        Here's test-patch's javadoc output

             [exec]   [javadoc] Constructing Javadoc information...
             [exec]   [javadoc] /home/raviprak/Code/hadoop/svn/svnhadoop-common/common/src/java/org/apache/hadoop/security/SecurityUtil.java:40: warning: sun.security.jgss.krb5.Krb5Util is Sun proprietary API and may be removed in a future release
             [exec]   [javadoc] import sun.security.jgss.krb5.Krb5Util;
             [exec]   [javadoc]                              ^
             [exec]   [javadoc] /home/raviprak/Code/hadoop/svn/svnhadoop-common/common/src/java/org/apache/hadoop/security/SecurityUtil.java:41: warning: sun.security.krb5.Credentials is Sun proprietary API and may be removed in a future release
             [exec]   [javadoc] import sun.security.krb5.Credentials;
             [exec]   [javadoc]                         ^
             [exec]   [javadoc] /home/raviprak/Code/hadoop/svn/svnhadoop-common/common/src/java/org/apache/hadoop/security/SecurityUtil.java:42: warning: sun.security.krb5.PrincipalName is Sun proprietary API and may be removed in a future release
             [exec]   [javadoc] import sun.security.krb5.PrincipalName;
             [exec]   [javadoc]                         ^
             [exec]   [javadoc] /home/raviprak/Code/hadoop/svn/svnhadoop-common/common/src/java/org/apache/hadoop/security/KerberosName.java:31: warning: sun.security.krb5.Config is Sun proprietary API and may be removed in a future release
             [exec]   [javadoc] import sun.security.krb5.Config;
             [exec]   [javadoc]                         ^
             [exec]   [javadoc] /home/raviprak/Code/hadoop/svn/svnhadoop-common/common/src/java/org/apache/hadoop/security/KerberosName.java:32: warning: sun.security.krb5.KrbException is Sun proprietary API and may be removed in a future release
             [exec]   [javadoc] import sun.security.krb5.KrbException;
             [exec]   [javadoc]                         ^
             [exec]   [javadoc] /home/raviprak/Code/hadoop/svn/svnhadoop-common/common/src/java/org/apache/hadoop/security/KerberosName.java:81: warning: sun.security.krb5.Config is Sun proprietary API and may be removed in a future release
             [exec]   [javadoc]   private static Config kerbConf;
             [exec]   [javadoc]                  ^
             [exec]   [javadoc] ExcludePrivateAnnotationsStandardDoclet
             [exec]   [javadoc] javadoc: warning - Error fetching URL: http://java.sun.com/javase/6/docs/api/package-list
             [exec]   [javadoc] Standard Doclet version 1.6.0_01
             [exec]   [javadoc] Building tree for all the packages and classes...
             [exec]   [javadoc] Building index for all the packages and classes...
             [exec]   [javadoc] Building index for all classes...
             [exec]   [javadoc] Generating /home/raviprak/Code/hadoop/svn/svnhadoop-common/common/build/docs/api/stylesheet.css...
             [exec]   [javadoc] 7 warnings
             [exec] 
             [exec] BUILD SUCCESSFUL
             [exec] Total time: 3 minutes 48 seconds
             [exec] 
             [exec] 
             [exec] There appear to be 7 javadoc warnings generated by the patched build.
        

        But I didn't touch any of those files. Any clues why I'm getting the -1?

        Thanks

        Hadoop QA added a comment -

        +1 overall. Here are the results of testing the latest attachment
        http://issues.apache.org/jira/secure/attachment/12484676/HADOOP-7430.1.patch
        against trunk revision 1140442.

        +1 @author. The patch does not contain any @author tags.

        +1 tests included. The patch appears to include 3 new or modified tests.

        +1 javadoc. The javadoc tool did not generate any warning messages.

        +1 javac. The applied patch does not increase the total number of javac compiler warnings.

        +1 findbugs. The patch does not introduce any new Findbugs (version 1.3.9) warnings.

        +1 release audit. The applied patch does not increase the total number of release audit warnings.

        +1 core tests. The patch passed core unit tests.

        +1 system test framework. The patch passed system test framework compile.

        Test results: https://builds.apache.org/job/PreCommit-HADOOP-Build/684//testReport/
        Findbugs warnings: https://builds.apache.org/job/PreCommit-HADOOP-Build/684//artifact/trunk/build/test/findbugs/newPatchFindbugsWarnings.html
        Console output: https://builds.apache.org/job/PreCommit-HADOOP-Build/684//console

        This message is automatically generated.

        Ravi Prakash added a comment -

        Can someone please review and commit this patch?

        Daryn Sharp added a comment -

        I think this is more appropriate to implement in FsShell's rm command. The Trash class knows nothing about the -skipTrash option, so mentioning the flag there creates a false coupling between the classes.

        Ravi Prakash added a comment -

        Thanks for your comment, Daryn. I've incorporated it in the latest patch. Could you please review it?

        Hadoop QA added a comment -

        +1 overall. Here are the results of testing the latest attachment
        http://issues.apache.org/jira/secure/attachment/12486097/HADOOP-7430.2.patch
        against trunk revision 1144858.

        +1 @author. The patch does not contain any @author tags.

        +1 tests included. The patch appears to include 3 new or modified tests.

        +1 javadoc. The javadoc tool did not generate any warning messages.

        +1 javac. The applied patch does not increase the total number of javac compiler warnings.

        +1 findbugs. The patch does not introduce any new Findbugs (version 1.3.9) warnings.

        +1 release audit. The applied patch does not increase the total number of release audit warnings.

        +1 core tests. The patch passed core unit tests.

        +1 system test framework. The patch passed system test framework compile.

        Test results: https://builds.apache.org/job/PreCommit-HADOOP-Build/715//testReport/
        Findbugs warnings: https://builds.apache.org/job/PreCommit-HADOOP-Build/715//artifact/trunk/build/test/findbugs/newPatchFindbugsWarnings.html
        Console output: https://builds.apache.org/job/PreCommit-HADOOP-Build/715//console

        This message is automatically generated.

        Daryn Sharp added a comment -

        I'm not very fond of string searches for special-casing exception handling. Almost anything that can go wrong while trashing items will be avoided when using -skipTrash. I think the main part of the patch should just be short and simple:

        try {
          success = Trash.moveToAppropriateTrash(item.fs, item.path, getConf());
        } catch (IOException ioe) {
          throw new IOException(ioe.getLocalizedMessage() + ". Consider using -skipTrash option");
        }
        
        Ravi Prakash added a comment -

        Hi Daryn,

        I can't do a blanket catch of IOException and print the suggestion to use -skipTrash, because moveToTrash() might throw a FileNotFoundException, in which case it doesn't really make sense to print that message.
        I'm not too fond of this way either, and like I said before, I'd love to use instanceof, but unfortunately it's not really an option here, unless of course we move all of FsShell into HDFS or the exception classes into Common.

        Daryn Sharp added a comment -

        Good point. I think you can address the issue by catching FileNotFoundException and simply rethrowing it just prior to catching IOException.
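
        For illustration, that ordering might look like the sketch below; moveToTrash() and PathLike are hypothetical stand-ins for the real call (Trash.moveToAppropriateTrash in the earlier snippet), not the committed code:

        import java.io.FileNotFoundException;
        import java.io.IOException;

        /**
         * Sketch of the suggested ordering, not the committed code: let
         * FileNotFoundException propagate unchanged, and append the -skipTrash
         * hint only to other IOExceptions raised while moving an item to trash.
         */
        class TrashHintSketch {
          static boolean deleteViaTrash(PathLike item) throws IOException {
            try {
              return moveToTrash(item);
            } catch (FileNotFoundException fnfe) {
              throw fnfe;  // no point in suggesting -skipTrash here
            } catch (IOException ioe) {
              throw new IOException(
                  ioe.getLocalizedMessage() + ". Consider using -skipTrash option", ioe);
            }
          }

          // Hypothetical stand-ins so the sketch compiles on its own.
          interface PathLike { String name(); }
          static boolean moveToTrash(PathLike item) throws IOException {
            throw new IOException("Failed to move to trash: " + item.name());
          }
        }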

        Ravi Prakash added a comment -

        Thanks for your comments Daryn. I've incorporated your suggestion in the latest patch. Can someone please commit it?

        Daryn Sharp added a comment -

        +1 good job!

        Hadoop QA added a comment -

        +1 overall. Here are the results of testing the latest attachment
        http://issues.apache.org/jira/secure/attachment/12486207/HADOOP-7430.3.patch
        against trunk revision 1145525.

        +1 @author. The patch does not contain any @author tags.

        +1 tests included. The patch appears to include 3 new or modified tests.

        +1 javadoc. The javadoc tool did not generate any warning messages.

        +1 javac. The applied patch does not increase the total number of javac compiler warnings.

        +1 findbugs. The patch does not introduce any new Findbugs (version 1.3.9) warnings.

        +1 release audit. The applied patch does not increase the total number of release audit warnings.

        +1 core tests. The patch passed core unit tests.

        +1 system test framework. The patch passed system test framework compile.

        Test results: https://builds.apache.org/job/PreCommit-HADOOP-Build/720//testReport/
        Findbugs warnings: https://builds.apache.org/job/PreCommit-HADOOP-Build/720//artifact/trunk/build/test/findbugs/newPatchFindbugsWarnings.html
        Console output: https://builds.apache.org/job/PreCommit-HADOOP-Build/720//console

        This message is automatically generated.

        Matt Foley added a comment -

        Committed to trunk. Thanks Ravi! And thanks to Daryn for reviewing.

        Hudson added a comment -

        Integrated in Hadoop-Common-trunk-Commit #686 (See https://builds.apache.org/job/Hadoop-Common-trunk-Commit/686/)
        HADOOP-7430. Improve error message when moving to trash fails due to quota issue. Contributed by Ravi Prakash.

        mattf : http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1145832
        Files :

        • /hadoop/common/trunk/common/CHANGES.txt
        • /hadoop/common/trunk/common/src/test/core/org/apache/hadoop/fs/TestTrash.java
        • /hadoop/common/trunk/common/src/java/org/apache/hadoop/fs/shell/Delete.java
        Hudson added a comment -

        Integrated in Hadoop-Common-trunk #745 (See https://builds.apache.org/job/Hadoop-Common-trunk/745/)
        HADOOP-7430. Improve error message when moving to trash fails due to quota issue. Contributed by Ravi Prakash.

        mattf : http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1145832
        Files :

        • /hadoop/common/trunk/common/CHANGES.txt
        • /hadoop/common/trunk/common/src/test/core/org/apache/hadoop/fs/TestTrash.java
        • /hadoop/common/trunk/common/src/java/org/apache/hadoop/fs/shell/Delete.java

          People

          • Assignee: Ravi Prakash
          • Reporter: Ravi Prakash
          • Votes: 0
          • Watchers: 2
