Hadoop Common › HADOOP-6631

FileUtil.fullyDelete() should continue to delete other files despite failure at any level.


Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 0.21.0
    • Component/s: fs, util
    • Labels: None
    • Hadoop Flags: Reviewed

    Description

      Ravi commented about this on HADOOP-6536. Paraphrasing...

      Currently, FileUtil.fullyDelete(myDir) stops deleting the remaining files/directories as soon as it fails to delete any single file/dir (say, because it lacks permission to delete that file/dir) anywhere under myDir. This is because the method returns as soon as the recursive call "if (!fullyDelete()) { return false; }" fails at any level of recursion.

      Shouldn't it instead continue with the deletion of the other files/dirs in the for loop rather than returning false there?

      I guess fullyDelete() should delete as many files as possible (similar to 'rm -rf').
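      The requested behavior can be sketched as follows. This is a minimal, standalone illustration using java.io.File, not the actual patch attached to this issue; the class name FullyDelete and the variable names are assumptions. The key point is that a failure on one child records the failure and moves on, rather than returning immediately.

      ```java
      import java.io.File;

      public class FullyDelete {
          /**
           * Delete dir and everything under it, continuing past individual
           * failures ('rm -rf' style). Returns true only if the whole tree
           * was actually removed.
           */
          public static boolean fullyDelete(File dir) {
              boolean deletionSucceeded = true;
              File[] contents = dir.listFiles();
              if (contents != null) {
                  for (File child : contents) {
                      if (child.isFile()) {
                          if (!child.delete()) {
                              // record the failure, but keep going with siblings
                              deletionSucceeded = false;
                          }
                      } else if (!fullyDelete(child)) {
                          // a failure deeper in the tree no longer aborts the loop
                          deletionSucceeded = false;
                      }
                  }
              }
              // deleting dir itself will fail if anything inside survived
              if (!dir.delete()) {
                  deletionSucceeded = false;
              }
              return deletionSucceeded;
          }
      }
      ```

      With the original early-return version, an undeletable file in the first subdirectory would leave every later sibling untouched; here those siblings are still removed and only the overall return value reports the partial failure.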

      Attachments

        1. HADOOP-6631.patch
          4 kB
          Ravi Gummadi
        2. HADOOP-6631.patch
          4 kB
          Ravi Gummadi
        3. HADOOP-6631.v1.patch
          4 kB
          Ravi Gummadi
        4. HADOOP-6631-20100505.txt
          6 kB
          Vinod Kumar Vavilapalli
        5. HADOOP-6631-20100506.2.txt
          8 kB
          Vinod Kumar Vavilapalli
        6. HADOOP-6631-20100506-ydist.final.txt
          8 kB
          Vinod Kumar Vavilapalli
        7. hadoop-6631-y20s-1.patch
          6 kB
          Sreekanth Ramakrishnan
        8. hadoop-6631-y20s-2.patch
          6 kB
          Sreekanth Ramakrishnan


            People

              Assignee: ravidotg Ravi Gummadi
              Reporter: vinodkv Vinod Kumar Vavilapalli
              Votes: 0
              Watchers: 3

              Dates

                Created:
                Updated:
                Resolved: