Hadoop HDFS / HDFS-17126

FsDatasetImpl#checkAndUpdate should delete duplicated block meta file.


Details

    • Type: Improvement
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 3.4.0
    • Fix Version/s: None
    • Component/s: datanode
    • Labels: None

    Description

      Consider the following case:

      We have one datanode, dn1, which has two storages, ds1 and ds2.

      Suppose ds1 contains both blk_123 and blk_123_1001.meta, while ds2 contains only blk_123_1001.meta.

      The current logic does not handle the file blk_123_1001.meta in ds2; the DirectoryScanner only prints logs reporting a missing block file, and the orphaned meta file is left on disk.

      I think checkAndUpdate should delete this duplicated meta file instead of only logging it, as sketched below.
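
      Below is a minimal, standalone sketch of the proposed behaviour, not the actual FsDatasetImpl#checkAndUpdate code. The class and method names (OrphanMetaCleaner, ReplicaInfoStub, maybeDeleteDuplicatedMeta) are hypothetical stand-ins for Hadoop's internal types. The idea: delete the orphaned meta file only when the in-memory replica map shows a complete replica (block file plus meta file) on another volume; otherwise keep it, since it may hold the only surviving checksum data.

      import java.io.File;
      import java.util.HashMap;
      import java.util.Map;

      public class OrphanMetaCleaner {

        /** Hypothetical stand-in for Hadoop's in-memory replica record. */
        static class ReplicaInfoStub {
          final File blockFile;  // e.g. .../ds1/current/.../blk_123
          final File metaFile;   // e.g. .../ds1/current/.../blk_123_1001.meta

          ReplicaInfoStub(File blockFile, File metaFile) {
            this.blockFile = blockFile;
            this.metaFile = metaFile;
          }
        }

        /**
         * Called when the directory scanner finds a meta file on disk whose
         * block file is missing on the same volume (the ds2 case above).
         * Deletes the meta file only if a complete replica exists elsewhere.
         */
        static boolean maybeDeleteDuplicatedMeta(
            long blockId, File orphanMeta, Map<Long, ReplicaInfoStub> replicaMap) {
          ReplicaInfoStub replica = replicaMap.get(blockId);
          if (replica != null
              && replica.blockFile.exists()
              && replica.metaFile.exists()
              && !replica.metaFile.equals(orphanMeta)) {
            boolean deleted = orphanMeta.delete();
            System.out.println((deleted ? "Deleted" : "Failed to delete")
                + " duplicated meta file: " + orphanMeta);
            return deleted;
          }
          // Otherwise keep the current behaviour (log only): the meta file
          // may be the only surviving copy of the checksum data.
          System.out.println("Kept meta file (no complete replica elsewhere): "
              + orphanMeta);
          return false;
        }

        public static void main(String[] args) throws Exception {
          // Simulate ds1 (complete replica) and ds2 (duplicated meta only).
          File root = java.nio.file.Files.createTempDirectory("demo").toFile();
          File ds1 = new File(root, "ds1");
          File ds2 = new File(root, "ds2");
          ds1.mkdirs();
          ds2.mkdirs();
          File blk = new File(ds1, "blk_123");
          File meta = new File(ds1, "blk_123_1001.meta");
          File orphan = new File(ds2, "blk_123_1001.meta");
          blk.createNewFile();
          meta.createNewFile();
          orphan.createNewFile();

          Map<Long, ReplicaInfoStub> replicaMap = new HashMap<>();
          replicaMap.put(123L, new ReplicaInfoStub(blk, meta));
          maybeDeleteDuplicatedMeta(123L, orphan, replicaMap);
          // Expected: "Deleted duplicated meta file: .../ds2/blk_123_1001.meta"
        }
      }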

            People

              Assignee: zhanghaobo farmmamba
              Reporter: zhanghaobo farmmamba
