Hadoop HDFS / HDFS-9220

Reading small file (< 512 bytes) that is open for append fails due to incorrect checksum


Details

    • Type: Bug
    • Status: Closed
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 2.7.1
    • Fix Version/s: 2.8.0, 2.7.2, 2.6.4, 3.0.0-alpha1
    • Component/s: None
    • Labels: None
    • Hadoop Flags: Reviewed

    Description

      Exception:
      2015-10-09 14:59:40 WARN DFSClient:1150 - fetchBlockByteRange(). Got a checksum exception for /tmp/file0.05355529331575182 at BP-353681639-10.10.10.10-1437493596883:blk_1075692769_9244882:0 from DatanodeInfoWithStorage[10.10.10.10]:5001

      All three replicas trigger this exception, and the read fails entirely with:
      BlockMissingException: Could not obtain block: BP-353681639-10.10.10.10-1437493596883:blk_1075692769_9244882 file=/tmp/file0.05355529331575182

      Code to reproduce is attached; a minimal sketch of the scenario is also included below.
      The problem does not occur in 2.7.0.
      The data is read correctly if checksum verification is disabled.
      More generally, the failure occurs when reading the last block of a file while that block contains <= 512 bytes.
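
      For reference, here is a minimal sketch of the failing scenario, assuming a running HDFS cluster reachable through the default Configuration; the class name, path, and sizes below are illustrative, and the attached test2.java remains the authoritative reproducer:

        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.fs.FSDataInputStream;
        import org.apache.hadoop.fs.FSDataOutputStream;
        import org.apache.hadoop.fs.FileSystem;
        import org.apache.hadoop.fs.Path;

        public class Test2 {
          public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);
            Path path = new Path("/tmp/file" + Math.random()); // illustrative path

            // Write fewer than 512 bytes so the file's only block is smaller
            // than one checksum chunk (io.bytes.per.checksum defaults to 512).
            FSDataOutputStream out = fs.create(path);
            out.write(new byte[100]);
            out.close();

            // Re-open the file for append and leave the stream open, so the
            // last block is under construction while the read happens.
            FSDataOutputStream appendOut = fs.append(path);

            // On 2.7.1 this positional read goes through fetchBlockByteRange()
            // (the code path in the log above), hits a ChecksumException on
            // every replica, and ultimately fails with BlockMissingException.
            // Calling fs.setVerifyChecksum(false) before the read makes it
            // return the correct data instead.
            FSDataInputStream in = fs.open(path);
            byte[] buf = new byte[100];
            in.read(0L, buf, 0, buf.length);
            in.close();

            appendOut.close();
            fs.close();
          }
        }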

      Attachments

        1. HDFS-9220.002.patch
          3 kB
          Jing Zhao
        2. HDFS-9220.001.patch
          3 kB
          Jing Zhao
        3. HDFS-9220.000.patch
          3 kB
          Jing Zhao
        4. test2.java
          1.0 kB
          Bogdan Raducanu


          People

            Assignee: Jing Zhao (jingzhao)
            Reporter: Bogdan Raducanu (bograd)
            Votes: 0
            Watchers: 19

            Dates

              Created:
              Updated:
              Resolved:
