  Hadoop HDFS / HDFS-9220

Reading small file (< 512 bytes) that is open for append fails due to incorrect checksum

    Details

    • Type: Bug
    • Status: Closed
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 2.7.1
    • Fix Version/s: 2.8.0, 2.7.2, 2.6.4, 3.0.0-alpha1
    • Component/s: None
    • Labels:
      None
    • Target Version/s:
    • Hadoop Flags:
      Reviewed

      Description

      Exception:
      2015-10-09 14:59:40 WARN DFSClient:1150 - fetchBlockByteRange(). Got a checksum exception for /tmp/file0.05355529331575182 at BP-353681639-10.10.10.10-1437493596883:blk_1075692769_9244882:0 from DatanodeInfoWithStorage[10.10.10.10]:5001

      All three replicas trigger this exception, and the read fails entirely with:
      BlockMissingException: Could not obtain block: BP-353681639-10.10.10.10-1437493596883:blk_1075692769_9244882 file=/tmp/file0.05355529331575182

      Code to reproduce is attached.
      The issue does not occur in 2.7.0.
      The data is read correctly if checksum verification is disabled.
      More generally, the failure occurs when reading the last block of a file and that block contains <= 512 bytes.
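      HDFS verifies block data in fixed-size checksum chunks (512 bytes by default, via dfs.bytes-per-checksum), which is why the threshold in the title and in the last point is 512 bytes. The sketch below is purely illustrative, not the actual fix: it uses java.util.zip.CRC32 instead of HDFS's internal checksum classes, and the class and method names are invented for this example. It shows why the checksum covering a partial last chunk must be kept in sync as an append grows that chunk; comparing a stale checksum against the grown chunk fails, while the unchanged prefix still matches.

      ```java
      import java.util.Arrays;
      import java.util.zip.CRC32;

      public class ChecksumChunkDemo {
          // HDFS default checksum chunk size (dfs.bytes-per-checksum).
          static final int BYTES_PER_CHECKSUM = 512;

          // Hypothetical helper: CRC32 over a slice of a chunk.
          static long crcOf(byte[] data, int off, int len) {
              CRC32 crc = new CRC32();
              crc.update(data, off, len);
              return crc.getValue();
          }

          public static void main(String[] args) {
              // A small file: 100 bytes, i.e. a single partial checksum chunk.
              byte[] original = new byte[100];
              Arrays.fill(original, (byte) 'a');
              long storedCrc = crcOf(original, 0, original.length);

              // Append 50 bytes; the data still fits in the same partial chunk.
              byte[] appended = Arrays.copyOf(original, 150);
              Arrays.fill(appended, 100, 150, (byte) 'b');

              // The stored checksum no longer covers the grown chunk.
              System.out.println("stale checksum still valid? "
                      + (storedCrc == crcOf(appended, 0, appended.length)));
              // The unchanged 100-byte prefix still matches the stored checksum.
              System.out.println("prefix unchanged? "
                      + (storedCrc == crcOf(appended, 0, original.length)));
          }
      }
      ```

      If a writer appends into the last partial chunk but the checksum verified by the reader is not updated consistently, every replica reports a mismatch exactly as in the log above, since all replicas share the same stale/fresh checksum state.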

        Attachments

        1. test2.java
          1.0 kB
          Bogdan Raducanu
        2. HDFS-9220.002.patch
          3 kB
          Jing Zhao
        3. HDFS-9220.001.patch
          3 kB
          Jing Zhao
        4. HDFS-9220.000.patch
          3 kB
          Jing Zhao

              People

              • Assignee: jingzhao Jing Zhao
              • Reporter: bograd Bogdan Raducanu
              • Votes: 0
              • Watchers: 20

                Dates

                • Created:
                • Updated:
                • Resolved: