HADOOP-1262

file corruption detected because dfs client does not use replica blocks for checksum file


    Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 0.12.3
    • Fix Version/s: 0.13.0
    • Component/s: None
    • Labels: None

    Description

      A block of a CRC file was corrupted. This caused the DFS client to detect a CRC corruption. The client tried all three replicas of the data file, but it did not try any replicas of the CRC file. As a result, the client aborted the read request with a bad-CRC message.

      07/04/16 20:42:26 INFO fs.FileSystem: Found checksum error in data stream at block=blk_6205660483922449140 on datanode=xx:50010
      07/04/16 20:42:26 INFO fs.FileSystem: Found checksum error in checksum stream at block=blk_-3722915954820866561 on datanode=yy:50010

      07/04/16 20:42:26 INFO fs.FileSystem: Found checksum error in data stream at block=blk_6205660483922449140 on datanode=zz:50010
      07/04/16 20:42:26 INFO fs.FileSystem: Found checksum error in checksum stream at block=blk_-3722915954820866561 on datanode=yy:50010

      07/04/16 20:42:26 INFO fs.FileSystem: Found checksum error in data stream at block=blk_6205660483922449140 on datanode=xx:50010
      07/04/16 20:42:26 INFO fs.FileSystem: Found checksum error in checksum stream at block=blk_-3722915954820866561 on datanode=yy:50010
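
      Note that in the log above the data stream rotates through datanodes xx, zz, xx, while the checksum stream stays pinned to yy on all three attempts. Presumably the fix makes the client fall back to other replicas of the checksum block as well. The sketch below contrasts the two behaviors; it is plain Java, not the actual DFSClient source, and Replica, fetch, and crcMatches are hypothetical stand-ins:

      import java.util.List;

      public class ChecksumRetrySketch {

          /** Hypothetical handle for one replica of a block on one datanode. */
          record Replica(String datanode, long blockId) {}

          static class ChecksumMismatchException extends Exception {}

          // Stand-ins for the real block reads and CRC comparison (assumptions).
          static byte[] fetch(Replica r) { return new byte[0]; }
          static boolean crcMatches(byte[] data, byte[] crc) { return true; }

          /**
           * Reported behavior: only the data replicas are rotated. The checksum
           * bytes always come from the same replica, so one corrupt CRC block
           * fails every attempt and the read aborts with a bad-CRC message.
           */
          static byte[] readBuggy(List<Replica> dataReplicas, Replica crcReplica)
                  throws ChecksumMismatchException {
              byte[] crc = fetch(crcReplica);            // fixed, possibly corrupt
              for (Replica d : dataReplicas) {
                  byte[] data = fetch(d);
                  if (crcMatches(data, crc)) {
                      return data;
                  }
              }
              throw new ChecksumMismatchException();     // bad-CRC abort
          }

          /**
           * Desired behavior: on a mismatch, rotate through replicas of the
           * checksum block as well, so a single corrupt CRC replica cannot
           * fail the whole read.
           */
          static byte[] readFixed(List<Replica> dataReplicas, List<Replica> crcReplicas)
                  throws ChecksumMismatchException {
              for (Replica c : crcReplicas) {
                  byte[] crc = fetch(c);
                  for (Replica d : dataReplicas) {
                      byte[] data = fetch(d);
                      if (crcMatches(data, crc)) {
                          return data;
                      }
                  }
              }
              throw new ChecksumMismatchException();
          }
      }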

    Attachments

    1. newSource.patch (0.6 kB, Hairong Kuang)


    People

    • Assignee: Hairong Kuang (hairong)
    • Reporter: dhruba borthakur (dhruba)
    • Votes: 0
    • Watchers: 0
