Hadoop Common / HADOOP-4483

getBlockArray in DatanodeDescriptor does not honor passed in maxblocks value


    • Type: Bug
    • Status: Closed
    • Priority: Critical
    • Resolution: Fixed
    • Affects Version/s: 0.18.1
    • Fix Version/s: 0.18.2
    • Component/s: dfs
    • Labels:
    • Environment:

      hadoop-0.18.1 running on a cluster of 16 nodes.

    • Hadoop Flags: Reviewed


      The getBlockArray method in DatanodeDescriptor.java should honor the passed-in maxblocks parameter. In its current form it passes an array sized to min(maxblocks, blocks.size()) into Collection.toArray. As the javadoc for Collection.toArray indicates, toArray may discard the passed-in array (and allocate a new one) if the number of elements returned by the iterator exceeds the size of the passed-in array. As a result, the flawed implementation returns all the invalid blocks for a DataNode in one go, causing the NameNode to send the DataNode a DNA_INVALIDATE command with an excessively large number of blocks. This INVALIDATE command, in turn, can take a very long time to process at the DataNode, and since DatanodeCommand(s) are processed in between heartbeats at the DataNode, the NameNode may then consider the DataNode offline / unresponsive (due to the lack of heartbeats).

      In our use-case at CommonCrawl.org, we regularly do large-scale HDFS file deletions after certain stages of our map-reduce pipeline. These deletes would make certain DataNode(s) unresponsive, and thus impair the cluster's ability to balance file-system reads / writes across all available nodes. This problem only surfaced once we migrated from our 0.16.2 deployment to the current 0.18.1 release.
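      The pitfall can be shown with a minimal, self-contained sketch (Block here is a stand-in for Hadoop's block class, not the actual DatanodeDescriptor code): Collection.toArray(T[]) allocates a fresh, full-size array whenever the collection has more elements than the array passed in, so sizing the array to min(maxblocks, blocks.size()) does not actually cap the result. One possible fix, copying at most maxblocks elements by hand, is sketched alongside it.

```java
import java.util.ArrayList;
import java.util.Collection;
import java.util.Iterator;

public class ToArrayPitfall {
    // Hypothetical stand-in for Hadoop's block class.
    static class Block {
        final long id;
        Block(long id) { this.id = id; }
    }

    // The flawed pattern described in the report: toArray ignores the
    // passed-in array's size when the collection is larger, allocating
    // a new array that holds ALL elements.
    static Block[] getBlockArrayFlawed(Collection<Block> blocks, int maxblocks) {
        int size = Math.min(maxblocks, blocks.size());
        return blocks.toArray(new Block[size]);
    }

    // One possible fix (a sketch, not the committed patch): copy at most
    // maxblocks elements explicitly, so the cap is always honored.
    static Block[] getBlockArrayCapped(Collection<Block> blocks, int maxblocks) {
        int n = Math.min(maxblocks, blocks.size());
        Block[] out = new Block[n];
        Iterator<Block> it = blocks.iterator();
        for (int i = 0; i < n && it.hasNext(); i++) {
            out[i] = it.next();
        }
        return out;
    }

    public static void main(String[] args) {
        Collection<Block> blocks = new ArrayList<>();
        for (long i = 0; i < 1000; i++) blocks.add(new Block(i));

        // The cap of 100 is silently ignored by the flawed version.
        System.out.println("flawed length = " + getBlockArrayFlawed(blocks, 100).length);  // 1000
        System.out.println("capped length = " + getBlockArrayCapped(blocks, 100).length);  // 100
    }
}
```

      With 1000 invalid blocks and maxblocks = 100, the flawed version hands the NameNode all 1000 blocks at once, which is exactly the oversized DNA_INVALIDATE described above.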

      1. invalidateBlocksCopy-br18.patch
        4 kB
        Hairong Kuang
      2. invalidateBlocksCopy.patch
        4 kB
        Hairong Kuang
      3. HADOOP-4483-v3.patch
        2 kB
        dhruba borthakur
      4. HADOOP-4483-v3.patch
        6 kB
        dhruba borthakur
      5. HADOOP-4483-v2.patch
        2 kB
        Ahad Rana
      6. patch.HADOOP-4483
        1 kB
        Ahad Rana


        Ahad Rana created issue
        Ahad Rana made changes:
          Fix Version/s: 0.18.2 [ 12313424 ]
          Status: Open [ 1 ] → Patch Available [ 10002 ]
        Ahad Rana made changes:
          Comment: [ This fixes the getBlockArray method in DatanodeDescriptor to constrain the returned Block array to the maxBlocks value passed in. ]
        Ahad Rana made changes:
          Attachment: patch.HADOOP-4483 [ 12392603 ]
        Ahad Rana made changes:
          Summary: "getBlockArray in DatanodeDescriptor does not not honor passed in maxblocks value" → "getBlockArray in DatanodeDescriptor does not honor passed in maxblocks value"
        Ahad Rana made changes:
          Attachment: HADOOP-4483-v2.patch [ 12392622 ]
        dhruba borthakur made changes:
          Attachment: HADOOP-4483-v3.patch [ 12392667 ]
        dhruba borthakur made changes:
          Attachment: HADOOP-4483-v3.patch [ 12392668 ]
        Hairong Kuang made changes:
          Attachment: invalidateBlocksCopy.patch [ 12393001 ]
        Hairong Kuang made changes:
          Attachment: invalidateBlocksCopy-br18.patch [ 12393011 ]
        Tsz Wo Nicholas Sze made changes:
          Resolution: Fixed [ 1 ]
          Hadoop Flags: [Reviewed]
          Assignee: Abdul Qadeer [ aqadeer ]
          Status: Patch Available [ 10002 ] → Resolved [ 5 ]
        Owen O'Malley made changes:
          Assignee: Abdul Qadeer [ aqadeer ] → Ahad Rana [ ahadr ]
        Nigel Daley made changes:
          Status: Resolved [ 5 ] → Closed [ 6 ]
        Owen O'Malley made changes:
          Component/s: dfs [ 12310710 ]


          • Assignee:
            Ahad Rana
          • Votes:
            0
          • Watchers:
            3


            • Created:

              Time Tracking

              Original Estimate - 1h
              Remaining Estimate - 1h
              Time Spent - Not Specified