Hadoop Common / HADOOP-1136

Exception in UnderReplicatedBlocks.add when there are more replicas of a block than required


Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 0.12.2
    • Fix Version/s: 0.13.0
    • Component/s: None
    • Labels: None

    Description

      I was running a random writer followed by a sort when I saw this condition:

      Exception in thread "org.apache.hadoop.dfs.FSNamesystem$ReplicationMonitor@187814" java.lang.ArrayIndexOutOfBoundsException: 3
      at org.apache.hadoop.dfs.FSNamesystem$UnderReplicatedBlocks.add(FSNamesystem.java:447)
      at org.apache.hadoop.dfs.FSNamesystem$UnderReplicatedBlocks.add(FSNamesystem.java:464)
      at org.apache.hadoop.dfs.FSNamesystem.processPendingReplications(FSNamesystem.java:1891)
      at org.apache.hadoop.dfs.FSNamesystem$ReplicationMonitor.run(FSNamesystem.java:1795)
      at java.lang.Thread.run(Thread.java:619)

      processPendingReplications invokes neededReplications.add(), which in turn calls add(block, 3, 4). That method calls getPriority, which returns 3, so the subsequent priorityQueues[3].add() throws the ArrayIndexOutOfBoundsException.
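      The failure mode can be sketched as follows. This is a minimal, hypothetical re-creation, not the actual FSNamesystem code: the queue count (LEVEL = 3) and the priority formula are assumptions chosen only so that getPriority(3, 4) returns 3, matching the call sequence in the report.

```java
import java.util.ArrayList;
import java.util.List;

public class UnderReplicationDemo {
    // Number of priority queues; 3 is an assumption for illustration.
    static final int LEVEL = 3;
    static final List<List<String>> priorityQueues = new ArrayList<>();
    static {
        for (int i = 0; i < LEVEL; i++) {
            priorityQueues.add(new ArrayList<>());
        }
    }

    // Hypothetical priority function reproducing the reported behavior:
    // for add(block, 3, 4) it returns 3, which equals LEVEL and is one
    // past the last valid queue index (0..LEVEL-1).
    static int getPriority(int curReplicas, int expectedReplicas) {
        if (curReplicas == 0) {
            return 0;                          // no replicas: highest priority
        } else if (curReplicas * 3 < expectedReplicas) {
            return 1;                          // badly under-replicated
        }
        return Math.min(curReplicas, LEVEL);   // can return LEVEL itself
    }

    static void add(String block, int curReplicas, int expectedReplicas) {
        int priority = getPriority(curReplicas, expectedReplicas);
        priorityQueues.get(priority).add(block);  // index 3 on a size-3 list
    }

    public static void main(String[] args) {
        try {
            // Mirrors the add(block, 3, 4) call described in the report.
            add("blk_1234", 3, 4);
        } catch (IndexOutOfBoundsException e) {
            System.out.println("caught " + e.getClass().getSimpleName());
        }
    }
}
```

      The essential point is that a block which already has at least as many replicas as required maps to a priority equal to the number of queues, so indexing the queue array with it is out of bounds; a fix needs to either bound the computed priority or skip blocks that are not actually under-replicated.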

      Attachments

        1. neededReplicationAdd.patch
          0.6 kB
          Hairong Kuang
        2. neededReplicationAdd1.patch
          0.6 kB
          Hairong Kuang


          People

            Assignee: hairong Hairong Kuang
            Reporter: dhruba Dhruba Borthakur
            Votes: 0
            Watchers: 0
