Hadoop HDFS
  HDFS-720

NPE in BlockReceiver$PacketResponder.run(BlockReceiver.java:923)

    Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 0.21.0
    • Fix Version/s: 0.21.0
    • Component/s: datanode
    • Labels: None
    • Environment:

      Description

      Running some load tests on HDFS, I hit one of these on the DN XX.XX.XX.139:51010:

      2009-10-21 04:57:02,755 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving block blk_6345892463926159834_1029 src: /XX.XX.XX.140:37890 dest: /XX.XX.XX.139:51010
      2009-10-21 04:57:02,829 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder blk_6345892463926159834_1029 1 Exception java.lang.NullPointerException
              at org.apache.hadoop.hdfs.server.datanode.BlockReceiver$PacketResponder.run(BlockReceiver.java:923)
              at java.lang.Thread.run(Thread.java:619)
      

      On the XX.XX.XX.140 side, it looks like this:

      2009-10-21 04:57:01,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving block blk_6345892463926159834_1029 src: /XX.XX.XX.140:37385 dest: /XX.XX.XX.140:51010
      2009-10-21 04:57:02,836 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder 2 for block blk_6345892463926159834_1029 terminating
      2009-10-21 04:57:02,885 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(XX.XX.XX.140:51010, storageID=DS-1292310101-208.76.44.140-51010-1256100924816, infoPort=51075, ipcPort=51020):Exception writing block blk_6345892463926159834_1029 to mirror XX.XX.XX.139:51010
      java.io.IOException: Connection reset by peer
          at sun.nio.ch.FileDispatcher.write0(Native Method)
          at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:29)
          at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:104)
          at sun.nio.ch.IOUtil.write(IOUtil.java:75)
          at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:334)
          at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:55)
          at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
          at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:146)
          at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:107)
          at java.io.BufferedOutputStream.write(BufferedOutputStream.java:105)
          at java.io.DataOutputStream.write(DataOutputStream.java:90)
          at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:466)
          at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:434)
          at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:573)
          at org.apache.hadoop.hdfs.server.datanode.DataXceiver.opWriteBlock(DataXceiver.java:352)
          at org.apache.hadoop.hdfs.protocol.DataTransferProtocol$Receiver.opWriteBlock(DataTransferProtocol.java:382)
          at org.apache.hadoop.hdfs.protocol.DataTransferProtocol$Receiver.processOp(DataTransferProtocol.java:323)
          at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:111)
          at java.lang.Thread.run(Thread.java:619)
      

      Here is the bit of code inside the run method:

       922                   pkt = ackQueue.getFirst();
       923                   expected = pkt.seqno;
      

      So 'pkt' is null? But the LinkedList API says getFirst() throws NoSuchElementException when the list is empty, so you'd think we couldn't get an NPE here. What am I missing?
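One answer consistent with the trace: getFirst() only throws when the list is empty, but LinkedList permits null elements, so a null "packet" on the queue is returned rather than rejected. A minimal sketch of the List semantics (not the HDFS code itself) showing how getFirst() can hand back a null that then NPEs on dereference:

```java
import java.util.LinkedList;
import java.util.NoSuchElementException;

public class GetFirstNull {
    public static void main(String[] args) {
        LinkedList<Object> ackQueue = new LinkedList<>();

        // Empty list: getFirst() throws, exactly as the API doc says.
        try {
            ackQueue.getFirst();
        } catch (NoSuchElementException e) {
            System.out.println("empty -> NoSuchElementException");
        }

        // But LinkedList happily stores nulls, so if a null element is
        // ever enqueued, getFirst() returns it instead of throwing...
        ackQueue.add(null);
        Object pkt = ackQueue.getFirst();
        System.out.println("head is null: " + (pkt == null));
        // ...and a subsequent pkt.seqno dereference would NPE, matching
        // the trace at BlockReceiver.java:923.
    }
}
```

If that is what happened here, the NPE points at a null having been placed on ackQueue, not at the queue being empty.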

      1. dn.log (17 kB, attached by stack)
          People

          • Assignee: Unassigned
          • Reporter: stack
          • Votes: 0
          • Watchers: 3
