Details
- Type: Bug
- Status: Resolved
- Priority: Minor
- Resolution: Fixed
- Labels: None
- Hadoop Flags: Reviewed
Description
The following error message, logged in BlockRecoveryWorker#recover, is wrong:
} catch (IOException e) {
  ++errorCount;
  InterDatanodeProtocol.LOG.warn(
      "Failed to obtain replica info for block (=" + block
          + ") from datanode (=" + id + ")", e);
}
The operation performed in the try block is an attempt to recover the block, not to obtain replica info from the datanode.
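For illustration, one possible rewording of the warning (a sketch only, not necessarily the wording of the committed patch) would be:

} catch (IOException e) {
  ++errorCount;
  InterDatanodeProtocol.LOG.warn(
      "Failed to recover block (=" + block
          + ") from datanode (=" + id + ")", e);
}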
This is the error message printed by the above code:
2017-03-01 16:15:35,884 WARN org.apache.hadoop.hdfs.server.protocol.InterDatanodeProtocol: Failed to obtain replica info for block (=BP-1949147302-10.0.0.140-1423905184563:blk_1074852850_1112215) from datanode (=DatanodeInfoWithStorage[10.0.0.53:50010,null,null])
java.io.IOException: THIS IS NOT SUPPOSED TO HAPPEN: replica.getGenerationStamp() >= recoveryId = 1112223, block=blk_1074852850_1112215, replica=FinalizedReplica, blk_1074852850_1112231, FINALIZED
  getNumBytes()     = 12823160
  getBytesOnDisk()  = 12823160
  getVisibleLength()= 12823160
  getVolume()       = /dfs/dn/current
  getBlockFile()    = /dfs/dn/current/BP-1949147305-10.0.0.140-1423905184563/current/finalized/subdir16/subdir243/blk_1074852850
	at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl.initReplicaRecovery(FsDatasetImpl.java:2318)
	at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl.initReplicaRecovery(FsDatasetImpl.java:2277)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initReplicaRecovery(DataNode.java:2548)
	at org.apache.hadoop.hdfs.protocolPB.InterDatanodeProtocolServerSideTranslatorPB.initReplicaRecovery(InterDatanodeProtocolServerSideTranslatorPB.java:55)
	at org.apache.hadoop.hdfs.protocol.proto.InterDatanodeProtocolProtos$InterDatanodeProtocolService$2.callBlockingMethod(InterDatanodeProtocolProtos.java:2983)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2086)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2082)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2080)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.server.datanode.BlockRecoveryWorker.callInitReplicaRecovery(BlockRecoveryWorker.java:339)
	at org.apache.hadoop.hdfs.server.datanode.BlockRecoveryWorker.access$300(BlockRecoveryWorker.java:46)
	at org.apache.hadoop.hdfs.server.datanode.BlockRecoveryWorker$RecoveryTaskContiguous.recover(BlockRecoveryWorker.java:118)
	at org.apache.hadoop.hdfs.server.datanode.BlockRecoveryWorker$1.run(BlockRecoveryWorker.java:374)
	at java.lang.Thread.run(Thread.java:745)