2011-08-25 20:58:34,118 DEBUG org.apache.hadoop.hbase.regionserver.wal.HLogSplitter: Pushed=78 entries from hdfs://hmaster1:9000/hbase/.logs/hslave3,60020,1313943560128/hslave3%3A60020.1314330253921
2011-08-25 20:58:34,118 DEBUG org.apache.hadoop.hbase.regionserver.wal.HLogSplitter: Splitting hlog 57 of 58: hdfs://hmaster1:9000/hbase/.logs/hslave3,60020,1313943560128/hslave3%3A60020.1314330342934, length=33404416
2011-08-25 20:58:34,118 INFO org.apache.hadoop.hbase.util.FSUtils: Recovering file hdfs://hmaster1:9000/hbase/.logs/hslave3,60020,1313943560128/hslave3%3A60020.1314330342934
2011-08-25 20:58:35,121 INFO org.apache.hadoop.hbase.util.FSUtils: Finished lease recover attempt for hdfs://hmaster1:9000/hbase/.logs/hslave3,60020,1313943560128/hslave3%3A60020.1314330342934
2011-08-25 20:58:35,526 INFO org.apache.hadoop.fs.FSInputChecker: Found checksum error: b[0, 512]=003aef2f000000240a313838333536343430340775736572732d3600000023dabcaa4f00000132043561cd00ffffffff000094360000005f00000026000000310015dfb353eec0be7a52f5be785103bbc8d34e9b09e601016ef4d4020100000132043561cd040b9490bca1ee25f4d4029490bca1ee25ff0000000000000000070000cbca1a02caa68599a02602020202acd29e91fb25000000005f00000026000000310015dfb353eec0be7a52f5be785103bbc8d34e9b09e601016ee3d1010100000132043561cd040b98f5c2b8f125e3d10198f5c2b8f125ff0000000000000000000000cbca1a02fde195e88d2602020202c299a2909726000000005d00000025000000300015dfb353eec0be7a52f5be785103bbc8d34e9b09e601016ec0140100000132043561cd040bd081d9c4f725c014d081d9c4f725ff0000000000000000070000cbca1a02caa68599a0260101010195b68fb99e26000000005d00000025000000300015dfb353eec0be7a52f5be785103bbc8d34e9b09e601016ebf1a0100000132043561cd040b91b2d89cf925bf1a91b2d89cf925ff0000000000000000070000cbca1a02caa68599a0260101010191b2d89cf925000000005f00000026000000310015dfb353eec0be7a52f5be785103bbc8d34e9b09e601016eaad3020100000132043561cd040bc294e6c0f925aad302c294e6c0f925ff00
org.apache.hadoop.fs.ChecksumException: Checksum error: /blk_3784810663390554197:of:/hbase/.logs/hslave3,60020,1313943560128/hslave3%3A60020.1314330342934 at 32754176
    at org.apache.hadoop.fs.FSInputChecker.verifySum(FSInputChecker.java:277)
    at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:241)
    at org.apache.hadoop.fs.FSInputChecker.fill(FSInputChecker.java:176)
    at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:193)
    at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:158)
    at org.apache.hadoop.hdfs.DFSClient$BlockReader.read(DFSClient.java:1193)
    at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.readBuffer(DFSClient.java:1823)
    at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1875)
    at java.io.DataInputStream.read(DataInputStream.java:132)
    at java.io.DataInputStream.readFully(DataInputStream.java:178)
    at org.apache.hadoop.io.DataOutputBuffer$Buffer.write(DataOutputBuffer.java:63)
    at org.apache.hadoop.io.DataOutputBuffer.write(DataOutputBuffer.java:101)
    at org.apache.hadoop.io.SequenceFile$Reader.next(SequenceFile.java:1937)
    at org.apache.hadoop.io.SequenceFile$Reader.next(SequenceFile.java:1837)
    at org.apache.hadoop.io.SequenceFile$Reader.next(SequenceFile.java:1883)
    at org.apache.hadoop.hbase.regionserver.wal.SequenceFileLogReader.next(SequenceFileLogReader.java:198)
    at org.apache.hadoop.hbase.regionserver.wal.SequenceFileLogReader.next(SequenceFileLogReader.java:172)
    at org.apache.hadoop.hbase.regionserver.wal.HLogSplitter.parseHLog(HLogSplitter.java:429)
    at org.apache.hadoop.hbase.regionserver.wal.HLogSplitter.splitLog(HLogSplitter.java:262)
    at org.apache.hadoop.hbase.regionserver.wal.HLogSplitter.splitLog(HLogSplitter.java:188)
    at org.apache.hadoop.hbase.master.MasterFileSystem.splitLog(MasterFileSystem.java:197)
    at org.apache.hadoop.hbase.master.handler.ServerShutdownHandler.process(ServerShutdownHandler.java:95)
    at org.apache.hadoop.hbase.executor.EventHandler.run(EventHandler.java:156)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:662)
2011-08-25 20:58:35,529 WARN org.apache.hadoop.hdfs.DFSClient: Found Checksum error for blk_3784810663390554197_157755503 from 192.168.41.78:50010 at 32754176
2011-08-25 20:58:35,530 INFO org.apache.hadoop.hdfs.DFSClient: Could not obtain block blk_3784810663390554197_157755503 from any node: java.io.IOException: No live nodes contain current block. Will get new block locations from namenode and retry...
2011-08-25 20:58:38,534 INFO org.apache.hadoop.hdfs.DFSClient: Could not obtain block blk_3784810663390554197_157755503 from any node: java.io.IOException: No live nodes contain current block. Will get new block locations from namenode and retry...
2011-08-25 20:58:41,555 INFO org.apache.hadoop.hdfs.DFSClient: Could not obtain block blk_3784810663390554197_157755503 from any node: java.io.IOException: No live nodes contain current block. Will get new block locations from namenode and retry...
2011-08-25 20:58:44,560 WARN org.apache.hadoop.hdfs.DFSClient: DFS Read: java.io.IOException: Could not obtain block: blk_3784810663390554197_157756711 file=/hbase/.logs/hslave3,60020,1313943560128/hslave3%3A60020.1314330342934
    at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.chooseDataNode(DFSClient.java:1917)
    at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:1719)
    at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1872)
    at java.io.DataInputStream.read(DataInputStream.java:132)
    at java.io.DataInputStream.readFully(DataInputStream.java:178)
    at org.apache.hadoop.io.DataOutputBuffer$Buffer.write(DataOutputBuffer.java:63)
    at org.apache.hadoop.io.DataOutputBuffer.write(DataOutputBuffer.java:101)
    at org.apache.hadoop.io.SequenceFile$Reader.next(SequenceFile.java:1937)
    at org.apache.hadoop.io.SequenceFile$Reader.next(SequenceFile.java:1837)
    at org.apache.hadoop.io.SequenceFile$Reader.next(SequenceFile.java:1883)
    at org.apache.hadoop.hbase.regionserver.wal.SequenceFileLogReader.next(SequenceFileLogReader.java:198)
    at org.apache.hadoop.hbase.regionserver.wal.SequenceFileLogReader.next(SequenceFileLogReader.java:172)
    at org.apache.hadoop.hbase.regionserver.wal.HLogSplitter.parseHLog(HLogSplitter.java:429)
    at org.apache.hadoop.hbase.regionserver.wal.HLogSplitter.splitLog(HLogSplitter.java:262)
    at org.apache.hadoop.hbase.regionserver.wal.HLogSplitter.splitLog(HLogSplitter.java:188)
    at org.apache.hadoop.hbase.master.MasterFileSystem.splitLog(MasterFileSystem.java:197)
    at org.apache.hadoop.hbase.master.handler.ServerShutdownHandler.process(ServerShutdownHandler.java:95)
    at org.apache.hadoop.hbase.executor.EventHandler.run(EventHandler.java:156)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:662)
2011-08-25 20:58:44,578 DEBUG org.apache.hadoop.hbase.regionserver.wal.HLogSplitter: Pushed=51 entries from hdfs://hmaster1:9000/hbase/.logs/hslave3,60020,1313943560128/hslave3%3A60020.1314330342934
2011-08-25 20:58:44,578 INFO org.apache.hadoop.hbase.regionserver.wal.HLogSplitter: Got while parsing hlog hdfs://hmaster1:9000/hbase/.logs/hslave3,60020,1313943560128/hslave3%3A60020.1314330342934. Marking as corrupted
java.io.IOException: hdfs://hmaster1:9000/hbase/.logs/hslave3,60020,1313943560128/hslave3%3A60020.1314330342934, entryStart=28937644, pos=28937672, end=33404416, edit=51
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at org.apache.hadoop.hbase.regionserver.wal.SequenceFileLogReader.addFileInfoToException(SequenceFileLogReader.java:244)
    at org.apache.hadoop.hbase.regionserver.wal.SequenceFileLogReader.next(SequenceFileLogReader.java:200)
    at org.apache.hadoop.hbase.regionserver.wal.SequenceFileLogReader.next(SequenceFileLogReader.java:172)
    at org.apache.hadoop.hbase.regionserver.wal.HLogSplitter.parseHLog(HLogSplitter.java:429)
    at org.apache.hadoop.hbase.regionserver.wal.HLogSplitter.splitLog(HLogSplitter.java:262)
    at org.apache.hadoop.hbase.regionserver.wal.HLogSplitter.splitLog(HLogSplitter.java:188)
    at org.apache.hadoop.hbase.master.MasterFileSystem.splitLog(MasterFileSystem.java:197)
    at org.apache.hadoop.hbase.master.handler.ServerShutdownHandler.process(ServerShutdownHandler.java:95)
    at org.apache.hadoop.hbase.executor.EventHandler.run(EventHandler.java:156)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:662)
Caused by: java.io.IOException: Could not obtain block: blk_3784810663390554197_157756711 file=/hbase/.logs/hslave3,60020,1313943560128/hslave3%3A60020.1314330342934
    at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.chooseDataNode(DFSClient.java:1917)
    at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:1719)
    at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1872)
    at java.io.DataInputStream.read(DataInputStream.java:132)
    at java.io.DataInputStream.readFully(DataInputStream.java:178)
    at org.apache.hadoop.io.DataOutputBuffer$Buffer.write(DataOutputBuffer.java:63)
    at org.apache.hadoop.io.DataOutputBuffer.write(DataOutputBuffer.java:101)
    at org.apache.hadoop.io.SequenceFile$Reader.next(SequenceFile.java:1937)
    at org.apache.hadoop.io.SequenceFile$Reader.next(SequenceFile.java:1837)
    at org.apache.hadoop.io.SequenceFile$Reader.next(SequenceFile.java:1883)
    at org.apache.hadoop.hbase.regionserver.wal.SequenceFileLogReader.next(SequenceFileLogReader.java:198)
    ... 10 more
2011-08-25 20:58:44,578 DEBUG org.apache.hadoop.hbase.regionserver.wal.HLogSplitter: Splitting hlog 58 of 58: hdfs://hmaster1:9000/hbase/.logs/hslave3,60020,1313943560128/hslave3%3A60020.1314330607215, length=0
2011-08-25 20:58:44,578 INFO org.apache.hadoop.hbase.util.FSUtils: Recovering file hdfs://hmaster1:9000/hbase/.logs/hslave3,60020,1313943560128/hslave3%3A60020.1314330607215
2011-08-25 20:58:45,580 INFO org.apache.hadoop.hbase.util.FSUtils: Finished lease recover attempt for hdfs://hmaster1:9000/hbase/.logs/hslave3,60020,1313943560128/hslave3%3A60020.1314330607215
2011-08-25 20:58:45,580 WARN org.apache.hadoop.hbase.regionserver.wal.HLogSplitter: File hdfs://hmaster1:9000/hbase/.logs/hslave3,60020,1313943560128/hslave3%3A60020.1314330607215 might be still open, length is 0
2011-08-25 20:58:45,592 DEBUG org.apache.hadoop.hbase.regionserver.wal.HLogSplitter: Pushed=0 entries from hdfs://hmaster1:9000/hbase/.logs/hslave3,60020,1313943560128/hslave3%3A60020.1314330607215
2011-08-25 20:58:45,639 INFO org.apache.hadoop.hbase.regionserver.wal.HLogSplitter: Moving corrupted log hdfs://hmaster1:9000/hbase/.logs/hslave3,60020,1313943560128/hslave3%3A60020.1314330342934 to hdfs://hmaster1:9000/hbase/.corrupt/hslave3%3A60020.1314330342934
,60020,1313943560128/hslave3%3A60020.1314329504896 to hdfs://hmaster1:9000/hbase/.oldlogs/hslave3%3A60020.1314329504896
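The last full entry above shows the splitter quarantining the unreadable WAL under hdfs://hmaster1:9000/hbase/.corrupt instead of failing the whole split. As a minimal sketch only (not taken from the log), the Java snippet below lists whatever ended up in that directory with the plain Hadoop FileSystem API; the namenode address and the /hbase/.corrupt path are copied from the log, while the class name and output format are invented for the example.

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Hypothetical helper: prints the WALs that the master moved to /hbase/.corrupt.
public class ListCorruptWals {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // hmaster1:9000 is the namenode seen in the log; adjust for your cluster.
    FileSystem fs = FileSystem.get(URI.create("hdfs://hmaster1:9000"), conf);
    Path corruptDir = new Path("/hbase/.corrupt");

    if (!fs.exists(corruptDir)) {
      System.out.println("No " + corruptDir + " directory - nothing was quarantined.");
      return;
    }
    for (FileStatus status : fs.listStatus(corruptDir)) {
      // Each entry is a WAL the splitter gave up on, e.g. hslave3%3A60020.1314330342934.
      System.out.println(status.getPath() + " (" + status.getLen() + " bytes)");
    }
    fs.close();
  }
}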