Details
- Type: New Feature
- Status: Closed
- Priority: Major
- Resolution: Won't Fix
- Component: site
- Note: patch is based on hadoop core trunk, revision 728879
Description
fsck can show a file's block details, but there is no way to go the other direction: given a block ID, you cannot find which file it belongs to or which datanode holds it. A BlockTool would be helpful during development, for example when you encounter messages like this:
2008-12-25 12:12:10,049 WARN dfs.DataNode (DataNode.java:readBlock(901)) - Got exception while serving blk_28622148 to /10.73.4.101:
java.io.IOException: Block blk_28622148 is not valid.
at org.apache.hadoop.dfs.FSDataset.getBlockFile(FSDataset.java:541)
at org.apache.hadoop.dfs.DataNode$BlockSender.<init>(DataNode.java:1090)
at org.apache.hadoop.dfs.DataNode$DataXceiver.readBlock(DataNode.java:882)
at org.apache.hadoop.dfs.DataNode$DataXceiver.run(DataNode.java:840)
at java.lang.Thread.run(Thread.java:595)
The BlockTool could help you locate the block: given a block ID, it would report the file name and the datanodes that hold the block. It could also list the block details for a given file or directory.
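Until such a tool exists, each direction of the lookup can be approximated by hand: `bin/hadoop fsck <path> -files -blocks -locations` prints the blocks and datanode locations for a known file, and on a datanode the block's on-disk files can be found with a filesystem search under the configured `dfs.data.dir`. A minimal sketch of the latter, using a simulated storage directory (the `subdir12` path and generation-stamp suffix are illustrative, not taken from this issue):

```shell
# Simulate a datanode storage directory (stand-in for dfs.data.dir)
DATA_DIR=$(mktemp -d)
mkdir -p "$DATA_DIR/current/subdir12"
touch "$DATA_DIR/current/subdir12/blk_28622148" \
      "$DATA_DIR/current/subdir12/blk_28622148_1001.meta"

# Given a block ID from a log message, find its block and meta files on disk
find "$DATA_DIR" -name 'blk_28622148*'

rm -rf "$DATA_DIR"  # clean up the simulated directory
```

This answers "which datanode directory holds the block" once you are on the right machine, but it cannot map the block back to a file name; that reverse lookup needs namenode metadata, which is what the proposed BlockTool would expose.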
Attachments
Issue Links
- is cloned by HDFS-207: add querying block's info in the fsck facility (Open)