Description
When DFSInputStream hits a CannotObtainBlockLengthException, it is hard to tell which file the block belongs to, as shown in the exception log below. We suggest logging the file path string when CannotObtainBlockLengthException is raised.
Caused by: java.io.IOException: Cannot obtain block length for LocatedBlock{BP-***:blk_***_***; getBlockSize()=690504; corrupt=false; offset=1811939328; locs=[DatanodeInfoWithStorage[*:50010,DS-2bcadcc4-458a-45c6-a91b-8461bf7cdd71,DISK], DatanodeInfoWithStorage[*:50010,DS-8f2bb259-ecb2-4839-8769-4a0523360d58,DISK], DatanodeInfoWithStorage[*:50010,DS-69f4de6f-2428-42ff-9486-98c2544b1ada,DISK]]}
	at org.apache.hadoop.hdfs.DFSInputStream.readBlockLength(DFSInputStream.java:402)
	at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:345)
	at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:280)
	at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:272)
	at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1664)
	at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:304)
	at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:300)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:300)
	at org.apache.hadoop.fs.FilterFileSystem.open(FilterFileSystem.java:161)
	at org.apache.hadoop.fs.viewfs.ChRootedFileSystem.open(ChRootedFileSystem.java:266)
	at org.apache.hadoop.fs.viewfs.ViewFileSystem.open(ViewFileSystem.java:481)
	at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:828)
	at org.apache.hadoop.mapred.LineRecordReader.<init>(LineRecordReader.java:109)
	at org.apache.hadoop.mapred.TextInputFormat.getRecordReader(TextInputFormat.java:67)
	at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.<init>(CombineHiveRecordReader.java:65)
	... 16 more
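The proposed change is to attach the file path to the message raised from DFSInputStream. As an interim illustration of the same idea on the caller side (this is not the proposed in-tree patch; OpenWithPath and openWithPath are hypothetical names, and only the standard FileSystem/Path API is assumed), a caller can wrap FileSystem.open and rethrow with the path attached, so the log at least names the affected file:

	import java.io.IOException;

	import org.apache.hadoop.fs.FSDataInputStream;
	import org.apache.hadoop.fs.FileSystem;
	import org.apache.hadoop.fs.Path;

	public class OpenWithPath {
	  /**
	   * Sketch of a caller-side workaround: wrap open() and rethrow with the
	   * file path so a CannotObtainBlockLengthException (or any other open
	   * failure) can be traced back to the file it came from.
	   */
	  public static FSDataInputStream openWithPath(FileSystem fs, Path path)
	      throws IOException {
	    try {
	      return fs.open(path);
	    } catch (IOException e) {
	      // Attach the path before propagating, mirroring what the proposed
	      // DFSInputStream-side fix would put directly in the exception text.
	      throw new IOException("Failed to open " + path, e);
	    }
	  }
	}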