Details
- Type: Sub-task
- Status: Open
- Priority: Critical
- Resolution: Unresolved
Description
What happened
After setting {{hbase.lru.blockcache.hard.capacity.limit.factor=-0.4921875}}, running the test org.apache.hadoop.hbase.io.hfile.TestHFile#testReaderWithAdaptiveLruCombinedBlockCache results in a NullPointerException.
Where's the problem
In the test:
{code:java}
cachedBlock = combined.getBlock(key, false, false, true);
try {
  ...
} finally {
  cachedBlock.release();
}
{code}
However, {{combined.getBlock}} may return {{null}} (for example, on a cache miss), so the {{cachedBlock.release()}} call in the {{finally}} block throws an unhandled NullPointerException.
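A minimal sketch of the null-safe pattern we have in mind. The {{Cacheable}} interface and the map-backed cache below are hypothetical stand-ins, not HBase's actual block cache API; the point is only that the lookup result must be null-checked before the try/finally that releases it:

```java
import java.util.HashMap;
import java.util.Map;

public class NullSafeRelease {
    // Hypothetical stand-in for a releasable cached block (not HBase's Cacheable).
    interface Cacheable { void release(); }

    static final Map<String, Cacheable> cache = new HashMap<>();

    // Mimics a cache lookup: returns null on a cache miss.
    static Cacheable getBlock(String key) {
        return cache.get(key);
    }

    static boolean releaseIfPresent(String key) {
        Cacheable cachedBlock = getBlock(key);
        if (cachedBlock == null) {
            return false; // cache miss: skip the release instead of throwing NPE
        }
        try {
            // ... use cachedBlock ...
            return true;
        } finally {
            cachedBlock.release(); // safe: null was already ruled out
        }
    }

    public static void main(String[] args) {
        cache.put("hit", () -> {});
        System.out.println(releaseIfPresent("hit"));  // true
        System.out.println(releaseIfPresent("miss")); // false, and no NPE
    }
}
```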
How to reproduce
- set {{hbase.lru.blockcache.hard.capacity.limit.factor}} to {{-0.4921875}}
- run org.apache.hadoop.hbase.io.hfile.TestHFile#testReaderWithAdaptiveLruCombinedBlockCache
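One way to set the property for the test run (a sketch; the property name is from this report, but the exact resource location in the source tree is an assumption) is an {{hbase-site.xml}} on the test classpath:

```xml
<configuration>
  <property>
    <name>hbase.lru.blockcache.hard.capacity.limit.factor</name>
    <value>-0.4921875</value>
  </property>
</configuration>
```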
You should observe:
{code}
java.lang.NullPointerException
	at org.apache.hadoop.hbase.io.hfile.TestHFile.testReaderCombinedCache(TestHFile.java:1052)
	at org.apache.hadoop.hbase.io.hfile.TestHFile.testReaderWithAdaptiveLruCombinedBlockCache(TestHFile.java:1011)
{code}
For an easy reproduction, run the reproduce.sh in the attachment.
We are happy to provide a patch if this issue is confirmed.