Details

    • Type: Sub-task
    • Status: Open
    • Priority: Critical
    • Resolution: Unresolved

    Description

      What happened

      After setting hbase.lru.blockcache.hard.capacity.limit.factor=-0.4921875, running the test org.apache.hadoop.hbase.io.hfile.TestHFile#testReaderWithAdaptiveLruCombinedBlockCache results in a NullPointerException.

      Where's the problem

      In the test:

            cachedBlock = combined.getBlock(key, false, false, true);
            try {
              ...
            } finally {
              cachedBlock.release();
            }

      However, cachedBlock might not be initialized properly and could be null, in which case the release() call in the finally block throws an unhandled NullPointerException.
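      The failure mode can be shown in isolation, and one possible fix is a null check before release(). Below is a minimal, self-contained sketch (plain Java, not HBase code; the Block and Cache classes and method names are hypothetical stand-ins for the real ones):

```java
// Sketch: why an unguarded release() in a finally block throws NPE
// when getBlock() returns null, and how a null check avoids it.

class Block {
    void release() { /* would decrement a reference count */ }
}

class Cache {
    // Simulates a cache miss / failed lookup: getBlock() returns null,
    // as can happen when the cache is misconfigured.
    Block getBlock(String key) { return null; }
}

public class NullGuardSketch {
    // Mirrors the test's pattern: release() runs unconditionally.
    static boolean unguarded(Cache cache) {
        Block cachedBlock = cache.getBlock("key");
        try {
            // ... assertions on the block would go here ...
            return true;
        } finally {
            cachedBlock.release(); // NPE when cachedBlock is null
        }
    }

    // Same pattern with a null guard before release().
    static boolean guarded(Cache cache) {
        Block cachedBlock = cache.getBlock("key");
        try {
            return true;
        } finally {
            if (cachedBlock != null) { // guard avoids the NPE
                cachedBlock.release();
            }
        }
    }

    public static void main(String[] args) {
        Cache cache = new Cache();
        try {
            unguarded(cache);
            System.out.println("unguarded: no exception");
        } catch (NullPointerException e) {
            System.out.println("unguarded: NullPointerException");
        }
        System.out.println("guarded: " + guarded(cache));
    }
}
```

      The attached HBASE-27995.patch may take a different approach; this only illustrates the null-guard idea.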

      How to reproduce

      1. set hbase.lru.blockcache.hard.capacity.limit.factor to -0.4921875
      2. run org.apache.hadoop.hbase.io.hfile.TestHFile#testReaderWithAdaptiveLruCombinedBlockCache
        you should observe
        java.lang.NullPointerException
            at org.apache.hadoop.hbase.io.hfile.TestHFile.testReaderCombinedCache(TestHFile.java:1052)
            at org.apache.hadoop.hbase.io.hfile.TestHFile.testReaderWithAdaptiveLruCombinedBlockCache(TestHFile.java:1011)

        For an easy reproduction, run the reproduce.sh in the attachment.
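        For reference, the misconfiguration in step 1 corresponds to an hbase-site.xml entry like the following (a sketch; the property name and value are taken from the steps above):

```xml
<!-- hbase-site.xml: negative value that triggers the failure -->
<property>
  <name>hbase.lru.blockcache.hard.capacity.limit.factor</name>
  <value>-0.4921875</value>
</property>
```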

      We are happy to provide a patch if this issue is confirmed.

      Attachments

        1. HBASE-27995.patch
          1 kB
          Konstantin Ryakhovskiy
        2. reproduce.sh
          0.7 kB
          ConfX


          People

            Assignee: Unassigned
            Reporter: FuzzingTeam ConfX