SOLR-14498

BlockCache gets stuck not accepting new stores



    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 6.5, 6.6.5, 7.7.3, 8.5.1, main (9.0)
    • Fix Version/s: 8.6
    • Component/s: query
    • Labels:


      BlockCache uses two components: "storage", i.e. the banks, and an "eviction mechanism", i.e. a cache, implemented by a Caffeine cache.
      The relation between them is that the storage enforces a strict limit on the number of entries (numberOfBlocksPerBank * numberOfBanks), whereas the eviction mechanism takes care of freeing entries from the storage, thanks to the Caffeine cache's maximumSize being set to numberOfBlocksPerBank * numberOfBanks - 1.

      The storage relies on the Caffeine cache to eventually free at least one entry from the storage. If that does not happen, the BlockCache starts to fail all new stores.
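      A minimal, self-contained sketch of this contract (class and method names are hypothetical, not Solr's actual code): the banks enforce a hard slot limit and reject stores when full, while freeing slots is entirely delegated to the eviction mechanism's removal callbacks.

```java
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical model of the storage side of BlockCache: a fixed number of
// slots, a hard rejection when full, and a release hook that only the
// eviction mechanism's removal listener is expected to call.
class BankStorage {
    private final int totalSlots;                  // numberOfBlocksPerBank * numberOfBanks
    private final AtomicInteger usedSlots = new AtomicInteger();

    BankStorage(int totalSlots) { this.totalSlots = totalSlots; }

    /** Claims a slot; returns false when all banks are full. */
    boolean tryClaimSlot() {
        while (true) {
            int used = usedSlots.get();
            if (used >= totalSlots) return false;  // storage full: the store is rejected
            if (usedSlots.compareAndSet(used, used + 1)) return true;
        }
    }

    /** Called by the eviction cache's removal listener to free a slot. */
    void releaseSlot() { usedSlots.decrementAndGet(); }
}

class BlockCacheSketch {
    public static void main(String[] args) {
        BankStorage storage = new BankStorage(2);   // tiny storage: 2 slots
        storage.tryClaimSlot();
        storage.tryClaimSlot();                     // storage now full
        // The eviction cache (maximumSize = 1) should have freed an entry by
        // now, but if its maintenance is deferred, releaseSlot() has not been
        // called yet and every new store fails:
        System.out.println(storage.tryClaimSlot()); // prints "false"
        storage.releaseSlot();                      // deferred eviction finally runs
        System.out.println(storage.tryClaimSlot()); // prints "true"
    }
}
```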

      As it turns out, the Caffeine cache may not reduce its size to the desired maximumSize for as long as no put, and no getIfPresent that finds an entry, is executed.
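      For illustration, this deferred maintenance can be observed directly against the Caffeine API (a sketch, assuming Caffeine 2.x on the classpath; Cache.cleanUp() forces the pending maintenance that a put or a getIfPresent hit would otherwise eventually trigger):

```java
import com.github.benmanes.caffeine.cache.Cache;
import com.github.benmanes.caffeine.cache.Caffeine;

class DeferredEvictionDemo {
    public static void main(String[] args) {
        Cache<Integer, byte[]> cache = Caffeine.newBuilder()
                .maximumSize(1)  // analogous to numberOfBlocksPerBank * numberOfBanks - 1
                .build();
        cache.put(1, new byte[8]);
        cache.put(2, new byte[8]);
        // Maintenance is amortized and may run asynchronously: estimatedSize()
        // may still report 2 here, and with only misses (getIfPresent that
        // finds nothing) it can stay above maximumSize indefinitely.
        cache.cleanUp();  // explicitly run the pending maintenance
        System.out.println(cache.estimatedSize()); // now at most maximumSize (1)
    }
}
```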

      With a sufficiently unlucky read pattern, the block cache may be rendered useless (a 0% hit ratio): the cache is poisoned by non-reusable entries, while new, reusable entries are not stored and thus never reused.

      Further info may be found in https://github.com/ben-manes/caffeine/issues/420


      A change in Caffeine that triggers its internal cleanup mechanism regardless of whether getIfPresent gets a hit has been implemented in https://github.com/ben-manes/caffeine/commit/7239bb0dda2af1e7301e8f66a5df28215b5173bc and is due to be released in Caffeine 2.8.4.




            • Assignee:
              ab Andrzej Bialecki
              jakubzytka Jakub Zytka
            • Votes: 0
            • Watchers: 9


              • Created: