Details
- Type: Bug
- Status: Closed
- Priority: Blocker
- Resolution: Fixed
- Fix Version: 1.6.3
- None
- None
- Patch
Description
Due to a bug in the MemoryAwareConcurrentReadMap implementation, a call that removes entries from the MetaClassRegistry can cause other entries to become inaccessible.
This leads to very unpredictable behavior and failures in code that relies on dynamic modifications to a class's ExpandoMetaClass. For example, this bug is responsible for the behavior described in GRAILS-3666.
The failure occurs when a hash table entry that is not at the head of its bucket is removed. The mechanism that recreates the bucket's linked list after the removal does not dereference the SoftReference objects of the entries that follow the removed entry. As a result, any entry in the list after the removed one is no longer accessible by its original key.
A simple patch for the problem has been attached. Note that the fix is not necessary for the sremove() method, as proper dereferencing is used there.
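The defect described above can be illustrated with a small, self-contained sketch. This is a hypothetical simplification, not the actual MemoryAwareConcurrentReadMap source: the names Entry, cloneBuggy, cloneFixed, and BucketRebuildSketch are illustrative only. The idea is that entries hold their key inside a SoftReference; if the rebuild clones an entry by passing the stored reference instead of its referent, the key gets wrapped in a second SoftReference, and lookups by the original key stop matching.

```java
import java.lang.ref.SoftReference;
import java.util.Objects;

public class BucketRebuildSketch {
    // Simplified bucket-chain entry; the key is held via a SoftReference.
    static class Entry {
        final SoftReference<Object> keyRef;
        final Object value;
        final Entry next;
        Entry(Object key, Object value, Entry next) {
            this.keyRef = new SoftReference<>(key); // key is wrapped here
            this.value = value;
            this.next = next;
        }
    }

    // Walk the chain, comparing the *referents* of the stored references.
    static Object get(Entry head, Object key) {
        for (Entry e = head; e != null; e = e.next) {
            if (Objects.equals(e.keyRef.get(), key)) return e.value;
        }
        return null;
    }

    // Buggy rebuild step: passes the SoftReference itself as the new key,
    // so the clone's keyRef refers to a SoftReference, not the real key.
    static Entry cloneBuggy(Entry e, Entry next) {
        return new Entry(e.keyRef, e.value, next);
    }

    // Fixed rebuild step: dereferences the SoftReference first.
    static Entry cloneFixed(Entry e, Entry next) {
        return new Entry(e.keyRef.get(), e.value, next);
    }

    public static void main(String[] args) {
        // Chain a -> b -> c; simulate removing "b" and re-linking "c".
        Entry c = new Entry("c", 3, null);
        Entry buggy = new Entry("a", 1, cloneBuggy(c, null));
        Entry fixed = new Entry("a", 1, cloneFixed(c, null));
        System.out.println(get(buggy, "c")); // null: "c" became unreachable
        System.out.println(get(fixed, "c")); // 3
    }
}
```

With the buggy clone, the entry that followed the removed one can no longer be found by its original key, which matches the symptom described above; the fixed clone behaves normally.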
The failure case can be easily demonstrated with the following test code:

import org.codehaus.groovy.runtime.metaclass.*

def initialCapacity = 4
def loadFactor = 1000 // keep number of buckets low to increase entries per bucket
def keysToInsert = 2000

def map = new MemoryAwareConcurrentReadMap(initialCapacity, loadFactor)
def keys = []
keysToInsert.times {
    def random = Math.random()
    map.put(random, random) // insert random key
    keys << random // keep track of keys so we can test reachability
}
println "Inserted $keysToInsert random keys"

int count = 0
keys.each { if (map.get(it)) count++ }
println "Keys in map: $count"

def removeMe = map.get(keys[0]) // just remove any key
map.remove(removeMe)
println "Removed one key"

count = 0
keys.each { if (map.get(it)) count++ }
println "Keys in map: $count"
assert count == keysToInsert - 1 // we expect our count to be decreased by one
In the example above, the load factor has been increased to an impractically high level to ensure the failure occurs. The assertion also fails with the default load factor, but not as consistently.
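A quick back-of-envelope calculation shows why the inflated load factor makes the failure so reliable. This assumes the table resizes only when its size exceeds capacity * loadFactor, as in HashMap-style implementations (an assumption about this class, not verified from its source):

```java
public class LoadFactorSketch {
    public static void main(String[] args) {
        int initialCapacity = 4;
        int loadFactor = 1000;              // resize threshold = 4 * 1000 = 4000
        int keysInserted = 2000;            // below the threshold: still ~4 buckets
        int avgChainLength = keysInserted / initialCapacity;
        System.out.println(avgChainLength); // 500 entries per bucket on average
    }
}
```

With roughly 500 entries per bucket, almost any removal hits an entry that is not at the head of its chain, which is exactly the condition that triggers the bug.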