In our product, which uses the dependency manager with several configured filter index settings, the memory consumption of the MultiPropertyFilterIndex is relatively high: it contains a map from string keys to service references, where each key is a concatenation of service property values. These keys are all newly created string objects, resulting in inefficient memory usage.
In the attachment you find a version of the index that uses less memory: instead of storing the concatenated keys in new string objects, it stores the original strings in string arrays. This way it can, for instance, benefit from the garbage collector deduplicating strings.
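To illustrate the idea, here is a minimal sketch (not the attached patch itself; the class and field names are hypothetical): the map key keeps references to the original property-value strings in an array instead of allocating a new concatenated string, with equals/hashCode delegating to java.util.Arrays so lookups still work by value.

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: a map key that holds the original property-value
// Strings in an array rather than a newly concatenated String. Because the
// array only references existing Strings, JVM string deduplication
// (e.g. G1 with -XX:+UseStringDeduplication) can also kick in.
final class PropertyKey {
    private final String[] values; // references to the original property values

    PropertyKey(String... values) {
        this.values = values;
    }

    @Override
    public boolean equals(Object o) {
        // Value-based equality over the array contents, so two keys built
        // from equal property values find the same map entry.
        return o instanceof PropertyKey
                && Arrays.equals(values, ((PropertyKey) o).values);
    }

    @Override
    public int hashCode() {
        return Arrays.hashCode(values);
    }
}

public class FilterIndexSketch {
    public static void main(String[] args) {
        Map<PropertyKey, String> index = new HashMap<>();
        // Previously the key would have been something like the
        // concatenated string "objectClass=Foo;service.id=42".
        index.put(new PropertyKey("Foo", "42"), "serviceReference1");
        System.out.println(index.get(new PropertyKey("Foo", "42")));
    }
}
```

The lookup behavior is unchanged compared to concatenated keys; only the key representation differs, which avoids one new string allocation per indexed entry.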
Also attached are some unit tests to ensure the behavior did not change (note that we could not find any existing unit tests on the MultiPropertyFilterIndex).
If a large project is configured in our product, we can have around 41500 services in total. The configured MultiPropertyFilterIndexes then contain more than 500000 keys.
When executing the attached unit test 'MultiPropertyFilterIndexPerformanceTest' with 500000 service references, we found that memory consumption is reduced by 44%:

Current implementation:
fastest write time: 3.433 milliseconds
fastest read time: 1.577 milliseconds
memory consumption: 1374 MB

New implementation:
fastest write time: 2.329 milliseconds
fastest read time: 1.997 milliseconds
memory consumption: 771 MB
Would it be possible to either replace the existing implementation, or to allow supplying a custom filter index implementation to the dependency manager?