Description
Spinoff from LUCENE-6818:
Ahmet Arslan found problems with every Similarity (except ClassicSimilarity) when testing how they behave across every possible norm value, to ensure they are robust to all index-time boosts.
There are several problems:
1. A buggy normalization decode that causes the smallest possible norm value (0) to be treated as an infinitely long document. These values are intended to decode to non-negative finite values, but producing infinity breaks everything downstream.
2. Various problems in the less practical functions that already carry documented warnings about misbehavior for extreme values. These affect DFR models D, Be, and P, and the IB distribution SPL.
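Problem 1 can be reproduced in isolation. The sketch below, a simplified and hypothetical re-creation rather than the actual Lucene source, re-implements the 8-bit norm decoding (`SmallFloat.byte315ToFloat`, which maps norm byte 0 to 0.0f) and an assumed decode path that recovers document length as 1/(x*x): for norm byte 0 this divides by zero and yields +Infinity.

```java
public class NormDecodeBug {
    // Re-implementation of SmallFloat.byte315ToFloat: a "small float" with a
    // 3-bit mantissa and 5-bit exponent, zero-exponent point 15.
    // Norm byte 0 decodes to 0.0f by definition.
    static float byte315ToFloat(byte b) {
        if (b == 0) return 0.0f;
        int bits = (b & 0xff) << (24 - 3);
        bits += (63 - 15) << 24;
        return Float.intBitsToFloat(bits);
    }

    // Assumed shape of the buggy decode path: the norm stores roughly
    // 1/sqrt(length), so the length is recovered as 1/(x*x).
    // When x == 0 this evaluates to +Infinity.
    static float decodeToLength(byte norm) {
        float x = byte315ToFloat(norm);
        return 1f / (x * x);
    }

    public static void main(String[] args) {
        // The smallest norm byte decodes to an "infinitely long" document:
        System.out.println(decodeToLength((byte) 0));   // prints Infinity
        // Any nonzero norm byte decodes to a finite length:
        System.out.println(Float.isFinite(decodeToLength((byte) 1)));
    }
}
```

Every Similarity that feeds this decoded length into a log, ratio, or subtraction then propagates infinities or NaNs into its scores, which is why the issue describes it as breaking everything.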
Activity
I committed the fixes and tests. After some investigation, I discarded my change (delta) to model P, as it did not fix all of P's problems with abnormal TF values.