Affects Version/s: 0.8
Fix Version/s: 0.8
Reading the code for LinkDb.reduce(): if the input segments contain duplicate pages, or if the same input segment appears twice, we will create the same Inlink value (satisfying Inlink.equals()) multiple times. Since Inlinks is a facade over a List, and not a Set, we end up with duplicate Inlink-s inside Inlinks.
The problem is easy to reproduce: create a new linkdb from 2 identical segments. This problem also makes it harder to properly implement a LinkDb updating mechanism (i.e. incremental invertlinks).
I propose to change Inlinks to use Set semantics, either explicitly by backing it with a HashSet or implicitly by checking whether a value already exists before adding it. If there are no objections I'll commit this change shortly.
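A minimal sketch of the proposed fix, using simplified stand-ins for Nutch's Inlink/Inlinks classes (the field names and container shown here are illustrative, not the actual Nutch implementation): backing the container with a Set makes a duplicate add a no-op, provided Inlink implements equals()/hashCode() consistently.

```java
import java.util.LinkedHashSet;
import java.util.Objects;
import java.util.Set;

// Simplified Inlink value type: equality is defined by its fields,
// so two Inlink-s built from duplicate segment data compare equal.
class Inlink {
  private final String fromUrl;
  private final String anchor;

  Inlink(String fromUrl, String anchor) {
    this.fromUrl = fromUrl;
    this.anchor = anchor;
  }

  @Override
  public boolean equals(Object o) {
    if (!(o instanceof Inlink)) return false;
    Inlink other = (Inlink) o;
    return fromUrl.equals(other.fromUrl) && anchor.equals(other.anchor);
  }

  @Override
  public int hashCode() {
    return Objects.hash(fromUrl, anchor);
  }
}

public class Inlinks {
  // LinkedHashSet preserves insertion order while enforcing uniqueness;
  // adding an equal Inlink a second time is silently ignored.
  private final Set<Inlink> inlinks = new LinkedHashSet<>();

  public void add(Inlink inlink) {
    inlinks.add(inlink);
  }

  public int size() {
    return inlinks.size();
  }

  public static void main(String[] args) {
    Inlinks links = new Inlinks();
    // Simulate reducing over two identical segments: the same
    // inlink is produced twice, but is only stored once.
    links.add(new Inlink("http://example.com/a", "anchor text"));
    links.add(new Inlink("http://example.com/a", "anchor text"));
    links.add(new Inlink("http://example.com/b", "other anchor"));
    System.out.println(links.size()); // prints 2, not 3
  }
}
```

The implicit alternative (keeping the List but calling contains() before add()) gives the same semantics at O(n) per insertion, so a hash-based Set is the cheaper choice when pages have many inlinks.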
|Status||Open [ 1 ]||Closed [ 6 ]|
|Resolution||Fixed [ 1 ]|
|Fix Version/s||0.8-dev [ 12310224 ]|