Details
- Type: New Feature
- Status: Closed
- Priority: Minor
- Resolution: Not A Problem
Description
The RemoveDuplicatesTokenFilter seems broken, as it initializes its map and attributes at the class level rather than in its constructor.

In addition, I would expect the filter to remove identical terms that share the same offsets; instead it removes duplicates based on position increment, which does not work when the filter is applied after something like the EdgeNGram filter. When I posted this to the mailing list, even Erik Hatcher seemed to think that is what this filter was supposed to do.
Attaching a patch that implements the expected behaviour and initializes the variables in the constructor.
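The behaviour the patch proposes can be illustrated with a small self-contained sketch (plain Java, not the actual Lucene filter API; the `Token` record and `removeDuplicates` method here are hypothetical names for illustration only): a token is a duplicate only when both its term text and its offsets match an earlier token, so position increments are never consulted.

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

// Hypothetical token: term text plus start/end character offsets.
record Token(String term, int start, int end) {}

class DedupSketch {
    // Drop tokens whose term text AND offsets match an earlier token,
    // keeping the first occurrence and the original order. This mirrors
    // the proposed behaviour: duplicates at the same offsets are removed,
    // regardless of position increments.
    static List<Token> removeDuplicates(List<Token> in) {
        // Record equality covers term + offsets; LinkedHashSet keeps order.
        Set<Token> seen = new LinkedHashSet<>(in);
        return new ArrayList<>(seen);
    }

    public static void main(String[] args) {
        // An edge n-gram style stream can emit the same term twice at the
        // same offsets; only one copy should survive.
        List<Token> tokens = List.of(
            new Token("f", 0, 1),
            new Token("fo", 0, 2),
            new Token("fo", 0, 2),   // duplicate: same term, same offsets
            new Token("foo", 0, 3));
        System.out.println(removeDuplicates(tokens));
    }
}
```

Note that two tokens with the same term text but different offsets (e.g. the same word appearing twice in the input) are deliberately kept, since they are distinct occurrences rather than duplicates introduced by an upstream filter.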