Details
- Type: New Feature
- Status: Closed
- Priority: Minor
- Resolution: Fixed
- New, Patch Available
Description
A TokenFilter that emits a single token which is a sorted, de-duplicated set of the input tokens.
This approach to normalizing text is used in tools like OpenRefine [1] and elsewhere [2] to help cluster or link texts.
The implementation proposed here places an upper limit on the size of the combined token that is output.
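The core normalization step described above (sort the tokens, drop duplicates, join them into one token, and discard the result if it exceeds a size cap) can be sketched in plain Java as follows. The class and method names here are hypothetical illustrations of the idea, not the API of the actual patch:

```java
import java.util.Arrays;
import java.util.SortedSet;
import java.util.TreeSet;

// Hypothetical sketch of the fingerprinting step, not the actual patch.
public class FingerprintSketch {

    /**
     * Sorts and de-duplicates the input tokens, then joins them with the
     * given separator. Returns null when the combined token would exceed
     * maxOutputTokenSize (the proposed upper limit on the output token).
     */
    public static String fingerprint(String[] tokens, char separator,
                                     int maxOutputTokenSize) {
        // TreeSet gives us sorting and de-duplication in one pass
        SortedSet<String> sorted = new TreeSet<>(Arrays.asList(tokens));
        StringBuilder sb = new StringBuilder();
        for (String token : sorted) {
            if (sb.length() > 0) {
                sb.append(separator);
            }
            sb.append(token);
        }
        // Enforce the upper limit on the size of the emitted combined token
        if (sb.length() > maxOutputTokenSize) {
            return null;
        }
        return sb.toString();
    }
}
```

For example, the token stream ["b", "a", "b"] would fingerprint to the single token "a b", so two texts containing the same words in any order and multiplicity map to the same key for clustering.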
[1] https://github.com/OpenRefine/OpenRefine/wiki/Clustering-In-Depth
[2] https://rajmak.wordpress.com/2013/04/27/clustering-text-map-reduce-in-python/