Details
- Type: Bug
- Status: Closed
- Priority: Major
- Resolution: Fixed
- Affects Version/s: 1.3
- Fix Version/s: None
- Environment: any
Description
If a field is split into tokens (by a tokenizer) and NGramFilterFactory is then applied to those tokens, indexing works as long as each token's length is greater than or equal to the minimum n-gram size (usually 3). Otherwise, indexing breaks at that point and the remaining tokens are no longer indexed. This behaviour can easily be observed with the analysis tool in the Solr admin interface.
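The behaviour described above can be sketched without Lucene itself. The following is a minimal, self-contained illustration (not the actual NGramTokenFilter implementation): a helper that emits the n-grams of one token, a loop that simulates the reported bug by treating the first too-short token as end-of-stream, and a loop showing the expected fixed behaviour where a short token simply yields no grams. All class and method names here are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;

public class NGramSketch {
    // Emit all n-grams of length minGram..maxGram for a single token.
    static List<String> ngrams(String token, int minGram, int maxGram) {
        List<String> out = new ArrayList<>();
        for (int n = minGram; n <= maxGram; n++) {
            for (int i = 0; i + n <= token.length(); i++) {
                out.add(token.substring(i, i + n));
            }
        }
        return out; // empty when token.length() < minGram
    }

    public static void main(String[] args) {
        int minGram = 3, maxGram = 3;
        String[] tokens = {"solr", "is", "great"};

        // Buggy behaviour from the report: the filter stops at the first
        // token shorter than minGram, so "great" is never indexed.
        List<String> buggy = new ArrayList<>();
        for (String t : tokens) {
            if (t.length() < minGram) break; // simulates the reported bug
            buggy.addAll(ngrams(t, minGram, maxGram));
        }
        System.out.println("buggy: " + buggy); // grams of "solr" only

        // Fixed behaviour: a short token contributes no grams, and the
        // filter simply moves on to the next token in the stream.
        List<String> fixed = new ArrayList<>();
        for (String t : tokens) {
            fixed.addAll(ngrams(t, minGram, maxGram));
        }
        System.out.println("fixed: " + fixed); // grams of "solr" and "great"
    }
}
```

Running this prints `buggy: [sol, olr]` versus `fixed: [sol, olr, gre, rea, eat]`, matching the symptom that everything after the short token `"is"` goes unindexed.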
Attachments
Issue Links
- depends upon: LUCENE-1491 EdgeNGramTokenFilter stops on tokens smaller than minimum gram size. (Closed)