Details
- Type: Bug
- Status: Resolved
- Priority: Minor
- Resolution: Won't Fix
- Patch Available
Description
The issue is much the same as https://issues.apache.org/jira/browse/LUCENE-1224
Attachments
Issue Links
- is part of: LUCENE-1227 NGramTokenizer to handle more than 1024 chars (Resolved)
- relates to: LUCENE-1224 NGramTokenFilter creates bad TokenStream (Closed)