Details
Type: New Feature
Status: Closed
Priority: Minor
Resolution: Fixed
Description
Lucene provides Tokenizers for many languages. While the OpenNLP or whitespace character-based Tokenizers are fine for most languages, this feature lets users plug in a specialized one (e.g., the smartcn analyzer package for Chinese); see the sketch below.
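For illustration, a minimal sketch of tokenizing a Chinese sentence with the smartcn analyzer. It assumes Lucene's lucene-analyzers-smartcn module is on the classpath; the field name and sample sentence are arbitrary.

{code:java}
import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.cn.smart.SmartChineseAnalyzer;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;

public class SmartcnExample {
    public static void main(String[] args) throws Exception {
        // SmartChineseAnalyzer ships in the lucene-analyzers-smartcn module.
        try (Analyzer analyzer = new SmartChineseAnalyzer();
             // Field name "body" is arbitrary for this demo.
             TokenStream ts = analyzer.tokenStream("body", "我爱北京天安门")) {
            CharTermAttribute term = ts.addAttribute(CharTermAttribute.class);
            ts.reset();
            while (ts.incrementToken()) {
                // Prints one segmented word per line.
                System.out.println(term.toString());
            }
            ts.end();
        }
    }
}
{code}

Because Chinese text contains no spaces, a whitespace Tokenizer would emit the whole sentence as a single token, while smartcn performs word segmentation and yields separate word-level tokens.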