  Lucene - Core
  LUCENE-2407

make CharTokenizer.MAX_WORD_LEN parametrizable


    Details

    • Type: Improvement
    • Status: Open
    • Priority: Minor
    • Resolution: Unresolved
    • Affects Version/s: 3.0.1
    • Fix Version/s: 4.9, 6.0
    • Component/s: modules/analysis
    • Labels:
    • Lucene Fields: New

      Description

      As discussed at http://n3.nabble.com/are-long-words-split-into-up-to-256-long-tokens-tp739914p739914.html, it would be nice to be able to parametrize this value: CharTokenizer currently caps token length at the hard-coded MAX_WORD_LEN constant, so longer words get split into multiple fixed-length tokens.
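
      One possible shape for the change, sketched here as a standalone class rather than the real CharTokenizer (the class name, constructor signature and the LetterTokenizer-style isTokenChar are illustrative only): the limit becomes a constructor argument defaulting to the current constant, and the length check inside the read loop compares against that field instead of a fixed value.

      import java.io.IOException;
      import java.io.Reader;
      import java.io.StringReader;
      import java.util.ArrayList;
      import java.util.List;

      /**
       * Simplified illustration (not the actual Lucene class) of making the
       * maximum token length configurable. Tokens longer than maxWordLen are
       * split, as the hard-coded constant forces today, but the cap is now
       * chosen by the caller.
       */
      public class ConfigurableLengthTokenizer {

        // Default chosen to match the current hard-coded constant (255 in the 3.x sources).
        public static final int DEFAULT_MAX_WORD_LEN = 255;

        private final Reader input;
        private final int maxWordLen;

        public ConfigurableLengthTokenizer(Reader input) {
          this(input, DEFAULT_MAX_WORD_LEN);
        }

        public ConfigurableLengthTokenizer(Reader input, int maxWordLen) {
          if (maxWordLen <= 0) {
            throw new IllegalArgumentException("maxWordLen must be > 0: " + maxWordLen);
          }
          this.input = input;
          this.maxWordLen = maxWordLen;
        }

        /** Token characters are letters here, mirroring LetterTokenizer. */
        protected boolean isTokenChar(char c) {
          return Character.isLetter(c);
        }

        /** Reads the whole input and returns the tokens, splitting at maxWordLen. */
        public List<String> tokenize() throws IOException {
          List<String> tokens = new ArrayList<String>();
          StringBuilder current = new StringBuilder();
          int ch;
          while ((ch = input.read()) != -1) {
            if (isTokenChar((char) ch)) {
              current.append((char) ch);
              // The check that today compares against the fixed MAX_WORD_LEN.
              if (current.length() == maxWordLen) {
                tokens.add(current.toString());
                current.setLength(0);
              }
            } else if (current.length() > 0) {
              tokens.add(current.toString());
              current.setLength(0);
            }
          }
          if (current.length() > 0) {
            tokens.add(current.toString());
          }
          return tokens;
        }

        public static void main(String[] args) throws IOException {
          String text = "supercalifragilisticexpialidocious and more";
          // With a cap of 10, the long word is emitted as 10-character chunks.
          ConfigurableLengthTokenizer t =
              new ConfigurableLengthTokenizer(new StringReader(text), 10);
          System.out.println(t.tokenize());
        }
      }

      Running the main method with a cap of 10 emits the long word as 10-character chunks, which mirrors how the fixed constant behaves today, only with a caller-chosen limit.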

        Attachments

          Activity

            People

            • Assignee: Unassigned
            • Reporter: jmwap jmwap
            • Votes: 0
            • Watchers: 0

              Dates

              • Created:
                Updated: