Lucene - Core · LUCENE-3907

Improve the Edge/NGramTokenizer/Filters

Details

    • Type: Improvement
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 4.4
    • Component/s: None
    • Lucene Fields: New

    Description

      Our n-gram tokenizers/filters could use some love. E.g., they output n-grams in multiple passes instead of "stacked" at the same position, which messes up offsets/positions and requires too much buffering (which can hit OOME for long tokens). The tokenizers clip input at 1024 chars, but the token filters don't. They also split up surrogate pairs incorrectly.
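      A minimal sketch of what a fixed generator might do, not Lucene's actual implementation: for each start position, all gram sizes from minGram to maxGram are emitted together ("stacked"), and positions are counted in code points rather than chars, so surrogate pairs are never split. The class and method names here are hypothetical.

      ```java
      import java.util.ArrayList;
      import java.util.List;

      public class NGramSketch {
          // Emit all n-grams of length minGram..maxGram, measured in code
          // points. Grams for the same start position appear adjacently,
          // mimicking "stacked" output at one position.
          public static List<String> ngrams(String text, int minGram, int maxGram) {
              List<String> grams = new ArrayList<>();
              int len = text.codePointCount(0, text.length());
              for (int start = 0; start < len; start++) {
                  for (int size = minGram; size <= maxGram && start + size <= len; size++) {
                      // Convert code-point indices back to char offsets so a
                      // surrogate pair is always kept whole.
                      int from = text.offsetByCodePoints(0, start);
                      int to = text.offsetByCodePoints(0, start + size);
                      grams.add(text.substring(from, to));
                  }
              }
              return grams;
          }

          public static void main(String[] args) {
              // "a" + U+1F600 (a surrogate pair in UTF-16) + "b":
              // the emoji is treated as a single unit.
              System.out.println(ngrams("a\uD83D\uDE00b", 1, 2));
          }
      }
      ```

      A char-indexed loop over the same input would emit grams that cut the emoji in half; iterating by code points sidesteps that entirely.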

      Attachments

        1. LUCENE-3907.patch (43 kB, Adrien Grand)

            People

              Assignee: Adrien Grand (jpountz)
              Reporter: Michael McCandless (mikemccand)
              Votes: 2
              Watchers: 12
