Lucene - Core
LUCENE-8516

Make WordDelimiterGraphFilter a Tokenizer

Details

    • Type: Task
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: None
    • Labels: None
    • Lucene Fields: New

Description

    Being able to split tokens at arbitrary points in a filter chain, in effect adding a second round of tokenization, can cause any number of problems when trying to keep token streams consistent with the TokenStream contract. The most common offender here is WordDelimiterGraphFilter, which can produce broken offsets in a wide range of situations.

    We should make WDGF a Tokenizer in its own right. This should preserve all the functionality we need, while making it much simpler to reason about the resulting token stream.
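
    For context, here is a minimal sketch of how WDGF is typically wired up today, as a TokenFilter applied after an initial tokenizer. The choice of WhitespaceTokenizer and the particular flag set are illustrative only, not taken from the attached patch:

        import org.apache.lucene.analysis.Analyzer;
        import org.apache.lucene.analysis.TokenStream;
        import org.apache.lucene.analysis.Tokenizer;
        import org.apache.lucene.analysis.core.WhitespaceTokenizer;
        import org.apache.lucene.analysis.miscellaneous.WordDelimiterGraphFilter;

        public class WdgfChainExample {
          // Today WDGF runs as a TokenFilter: the Tokenizer has already
          // assigned token offsets, and WDGF re-splits those tokens
          // afterwards. If an intermediate filter has changed a token's
          // text, the sub-token offsets WDGF computes may no longer
          // point at the original input.
          public static Analyzer wdgfAnalyzer() {
            return new Analyzer() {
              @Override
              protected TokenStreamComponents createComponents(String fieldName) {
                Tokenizer source = new WhitespaceTokenizer();
                int flags = WordDelimiterGraphFilter.GENERATE_WORD_PARTS
                    | WordDelimiterGraphFilter.GENERATE_NUMBER_PARTS
                    | WordDelimiterGraphFilter.SPLIT_ON_CASE_CHANGE;
                TokenStream sink = new WordDelimiterGraphFilter(source, flags, null);
                return new TokenStreamComponents(source, sink);
              }
            };
          }
        }

    With WDGF as a Tokenizer, the splitting would instead happen at the point where offsets are first assigned, so downstream filters could rely on those offsets being correct.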

Attachments

    1. LUCENE-8516.patch (51 kB, Alan Woodward)


People

    Assignee: Alan Woodward (romseygeek)
    Reporter: Alan Woodward (romseygeek)
    Votes: 0
    Watchers: 3
