Lucene - Core / LUCENE-4993

BeiderMorseFilter inserts tokens with positionIncrement=0, but ignores all custom attributes except OffsetAttribute

Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 4.3
    • Fix Version/s: 4.3.1, 6.0
    • Component/s: modules/analysis
    • Labels: None
    • Lucene Fields: New, Patch Available

    Description

      BeiderMorseFilter sometimes inserts additional phonetic tokens for the same source token. Currently it calls clearAttributes() before doing so and then sets only the new token's term, a positionIncrement of 0, and the original offsets.

      This causes problems if the TokenStream carries other attributes set by earlier filters (such as KeywordAttribute, FlagsAttribute, ...): they are all reset to their defaults on the inserted tokens.

      The TokenFilter should drop the special case for preserving offsets and instead use captureState() and restoreState(), as sketched below.
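
      A minimal sketch of the captureState()/restoreState() pattern, not the actual patch: the class name StateRestoringInjectFilter and the "uppercase variant" injection rule are made up for illustration, and only the state handling mirrors what is proposed above.

      import java.io.IOException;
      import java.util.Locale;

      import org.apache.lucene.analysis.TokenFilter;
      import org.apache.lucene.analysis.TokenStream;
      import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;
      import org.apache.lucene.analysis.tokenattributes.PositionIncrementAttribute;
      import org.apache.lucene.util.AttributeSource;

      /** Hypothetical filter illustrating the proposed state handling. */
      public final class StateRestoringInjectFilter extends TokenFilter {
        private final CharTermAttribute termAtt = addAttribute(CharTermAttribute.class);
        private final PositionIncrementAttribute posIncAtt =
            addAttribute(PositionIncrementAttribute.class);

        private AttributeSource.State savedState; // full attribute state of the current source token
        private String pendingInjection;          // made-up extra term to emit at the same position

        public StateRestoringInjectFilter(TokenStream input) {
          super(input);
        }

        @Override
        public boolean incrementToken() throws IOException {
          if (pendingInjection != null) {
            // Restore every attribute of the original token (offsets, KeywordAttribute,
            // FlagsAttribute, ...), then overwrite only what differs for the injected token.
            restoreState(savedState);
            termAtt.setEmpty().append(pendingInjection);
            posIncAtt.setPositionIncrement(0); // stack on the original token
            pendingInjection = null;
            return true;
          }
          if (!input.incrementToken()) {
            return false;
          }
          // Made-up rule standing in for the phonetic encodings: also emit an
          // uppercased variant of each token at the same position.
          pendingInjection = termAtt.toString().toUpperCase(Locale.ROOT);
          savedState = captureState();
          return true;
        }

        @Override
        public void reset() throws IOException {
          super.reset();
          savedState = null;
          pendingInjection = null;
        }
      }

      Because restoreState() copies the complete attribute state, no separate handling of offsets is needed: they come back along with every other attribute.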

      Attachments

        1. LUCENE-4993.patch (4 kB, Uwe Schindler)

            People

              Assignee: uschindler (Uwe Schindler)
              Reporter: uschindler (Uwe Schindler)
