Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 4.9, 6.0
    • Component/s: None
    • Labels: None
    • Lucene Fields: New

Description

Previously this stemmer used three hashes (prefixes, words, suffixes) and tried splitting each word in various ways and doing lookups against them. It was changed to use an FST, but the algorithm was not adjusted to use the FST properly (e.g. traverse it in a single pass and terminate as soon as it reaches a "dead end").

This makes indexing slower when this stemmer is used.
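
For illustration only, here is a minimal sketch of a single-pass lookup with early termination against org.apache.lucene.util.fst.FST (Lucene 4.x API). The class and method names are hypothetical and this is not the actual patch; it only shows the idea of walking the FST once and stopping at the first dead end instead of re-splitting the word and retrying lookups.

    import java.io.IOException;
    import org.apache.lucene.util.fst.FST;

    final class FstPrefixWalk {
      // Hypothetical helper: walks the FST once over word[0..length) and returns
      // the length of the longest prefix ending in a final (accepting) state,
      // or -1 if there is none.
      static <T> int longestFinalPrefix(FST<T> fst, char[] word, int length) throws IOException {
        FST.BytesReader reader = fst.getBytesReader();
        FST.Arc<T> arc = fst.getFirstArc(new FST.Arc<T>());
        int longest = -1;
        for (int i = 0; i < length; i++) {
          // one findTargetArc call per character: a single pass over the input
          if (fst.findTargetArc(word[i], arc, arc, reader) == null) {
            break; // dead end: no dictionary entry can continue from this prefix
          }
          if (arc.isFinal()) {
            longest = i + 1; // remember the longest dictionary match seen so far
          }
        }
        return longest;
      }
    }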

People

    • Assignee: Unassigned
    • Reporter: Robert Muir (rcmuir)
    • Votes: 0
    • Watchers: 2
