Details

    • Type: New Feature
    • Status: Resolved
    • Priority: Trivial
    • Resolution: Won't Fix
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: modules/analysis
    • Labels: None
    • Lucene Fields: New, Patch Available

Description

Alternative NGram filter that produces tokens with composite prefix and suffix markers:

    // Bigrams for "hello": the first gram carries a '^' prefix marker and the
    // last gram a '$' suffix marker (assertNext checks the next token's text).
    TokenStream ts = new WhitespaceTokenizer(new StringReader("hello"));
    ts = new CombinedNGramTokenFilter(ts, 2, 2);
    assertNext(ts, "^h");
    assertNext(ts, "he");
    assertNext(ts, "el");
    assertNext(ts, "ll");
    assertNext(ts, "lo");
    assertNext(ts, "o$");
    assertNull(ts.next());
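
For context, the sketch below shows the gram generation that the output above implies, independent of the Lucene TokenStream API: the term is wrapped in '^' and '$' boundary markers before the n-grams are cut. The class and method names (CombinedNGramSketch, ngrams) are illustrative only and are not taken from the attached patch.

    import java.util.ArrayList;
    import java.util.List;

    // Illustrative only: mirrors the expected output of CombinedNGramTokenFilter
    // for a single term, without the TokenStream plumbing.
    public class CombinedNGramSketch {

        static List<String> ngrams(String term, int minGram, int maxGram) {
            // Anchor the term with boundary markers, e.g. "hello" -> "^hello$",
            // so the first and last grams carry the prefix/suffix markers.
            String padded = "^" + term + "$";
            List<String> out = new ArrayList<String>();
            for (int n = minGram; n <= maxGram; n++) {
                for (int i = 0; i + n <= padded.length(); i++) {
                    out.add(padded.substring(i, i + n));
                }
            }
            return out;
        }

        public static void main(String[] args) {
            // Prints [^h, he, el, ll, lo, o$], matching the assertions above.
            System.out.println(ngrams("hello", 2, 2));
        }
    }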
      
Attachments

    1. LUCENE-1306.txt (9 kB) by Karl Wettin
    2. LUCENE-1306.txt (15 kB) by Karl Wettin

People

    • Assignee: Karl Wettin
    • Reporter: Karl Wettin
    • Votes: 0
    • Watchers: 1
