LUCENE-1489: highlighter problem with n-gram tokens


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Won't Fix
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: modules/highlighter
    • Labels: None
    • Lucene Fields: New

    Description

I have a problem when using n-gram tokenization together with the highlighter. I thought it had been solved in LUCENE-627...

      I originally found this problem when using CJKTokenizer on Solr, but here is a Lucene program that reproduces it using NGramTokenizer(min=2,max=2) instead of CJKTokenizer:

import java.io.Reader;

      import org.apache.lucene.analysis.Analyzer;
      import org.apache.lucene.analysis.TokenStream;
      // NGramTokenizer and the highlighter classes live in the contrib modules.
      import org.apache.lucene.analysis.ngram.NGramTokenizer;
      import org.apache.lucene.queryParser.QueryParser;
      import org.apache.lucene.search.Query;
      import org.apache.lucene.search.highlight.Highlighter;
      import org.apache.lucene.search.highlight.QueryScorer;

      public class TestNGramHighlighter {

        public static void main(String[] args) throws Exception {
          Analyzer analyzer = new NGramAnalyzer();
          final String TEXT = "Lucene can make index. Then Lucene can search.";
          final String QUERY = "can";
          QueryParser parser = new QueryParser("f", analyzer);
          Query query = parser.parse(QUERY);
          QueryScorer scorer = new QueryScorer(query, "f");
          Highlighter h = new Highlighter(scorer);
          System.out.println(h.getBestFragment(analyzer, "f", TEXT));
        }

        // Analyzer that splits the input into overlapping 2-grams.
        static class NGramAnalyzer extends Analyzer {
          public TokenStream tokenStream(String field, Reader input) {
            return new NGramTokenizer(input, 2, 2);
          }
        }
      }
      

Expected output:
      Lucene <B>can</B> make index. Then Lucene <B>can</B> search.

      But the actual output is:
      Lucene <B>can make index. Then Lucene can</B> search.
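
      Note that the grams overlap: the query string "can" is itself tokenized into "ca" and "an", and both of those grams also occur inside each "can" in the text. A minimal sketch to dump the tokens and their offsets, assuming the same Lucene 2.x-era contrib API as the program above (the class name DumpNGrams is just for illustration):

      import java.io.StringReader;

      import org.apache.lucene.analysis.Token;
      import org.apache.lucene.analysis.ngram.NGramTokenizer;

      public class DumpNGrams {
        public static void main(String[] args) throws Exception {
          // Tokenize the query string exactly as NGramAnalyzer above would.
          NGramTokenizer ts = new NGramTokenizer(new StringReader("can"), 2, 2);
          Token t;
          while ((t = ts.next()) != null) {
            // Expected to print: ca [0,2] and an [1,3], i.e. two overlapping bigrams.
            System.out.println(t.termText() + " [" + t.startOffset() + "," + t.endOffset() + "]");
          }
        }
      }

      Since every occurrence of "can" in the text yields matches for both grams, the actual output above shows the highlighter collapsing everything between the first and the last matching gram into a single <B> tag.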

Attachments

        1. lucene1489.patch (3 kB, David Bowen)
        2. LUCENE-1489.patch (3 kB, David Bowen)

            People

              Unassigned Unassigned
              koji Koji Sekiguchi
              Votes:
              3 Vote for this issue
              Watchers:
              7 Start watching this issue

Dates

        Created:
        Updated:
        Resolved: