Lucene - Core / LUCENE-759

Add n-gram tokenizers to contrib/analyzers

Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Component: modules/analysis
    • Lucene Fields: New, Patch Available

    Description

      It would be nice to have some n-gram-capable tokenizers in contrib/analyzers. Patch coming shortly.
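For context, a character n-gram tokenizer slides a window over the input text and emits every substring whose length falls between a minimum and maximum gram size. The sketch below illustrates that technique only; it is not the attached patch, and the class and method names (`NGramDemo`, `ngrams`) are hypothetical, not Lucene API:

```java
import java.util.ArrayList;
import java.util.List;

public class NGramDemo {
    // Emit every substring of length minGram..maxGram,
    // advancing the window one character at a time.
    public static List<String> ngrams(String text, int minGram, int maxGram) {
        List<String> grams = new ArrayList<>();
        for (int n = minGram; n <= maxGram; n++) {
            for (int i = 0; i + n <= text.length(); i++) {
                grams.add(text.substring(i, i + n));
            }
        }
        return grams;
    }

    public static void main(String[] args) {
        // "abcd" with gram sizes 2..3 yields: ab, bc, cd, abc, bcd
        System.out.println(ngrams("abcd", 2, 3));
    }
}
```

In a real Lucene analyzer the same windowing logic would run inside a Tokenizer, emitting one token per gram instead of collecting them in a list.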

      Attachments

        1. LUCENE-759.patch
          15 kB
          Otis Gospodnetic
        2. LUCENE-759.patch
          12 kB
          Otis Gospodnetic
        3. LUCENE-759.patch
          11 kB
          Otis Gospodnetic
        4. LUCENE-759-filters.patch
          19 kB
          Otis Gospodnetic

People

    Assignee: Otis Gospodnetic (otis)
    Reporter: Otis Gospodnetic (otis)
    Votes: 0
    Watchers: 0

