Details

Type: Improvement

Status: Closed

Priority: Minor

Resolution: Fixed

Affects Version/s: None

Component/s: modules/other

Labels: None

Lucene Fields: New, Patch Available
Description
I ran into this problem quite a few times over the last couple of months: searches very often contained super-high-frequency terms, and disjunction queries became way too slow. The main problem was that stopword filtering wasn't really an option, since in that domain those high-frequency terms were not actually stopwords. For instance, searching for a song title like "this is it" or for the band "A" didn't really fly with stopwords. I thought about it for a while and came up with a query-based solution that decides, based on a threshold, whether a term is treated as a stopword. If so, it splits the terms into two boolean subqueries, one for the high-frequency and one for the low-frequency terms, such that the high-frequency terms are only matched if the low-frequency subquery produces a match. If all terms are high-frequency, it turns the entire query into a conjunction, which gave me reasonable results as well as performance.
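The split described above can be sketched in a few lines. This is only an illustration of the rewrite logic, not the actual patch or Lucene API: the class, the query-string output, and the docFreq map are all assumptions made for the example, with a fixed per-term frequency cutoff.

```java
import java.util.*;

// Sketch of the proposed rewrite: partition query terms by document frequency,
// require the low-frequency group, and let the high-frequency group only
// contribute when the low-frequency group matched. If every term is
// high-frequency, fall back to a conjunction of all terms.
class HighFreqQuerySketch {
    static String buildQuery(Map<String, Integer> docFreq, List<String> terms, int cutoff) {
        List<String> low = new ArrayList<>(), high = new ArrayList<>();
        for (String t : terms) {
            if (docFreq.getOrDefault(t, 0) > cutoff) high.add(t); else low.add(t);
        }
        if (low.isEmpty()) {
            // every term is high-frequency: make the whole thing a conjunction
            return "+(" + String.join(" AND ", high) + ")";
        }
        if (high.isEmpty()) {
            return "(" + String.join(" OR ", low) + ")";
        }
        // high-frequency terms are optional; low-frequency terms are required
        return "+(" + String.join(" OR ", low) + ") (" + String.join(" OR ", high) + ")";
    }
}
```

For a query like "this is it song", only "song" would slip under a typical cutoff, so the stopword-like terms still influence scoring but never drive the match on their own.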
Activity
This is nice.
Would it make sense, perhaps, to base the cutoff on the cumulative document frequency – i.e. sort terms by DF, then add terms into the MUST subquery one at a time until a limit on the total DF of all added terms is exceeded? The remaining terms then get added into a SHOULD subquery.
This seems like it would set an upper bound on the total number of documents scored, or on the total number of postings-list entries which need to be inspected to select documents for scoring. (Good chance I'm missing something here, mind...)
Whereas with a cutoff based on per-term doc frequency, you could have arbitrarily many terms introduced into the MUST subquery, provided they all slip under the per-term DF threshold, and hence arbitrarily many documents scored.
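The cumulative-DF variant suggested above could be sketched like this; again the names and the docFreq map are illustrative assumptions, not part of the patch.

```java
import java.util.*;

// Sketch of the cumulative-DF cutoff: sort terms ascending by document
// frequency, add terms to the required (MUST) group while the running DF
// total stays within a budget, and make the remaining terms optional (SHOULD).
class CumulativeCutoffSketch {
    static List<List<String>> partition(Map<String, Integer> docFreq, List<String> terms, long budget) {
        List<String> sorted = new ArrayList<>(terms);
        sorted.sort(Comparator.comparingInt(t -> docFreq.getOrDefault(t, 0)));
        List<String> must = new ArrayList<>(), should = new ArrayList<>();
        long total = 0;
        for (String t : sorted) {
            total += docFreq.getOrDefault(t, 0);
            if (total <= budget) must.add(t); else should.add(t);
        }
        // index 0: MUST terms, index 1: SHOULD terms
        return Arrays.asList(must, should);
    }
}
```

Because the MUST group's combined DF never exceeds the budget, the number of postings entries inspected to select candidate documents is bounded regardless of how many terms the query contains.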