I would like to implement a fuzzy search with Python Whoosh, but I can't get it to work. I've tried to make the fuzzy search possible with the help of NGRAMWORDS.
Here is my schema:
from whoosh.fields import Schema, ID, NGRAMWORDS

schema = Schema(id=ID(stored=True),
                name=NGRAMWORDS(minsize=2, maxsize=4, stored=True, queryor=True),
                street=NGRAMWORDS(minsize=2, maxsize=4, stored=True, queryor=True),
                city=NGRAMWORDS(minsize=2, maxsize=4, stored=True, queryor=False))
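For context, this schema is turned into an on-disk index in the usual way (a minimal sketch, assuming the index lives in a directory called "index", as in the open_dir call further down):

import os
from whoosh.index import create_in

# create_in expects the directory to exist already
if not os.path.exists("index"):
    os.mkdir("index")
index = create_in("index", schema)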
The index is then populated as follows:
writer.add_document(id=unicode(row["id"]),
                    name=unicode(row["name"]),
                    street=unicode(row["street"]),
                    city=unicode(row["city"]))
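The writer itself comes from the index, and the documents only become searchable after commit() (a minimal sketch of the surrounding code, which isn't shown above):

writer = index.writer()
# ... one writer.add_document(...) call per row, as above ...
writer.commit()  # without commit() the added documents are never persisted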
Unfortunately, the search retrieves no results from the index:
from whoosh.query import Term, Or, FuzzyTerm
from whoosh.analysis import NgramWordAnalyzer

with self.index.searcher() as searcher:
    ngramAnalyzer = NgramWordAnalyzer(minsize=2, maxsize=4)
    tokens = [token.text for token in ngramAnalyzer(unicode(name))]
    fetig = list()
    for t in tokens:
        tt = FuzzyTerm("name", unicode(t))
        fetig.append(tt)
    myQuery = Or(fetig)
    res = searcher.search(myQuery, limit=10)
I get zero hits back when searching for "Ali":
<Top 0 Results for Or([FuzzyTerm('name', u'al', boost=1.000000, maxdist=1, prefixlength=1), FuzzyTerm('name', u'ali', boost=1.000000, maxdist=1, prefixlength=1), FuzzyTerm('name', u'li', boost=1.000000, maxdist=1, prefixlength=1)]) runtime=0.000411987304688>
It is now solved. The problem was that the already existing index wasn't being opened via

index = open_dir("index", schema=self.schema)

Instead, a new (and therefore empty) index was created each time.
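A minimal sketch of the open-or-create decision, using whoosh.index.exists_in (the directory name "index" comes from the call above, and schema is the one defined at the top):

import os
from whoosh.index import create_in, open_dir, exists_in

if exists_in("index"):
    # reuse the already populated index instead of replacing it
    index = open_dir("index", schema=schema)
else:
    if not os.path.exists("index"):
        os.mkdir("index")
    # create_in builds a brand-new, empty index in the directory
    index = create_in("index", schema)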
Furthermore, in the query it was crucial to use Term instead of FuzzyTerm in order to get plausible results:
from whoosh.query import Term, Or
from whoosh.analysis import NgramWordAnalyzer

with self.index.searcher() as searcher:
    # split the search string into the same n-grams that were indexed
    ngramAnalyzer = NgramWordAnalyzer(minsize=3, maxsize=6)
    tokens = [token.text for token in ngramAnalyzer(unicode(name))]
    # one exact Term per n-gram, OR-ed together
    fetig = list()
    for t in tokens:
        fetig.append(Term("name", unicode(t)))
    myQuery = Or(fetig)
    res = searcher.search(myQuery, limit=10)
As you can see, I also increased the minsize of the NGRAMWORDS from 2 to 3.
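Because name, street and city are stored in the schema, the matching documents can be read directly from the hits while the searcher is still open, for example:

    # still inside the "with self.index.searcher() as searcher:" block
    for hit in res:
        print hit["id"], hit["name"], hit["city"]  # stored fields via dict-style access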
Thank you for your great work, Matt Chaput.