Re: Full Text Search - Slow on common words

From: "Reid Thompson" <Reid(dot)Thompson(at)ateb(dot)com>
To: <steve(at)subwest(dot)com>
Cc: "Reid Thompson" <Reid(dot)Thompson(at)ateb(dot)com>, <pgsql-general(at)postgresql(dot)org>
Subject: Re: Full Text Search - Slow on common words
Date: 2010-10-28 19:56:30
Message-ID: 7C0800F63CCF4149AC0FC5EE2A041226D52386@sr002-2k3exc.ateb.com
Lists: pgsql-general

On Thu, 2010-10-28 at 12:08 -0700, sub3 wrote:
> Hi,
>
> I have a small web page set up to search within my domain based on keywords.
> One of the queries is:
> SELECT page.id, ts_rank_cd('{1.0, 1.0, 1.0, 1.0}', contFTI, q) FROM page,
> to_tsquery('steve') as q WHERE contFTI @@ q
>
> My problem is: when someone puts in a commonly seen word, the system slows
> down and takes a while because of the large amount of data being returned
> (retrieved from the table) and processed by the ts_rank_cd function.
>
> How does everyone else handle something like this? I can only think of 2
> possible solutions:
> - change the query to search for the same terms at least twice in the same
> document (can I do that?)
> - limit any searches to x results before ranking and tell the user their
> search criteria are too generic.
>
> Is there a better solution that I am missing?
>

If the keyword is that common, is it really a keyword? Exclude it.
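
One way to do that is at the text search configuration level, so the
over-common term never reaches the index or the query at all. A minimal
sketch, assuming a made-up configuration name english_nostop and a stop
word file my_stop.stop (one word per line, placed in the server's
$SHAREDIR/tsearch_data/ directory):

-- dictionary like the built-in english stemmer, but reading its stop
-- words from my_stop.stop
CREATE TEXT SEARCH DICTIONARY english_nostop (
    TEMPLATE  = snowball,
    Language  = english,
    StopWords = my_stop
);

-- copy of the built-in english configuration, switched to use it
CREATE TEXT SEARCH CONFIGURATION public.english_nostop (
    COPY = pg_catalog.english
);

ALTER TEXT SEARCH CONFIGURATION public.english_nostop
    ALTER MAPPING FOR asciiword, asciihword, hword_asciipart,
                      word, hword, hword_part
    WITH english_nostop;

-- a stop-listed word now drops out of the query instead of matching
-- (and ranking) most of the table
SELECT to_tsquery('public.english_nostop', 'steve');

contFTI (and any index built on it) would have to be regenerated with
the new configuration for the table side to pick this up.

If the term has to stay searchable, the other idea from the original
mail, capping the rows fed to ts_rank_cd, can be written as a subquery,
roughly like this (the LIMIT of 1000 is an arbitrary cutoff):

SELECT sub.id,
       ts_rank_cd('{1.0, 1.0, 1.0, 1.0}', sub.contFTI, q) AS rank
FROM (SELECT id, contFTI
        FROM page
       WHERE contFTI @@ to_tsquery('steve')
       LIMIT 1000) AS sub,
     to_tsquery('steve') AS q
ORDER BY rank DESC;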
