I've got to set up a program that automatically runs searches for up to
10,000 members (batched once a week at night) and then emails the results
of those searches to them. The search query is rather complex, requiring
up to 5 subqueries, close to 10 tables, many joins, and some full-text
searches. I've done the indexing, have changed the run-time variables of
postmaster (i.e. sort_mem and the number of shared buffers), and have
vacuumed the database regularly. Yet it's still taking a couple of hours
on a PIII 850 server with 512MB RAM.
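For context, a minimal sketch of the kind of checks this tuning involves; the table and column names here (members, profile, keywords) are placeholders, not the actual schema:

```sql
-- Refresh planner statistics after the weekly batch changes the data,
-- then inspect the plan for one piece of the big search.
VACUUM ANALYZE members;

EXPLAIN
SELECT m.id, m.email
FROM members m
JOIN profile p ON p.member_id = m.id
WHERE p.keywords LIKE '%postgres%';  -- a substring match like this cannot
                                     -- use an ordinary btree index, which
                                     -- is one reason such queries stay slow
```

If EXPLAIN shows sequential scans on the big tables despite the indexes, that points at the query shape rather than the server settings.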
I've read that writing the search as a stored procedure improves
performance. If so, how significant is the gain, and what affects it?
Also, I think full-text indexing would help, but it isn't supported
natively in Postgres, right?
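To make the stored-procedure question concrete, here is a hedged sketch of what I understand that approach to look like: the per-member search wrapped in a PL/pgSQL function, so the query text lives in one place and each call from the batch job is a single round trip. The names (member_hit_count, search_hits) are placeholders, not a real schema:

```sql
-- Hypothetical sketch: one member's search reduced to a function call.
-- PL/pgSQL must already be installed in the database (CREATE LANGUAGE).
CREATE FUNCTION member_hit_count(integer) RETURNS integer AS '
DECLARE
    n integer;
BEGIN
    -- placeholder for the real 5-subquery, 10-table search
    SELECT count(*) INTO n
    FROM search_hits h
    WHERE h.member_id = $1;
    RETURN n;
END;
' LANGUAGE 'plpgsql';

-- The nightly batch job would then issue, per member:
-- SELECT member_hit_count(42);
```

Whether this actually helps presumably depends on how much of the two hours is parsing/planning overhead versus the scans themselves.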
Thanks in advance
pgsql-novice by date