Re: eWeek Poll: Which database is most critical to your

From: "Dann Corbit" <DCorbit(at)connx(dot)com>
To: "Henshall, Stuart - WCP" <SHenshall(at)westcountrypublications(dot)co(dot)uk>, "Neil Conway" <nconway(at)klamath(dot)dyndns(dot)org>
Cc: <pgsql-hackers(at)postgresql(dot)org>
Subject: Re: eWeek Poll: Which database is most critical to your
Date: 2002-02-27 19:15:42
Message-ID: D90A5A6C612A39408103E6ECDD77B82906F3F5@voyager.corporate.connx.com
Lists: pgsql-hackers

[snip]
If I understand correctly, you'll be matching on the original query
string and then pulling out the previous plan, rather than doing all
the planning again? Or were you thinking of storing the resultant
tuples (which seems far more difficult to do efficiently)?
Either way would be handy for me, as I have a number of clients who
all basically ask the same query and then ask it again every few
minutes to update themselves. Therefore this sounds like something
that would improve performance for me.
Hope I've understood correctly,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
If they expect the information to change, then caching the query data
is not going to help. Unless you have a read-only database, it is my
opinion that caching the actual query data is a very bad idea. It
will complicate the code and, in the end, not be any faster.
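
To illustrate the complication, here is a minimal sketch in C; it is
not PostgreSQL code, and every name in it (ResultCacheEntry,
result_cache_invalidate, and so on) is hypothetical. Once results are
cached, every write path that touches a referenced table has to
invalidate the stale entries; for brevity each entry here tracks only
a single table.

/* Hypothetical sketch: a result cache that must be invalidated from
 * every write path, which is where the extra complexity comes from. */
#include <string.h>

#define MAX_CACHED_RESULTS 32

typedef struct ResultCacheEntry
{
    const char *query_text;     /* key: the original query string        */
    const char *table_name;     /* table the cached rows came from       */
    void       *rows;           /* cached result set (opaque here)       */
    int         valid;          /* cleared as soon as the table changes  */
} ResultCacheEntry;

static ResultCacheEntry results[MAX_CACHED_RESULTS];

/* Must be called from every INSERT/UPDATE/DELETE path that touches
 * 'table_name'; hooking this into every write path is the burden. */
void
result_cache_invalidate(const char *table_name)
{
    int i;

    for (i = 0; i < MAX_CACHED_RESULTS; i++)
    {
        if (results[i].valid &&
            strcmp(results[i].table_name, table_name) == 0)
            results[i].valid = 0;   /* stale: discard on next lookup */
    }
}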

Two other aspects of the approach (gathering statistical information
on the frequency and usage of queries and storing the prepared query)
are both excellent ideas and ought to be pursued. Maintaining an LRU
cache of prepared queries is also a good idea.
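
To make that idea concrete, here is a minimal sketch in C of an LRU
cache of prepared plans keyed by the original query text. It is not
PostgreSQL code; the names (PlanCacheEntry, SavedPlan,
plan_cache_lookup, plan_cache_store) and the fixed-size, scan-based
eviction are illustrative assumptions only.

/* Hypothetical sketch of an LRU cache of prepared plans keyed by the
 * query string.  None of these symbols exist in PostgreSQL. */
#include <stdlib.h>
#include <string.h>

#define PLAN_CACHE_SIZE 64

typedef struct SavedPlan SavedPlan;     /* opaque: the prepared plan      */

typedef struct PlanCacheEntry
{
    char       *query_text;             /* key: the original query string */
    SavedPlan  *plan;                   /* value: the prepared plan       */
    unsigned long last_used;            /* stamp for LRU eviction         */
} PlanCacheEntry;

static PlanCacheEntry cache[PLAN_CACHE_SIZE];
static unsigned long clock_stamp = 0;

/* Look up a query string; return its cached plan or NULL on a miss. */
SavedPlan *
plan_cache_lookup(const char *query_text)
{
    int i;

    for (i = 0; i < PLAN_CACHE_SIZE; i++)
    {
        if (cache[i].query_text != NULL &&
            strcmp(cache[i].query_text, query_text) == 0)
        {
            cache[i].last_used = ++clock_stamp;
            return cache[i].plan;
        }
    }
    return NULL;
}

/* Store a freshly prepared plan, evicting the least recently used slot. */
void
plan_cache_store(const char *query_text, SavedPlan *plan)
{
    int i, victim = 0;

    for (i = 0; i < PLAN_CACHE_SIZE; i++)
    {
        if (cache[i].query_text == NULL)
        {
            victim = i;
            break;
        }
        if (cache[i].last_used < cache[victim].last_used)
            victim = i;
    }
    free(cache[victim].query_text);
    /* A real implementation would also release cache[victim].plan here. */
    cache[victim].query_text = strdup(query_text);
    cache[victim].plan = plan;
    cache[victim].last_used = ++clock_stamp;
}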

If you have a read-only database or table, then you can make a lot of
safe, simplifying assumptions about queries against it. I think that
you can cache the data for a read-only table or database.

But for those instances, will it really be worth it? In other words,
if I have some read-only tables and they really are frequently
accessed by user queries, the data will be in memory anyway. How much
will you actually gain by saving the data somewhere?

Benchmarks are great, but I think it makes a lot more sense to focus
on real applications. If we want to improve web server performance
(here is a tangible, real goal), then why not try it with a real,
useful web server application? If we want to perform well in
benchmarks, use real benchmarks like the TPC-X benchmarks. "The
Benchmark Factory" has some useful stuff along these lines.

For sure, I can write a benchmark that makes any tool look good. But
does it indicate that the tool is actually useful for getting real
work done?

Instead of trying to win benchmarks, we should imagine how we can
make improvements that will solve real scientific and business
problems.

IMO-YMMV
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
