I'm interested in the response time rather than in the quality of the
response, that is, whether or not it is the best plan. My goal in measuring
response time is to compare the GEQO algorithm with other query optimization
approaches, specifically when 12 or more tables are involved. In that range,
the dynamic programming algorithm becomes impractical because of the
computational cost of finding the optimal plan.
My question is: what tool or method was used to test the GEQO algorithm, and
is there any documentation that explains how GEQO was tested?
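As far as I know, PostgreSQL does not ship a dedicated GEQO benchmark tool, so one common approach is simply to time the planner from a psql session, toggling the geqo and geqo_threshold settings. The sketch below assumes hypothetical tables t1 through t12 with shared join columns; on pre-9.4 servers EXPLAIN does not report planning time, so \timing on the EXPLAIN round trip serves as a rough proxy:

    -- Sketch only: t1..t12 are placeholder tables, not part of any
    -- standard benchmark. \timing reports client-side elapsed time,
    -- which for a bare EXPLAIN is dominated by planning.
    \timing on

    -- Force the exhaustive dynamic programming planner:
    SET geqo = off;
    EXPLAIN SELECT * FROM t1 NATURAL JOIN t2 NATURAL JOIN t3
        NATURAL JOIN t4 NATURAL JOIN t5 NATURAL JOIN t6
        NATURAL JOIN t7 NATURAL JOIN t8 NATURAL JOIN t9
        NATURAL JOIN t10 NATURAL JOIN t11 NATURAL JOIN t12;

    -- Now let GEQO plan the same query, even for small joins:
    SET geqo = on;
    SET geqo_threshold = 2;
    EXPLAIN SELECT * FROM t1 NATURAL JOIN t2 NATURAL JOIN t3
        NATURAL JOIN t4 NATURAL JOIN t5 NATURAL JOIN t6
        NATURAL JOIN t7 NATURAL JOIN t8 NATURAL JOIN t9
        NATURAL JOIN t10 NATURAL JOIN t11 NATURAL JOIN t12;

Comparing the two \timing figures (and the two plans' estimated costs) gives both the planning-time difference and a sense of the plan quality GEQO sacrifices.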
2008/5/28 Simon Riggs <simon(at)2ndquadrant(dot)com>:
> On Wed, 2008-05-28 at 13:13 -0300, Tarcizio Bini wrote:
> > Of course, geqo_threshold can be changed so that GEQO is used for
> > queries with fewer than 12 tables. However, we aim to test the GEQO
> > algorithm under conditions where the standard algorithm (dynamic
> > programming) is expensive for computing the query plan.
> My understanding is the GEQO cannot arrive at a better plan than the
> standard optimizer, so unless you wish to measure planning time there
> isn't much to test. What is the quality of the plans it generates? Well
> that varies according to the input; sometimes it gets the right plan,
> other times it doesn't get close.
> Simon Riggs www.2ndQuadrant.com
> PostgreSQL Training, Services and Support