
Re: Planner creating ineffective plans on LEFT OUTER joins

From: Chris Browne <cbbrowne(at)acm(dot)org>
To: pgsql-hackers(at)postgresql(dot)org
Subject: Re: Planner creating ineffective plans on LEFT OUTER joins
Date: 2008-06-26 15:31:34
Message-ID:
Lists: pgsql-hackers
simon(at)2ndquadrant(dot)com (Simon Riggs) writes:
> On Wed, 2008-06-25 at 23:34 -0400, Robert Haas wrote:
>> I can predict that Tom will say that the planning time it would take
>> to avoid this problem isn't justified by the number of queries that it
>> would improve.  
>> That's possible, but it's unfortunate that there's no
>> way to fiddle with the knobs and get the planner to do this kind of
>> thing when you want it to.
> I don't think we should invent a new parameter for each new
> optimisation. We would soon get swamped.
> IMHO we should have a single parameter which indicates how much planning
> time we consider acceptable for this query. e.g.
>  optimization_level = 2 (default), varies 1-3
> Most automatic optimisation systems allow this kind of setting, whether
> it be a DBMS, or compilers (e.g. gcc). 
> We should agree a simple framework so that each new category of
> optimization can be described as being a level X optimisation, or
> discarded as being never worth the time. We do this with error messages,
> so why not do this with something to control planning time?
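Simon's single-knob idea can be illustrated with a small sketch. This is not PostgreSQL code, and the optimization names and their level assignments are invented for illustration; the point is only that one `optimization_level` setting gates which categories of planning work the planner will attempt.

```python
# Hypothetical sketch (not PostgreSQL code): gating each planner
# optimization behind a single optimization_level setting, 1-3.

OPTIMIZATIONS = [
    # (name, minimum level at which it is considered) -- invented examples
    ("basic_join_reordering", 1),
    ("outer_join_strength_reduction", 2),
    ("exhaustive_join_search", 3),
]

def enabled_optimizations(optimization_level):
    """Return the optimizations the planner would try at this level."""
    return [name for name, min_level in OPTIMIZATIONS
            if optimization_level >= min_level]
```

At the suggested default of 2, the cheap and moderate optimizations run, while the expensive exhaustive search is skipped.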

Is there something "more parametric" that we could use to characterize
these optimizations?

That is, could we attach some value that *does* have some numeric
relevance to planning cost?

I don't quite have a "for instance," but here are some thoughts on
modelling this...

 - If there is some query optimization option/node that clearly adds
   to planning cost in a linear (or less) fashion, then it would be
   meaningful to mark it as "linear", and we could be fairly confident
   in accepting any linear options.

 - There would also be options/nodes that have a multiplicative effect
   on planning time.

 - Thirdly, there are options/nodes (particularly when considering
   cases of multiple joins) where there is a polynomial/exponential
   effect on query planning.
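The three growth classes above can be made concrete with a toy model. The functions of the number of relations used here are assumptions chosen purely to illustrate the shape of each class, not measurements of the actual planner.

```python
# Hypothetical model of the three growth classes described above:
# how one optimization's planning cost might scale with the number
# of relations being joined.

def planning_cost(n_relations, growth):
    """Rough cost units for one optimization, by growth class."""
    if growth == "linear":
        return n_relations
    if growth == "multiplicative":
        return n_relations * n_relations   # e.g. pairwise interactions
    if growth == "exponential":
        return 2 ** n_relations            # e.g. join-order search space
    raise ValueError("unknown growth class: %s" % growth)
```

Even at ten relations, the gap between the classes is stark: 10 units versus 100 versus over a thousand, which is why the exponential options are the ones worth gating.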

I could see:

  a) Evaluating which roads to consider from a
     "linear/multiplicative/exponential" perspective, which would look
     a lot like "level 1, level 2, level 3".

  b) Estimating these values and, in effect, trying to model
     the amount of planning effort, dropping out sets of routes
     that are expected to make the effort exceed [some value].
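Option (b) amounts to budgeted pruning. A minimal sketch, assuming each candidate set of routes comes with an estimated planning effort (the candidate names and effort units are invented): consider the cheapest sets first and drop any that would push cumulative effort past the budget.

```python
# Hypothetical sketch of option (b): estimate the planning effort each
# set of routes would add, and drop those that would exceed the budget.

def select_routes(candidates, budget):
    """candidates: list of (route_set_name, estimated_effort).
    Consider cheapest sets first; keep a set only while the cumulative
    estimated effort stays within the planning budget."""
    kept, spent = [], 0
    for name, effort in sorted(candidates, key=lambda c: c[1]):
        if spent + effort > budget:
            continue  # this route set would blow the planning budget
        kept.append(name)
        spent += effort
    return kept
```

The appeal over a fixed level is that cheap queries automatically get the full treatment, while only queries whose search space is genuinely large have routes pruned.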

Sane?  Silly?

