Re: Rapidly finding maximal rows

From: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>
To: James Cranch <jdc41(at)cam(dot)ac(dot)uk>
Cc: pgsql-performance(at)postgresql(dot)org
Subject: Re: Rapidly finding maximal rows
Date: 2011-10-14 04:25:53
Message-ID: 13279.1318566353@sss.pgh.pa.us
Lists: pgsql-performance
James Cranch <jdc41(at)cam(dot)ac(dot)uk> writes:
> I have a slow query. I expect that there is either a more intelligent way 
> to write it, or that one could make some indexes that would speed it up. 
> I've tried various indexes, and am not getting anywhere.
> I'd be grateful for any suggestions. Reasonably full details are below.

Two bits of advice:

1. Avoid unnecessary use of WITH.  It acts as an optimization fence,
which you don't want here.  In particular, the only way to avoid
sorting/aggregating over the whole table is for the outer query's WHERE
conditions on competition_name and academic_year_beginning to get pushed
down into the scans on challenge_entries ... and that can't happen if
there's a WITH in between.
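A sketch of the restructuring being suggested. The original query isn't quoted in this message, so everything beyond the names mentioned in the thread (challenge_entries, competition_name, academic_year_beginning) is illustrative:

```sql
-- Before (schematic): the WITH is an optimization fence, so the
-- aggregate runs over the whole table before the WHERE applies.
--
--   WITH totals AS (
--       SELECT competition_name, academic_year_beginning, ...
--       FROM challenge_entries
--       GROUP BY competition_name, academic_year_beginning
--   )
--   SELECT * FROM totals
--   WHERE competition_name = 'Junior Challenge'
--     AND academic_year_beginning = 2010;

-- After (schematic): with a plain subquery (or the filter moved
-- inside), the planner can push the conditions down to the scan:
SELECT *
FROM (
    SELECT competition_name, academic_year_beginning, ...
    FROM challenge_entries
    WHERE competition_name = 'Junior Challenge'       -- now filters
      AND academic_year_beginning = 2010              -- before aggregating
    GROUP BY competition_name, academic_year_beginning
) AS totals;
```

(Note: in PostgreSQL 12 and later, non-recursive single-reference CTEs can be inlined automatically, but at the time of this thread a WITH was always a fence.)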

2. Try increasing work_mem.  I think that your first view would work all
right if it had enough work_mem to go for a HashAgg plan instead of
sort-and-group, even without pushdown of the outer WHERE.  It'd
definitely be faster than what you've got, anyway.
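A minimal way to test this advice, assuming the same illustrative query shape as above. SET changes work_mem for the session only, so it's safe to experiment:

```sql
-- Raise the per-sort/per-hash memory budget for this session only.
-- The value to try depends on your data size and available RAM.
SET work_mem = '64MB';

-- Re-run the plan and check whether the aggregation step now shows
-- "HashAggregate" instead of "Sort" followed by "GroupAggregate".
EXPLAIN ANALYZE
SELECT competition_name, academic_year_beginning, count(*)
FROM challenge_entries
GROUP BY competition_name, academic_year_beginning;
```

If the plan flips to HashAggregate at some work_mem setting, that setting can be applied per-session, per-role, or (cautiously, since it is allocated per sort/hash node per backend) in postgresql.conf.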

The other approach with a window function is probably a lost cause.
Postgres hasn't got a lot of intelligence about optimizing
window-function queries ...

			regards, tom lane

