Re: Very long deletion time on a 200 GB database

From: "Reuven M(dot) Lerner" <reuven(at)lerner(dot)co(dot)il>
To: Andrew Dunstan <andrew(at)dunslane(dot)net>
Cc: Marcin Mańk <marcin(dot)mank(at)gmail(dot)com>, pgsql-performance(at)postgresql(dot)org
Subject: Re: Very long deletion time on a 200 GB database
Date: 2012-02-23 15:25:46
Message-ID: 4F465A7A.4000905@lerner.co.il
Lists: pgsql-performance

Hi, everyone. Thanks for all of the help and suggestions so far; I'll
try to respond to some of them soon. Andrew wrote:

>> How about:
>>
>> DELETE FROM B
>> WHERE r_id IN (SELECT DISTINCT R.id
>> FROM R WHERE R.end_date < (NOW() - (interval '1 day' * 30)))
>>
>> ?
>>
>
> Or possibly without the DISTINCT. But I agree that the original query
> shouldn't have B in the subquery - that alone could well make it crawl.

I put B in the subquery so as to reduce the number of rows that would be
returned, but maybe that was indeed backfiring on me. Now that I think
about it, B is a huge table, and R is a less-huge one, so including B in
the subselect was probably a mistake.
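
For what it's worth, here's roughly what I expect to try instead, based
on Andrew's suggestion -- just a sketch, untested so far, and assuming
the table and column names I've been using in this thread:

-- The subquery now touches only R; B appears just once, as the delete target.
DELETE FROM B
WHERE r_id IN (SELECT R.id
               FROM R
               WHERE R.end_date < NOW() - interval '30 days');

An EXISTS or DELETE ... USING formulation would presumably work just as
well; the main point is keeping B out of the subquery.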

>
> What is the distribution of end_dates? It might be worth running this in
> several steps, deleting records older than, say, 90 days, 60 days, 30 days.

I've suggested something similar, but was told that we have limited time
to execute the DELETE, and that doing it in stages might not be possible.
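
If it turns out that we do get a long enough window, I imagine the
staged approach would look something like this -- again only a sketch,
with the same assumed schema, each statement run in its own transaction:

-- First pass: rows whose R record ended more than 90 days ago.
DELETE FROM B
WHERE r_id IN (SELECT R.id FROM R
               WHERE R.end_date < NOW() - interval '90 days');

-- Second pass: everything older than 60 days; the >90-day rows are
-- already gone, so this only removes the 60-90 day slice.
DELETE FROM B
WHERE r_id IN (SELECT R.id FROM R
               WHERE R.end_date < NOW() - interval '60 days');

-- Final pass: everything older than 30 days, i.e. the remaining
-- 30-60 day slice.
DELETE FROM B
WHERE r_id IN (SELECT R.id FROM R
               WHERE R.end_date < NOW() - interval '30 days');

Each pass deletes a smaller batch than one giant DELETE would, so no
single transaction runs as long, although the total work is of course
the same.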

Reuven

--
Reuven M. Lerner -- Web development, consulting, and training
Mobile: +972-54-496-8405 * US phone: 847-230-9795
Skype/AIM: reuvenlerner
