
Re: Adding and filling new column on big table

From: Francisco Reyes <lists(at)stringsutils(dot)com>
To: Jonathan Blitz <jb(at)anykey(dot)co(dot)il>
Cc: pgsql-performance(at)postgresql(dot)org
Subject: Re: Adding and filling new column on big table
Date: 2006-05-30 15:58:38
Lists: pgsql-performance
Jonathan Blitz writes:

> I just gave up in the end and left it with NULL as the default value.

Could you do the updates in batches instead of trying to do them all at once?
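The batching idea above could look something like this (a rough sketch only — the table name "big_table", the key column "id", and the range size are hypothetical, and each statement would run in its own transaction):

```sql
-- Fill the new column one key range at a time, so each transaction
-- touches a bounded number of rows:
UPDATE big_table SET new_col = 0 WHERE id BETWEEN 1 AND 100000;
-- ...then the next range, and so on:
UPDATE big_table SET new_col = 0 WHERE id BETWEEN 100001 AND 200000;
-- Optionally vacuum between batches to reclaim dead row versions:
VACUUM big_table;
```

Smaller batches keep each transaction short and let vacuum reclaim space as you go, instead of accumulating millions of dead tuples in one pass.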

Have you ever done a vacuum full on this table?

> There were, in fact, over 2 million rows in the table rather than 1/4 of a
> million so that was part of the problem.

What hardware?
I have a dual-CPU Opteron with 4GB of RAM and 8 disks in RAID 10 (SATA). 
Doing an update on a 5 million record table took quite a while, but it did 
finish. :-)

I did a vacuum full before and after, though. That many updates tends to 
slow down operations on the table afterwards unless you vacuum the table. 
Based on what you wrote, it sounded as if you tried a few times and may have 
killed the process. That would certainly slow down operations on the 
table unless you did a vacuum full.

I wonder if running vacuum analyze against the table as the updates are 
running would be of any help.
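For reference, the cleanup steps discussed here are just the standard commands (table name hypothetical; note that VACUUM FULL takes an exclusive lock, so it cannot run concurrently with the updates — only plain VACUUM or ANALYZE can):

```sql
-- Reclaim space left by dead row versions after the mass update:
VACUUM FULL big_table;
-- Refresh planner statistics so subsequent queries plan well:
ANALYZE big_table;
```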


