Is Postgres supposed to be able to handle concurrent requests while
doing large updates?
This morning I was executing the following simple update statement,
which affects about 220,000 rows in my product table:
update product set is_hungry = 'true' where date_modified >
current_date - 10;
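One variant I've been considering is breaking the update into smaller
batches so each transaction touches fewer rows. This is just a sketch;
it assumes the table has a primary key column (I'm calling it
product_id here) and that is_hungry is a text/boolean-like column:

```sql
-- Repeat until it reports "UPDATE 0"; each pass touches at most
-- 1000 not-yet-updated rows, keeping individual transactions short.
UPDATE product
   SET is_hungry = 'true'
 WHERE product_id IN (
         SELECT product_id
           FROM product
          WHERE date_modified > current_date - 10
            AND is_hungry IS DISTINCT FROM 'true'  -- skip rows already done
          LIMIT 1000
       );
```

The IS DISTINCT FROM check makes each pass skip rows updated by earlier
passes (including NULLs), so re-running the statement is safe.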
But the application that accesses the product table for reading
became very unresponsive while the update was happening.
Is it just a matter of slow I/O? CPU usage seemed very low (less
than 5%), and iostat showed less than 1 MB/sec of throughput.
I was doing the update in psql.
Are there any settings that I could tweak that would help with this
sort of thing?
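Since plain SELECTs shouldn't be blocked by an UPDATE under Postgres's
MVCC, one thing I could do next time is check from another session
whether anything is actually waiting on a lock while the update runs:

```sql
-- Ungranted lock requests, i.e. sessions currently waiting.
-- If this is empty while the app is slow, the problem is more
-- likely I/O contention than lock blocking.
SELECT locktype, relation::regclass AS relation, mode, granted
  FROM pg_locks
 WHERE NOT granted;
```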
Brendan Duddridge | CTO | 403-277-5591 x24 | brendan(at)clickspace(dot)com
ClickSpace Interactive Inc.
Suite L100, 239 - 10th Ave. SE
Calgary, AB T2G 0V9
pgsql-performance by date