

From: Tom Allison <tallison(at)tacocat(dot)net>
To: pgsql-novice(at)postgresql(dot)org
Subject: performance
Date: 2006-02-16 01:09:54
Message-ID:
Lists: pgsql-novice
Probably not a rare topic...

I made a little application last night to run a test:

Given a table with a few fields (3):
Insert a minimum of 10 million rows; not a problem.
Randomly update rows, one at a time, as quickly as a Perl script will allow, for 
as many changes as I can manage, up to 1 billion transactions.
Each update is committed immediately.
Disk is a single EIDE 7400 RPM hard drive with EXT3 journaling.
I have 1GB RAM on some kind of 32-bit AMD CPU (2.? GHz)
postgresql version 7.4 (can't upgrade yet...)
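For reference, the test loop described above can be sketched roughly as follows. This is a minimal illustration in Python, using the stdlib sqlite3 module as a stand-in for a PostgreSQL 7.4 client; the table name, column names, and row/update counts (scaled way down) are all made up for the example:

```python
import random
import sqlite3

# In-memory stand-in; the real test ran against PostgreSQL 7.4 on disk.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, a INTEGER, b TEXT)")

N_ROWS = 10_000      # scaled down from 10 million for illustration
N_UPDATES = 50_000   # scaled down from the billion-transaction target

# Bulk insert the initial rows, committed once.
conn.executemany(
    "INSERT INTO t (id, a, b) VALUES (?, ?, ?)",
    ((i, 0, "x") for i in range(N_ROWS)),
)
conn.commit()

# Randomly update individual rows, committing each update immediately,
# which is the pattern that hammered the I/O in the original test.
for _ in range(N_UPDATES):
    row_id = random.randrange(N_ROWS)
    conn.execute("UPDATE t SET a = a + 1 WHERE id = ?", (row_id,))
    conn.commit()
```

On old PostgreSQL versions like 7.4, this one-commit-per-update pattern also leaves behind a dead tuple for every UPDATE, so the table bloats quickly unless it is vacuumed regularly.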

I got through about 1.25 million transactions before the I/O was clobbered.
So badly, in fact, that I couldn't log in to the machine to kill the client job.

I have no intention of turning this machine into a hard-core server, yet.
But can someone identify what might be considered the top 5 things to do in 
order to keep the system from falling over?
I'm willing to take some performance hits to allow the system to keep running. 
But considering how many parameters there are to fiddle with, I'm reluctant to 
just start twirling knobs without knowing what I'm doing.

