
Re: pgsql-admin-digest V1 #452

From: Chris Albertson <calbertson(at)logicon(dot)com>
To: pgsql-admin(at)hub(dot)org
Cc: Chris Czeyka <chris(dot)czeyka(at)rommel(dot)stw(dot)uni-erlangen(dot)de>
Subject: Re: pgsql-admin-digest V1 #452
Date: 2000-02-16 19:02:13
Message-ID:
Lists: pgsql-admin
pgsql-admin-digest wrote:
> pgsql-admin-digest     Tuesday, February 15 2000     Volume 01 : Number 452
> Index:
> system requirements
> ----------------------------------------------------------------------
> Date: Tue, 15 Feb 2000 11:38:05 +0100
> From: Chris Czeyka <chris(dot)czeyka(at)rommel(dot)stw(dot)uni-erlangen(dot)de>
> Subject: system requirements
> Hello to all Postgres brains,
> For my project I want to use an older PC from my institute for a dictionary
> database. The database will contain several relations (around 10) with about
> 30,000 entries each. Additionally, those tables will be linked by n:m relations
> (about 20 n:m relations).
> As I have just started with databases, I wanted to ask all experienced
> postgres-admins whether a 133 MHz Pentium with 128MB could bear this burden in
> the beginning, or whether it will go down in smoke. At most, 5 users will be
> in the database simultaneously.
> If you can tell me about your systems and your experiences, I can judge the
> situation better.

I think you will do OK.  I find that RAM helps more than a fast CPU:
if you don't have enough RAM you become I/O bound, and then a fast CPU
is of no help.  With 128MB I'd say you will be doing OK.  What really
helps is when much of the database gets cached in RAM.  At 30,000
records per table and (let's say) 1KB per record, each table is only
about 30MB, so most of the working set should fit in RAM.

I made the mistake of testing my application with small tables of a
few tens of thousands of records.  Everything was fast.  My "real"
data set is on the order of 1E8 records.  My design did not scale and
I had to rewrite a lot of code.  My advice, if you are worried about
performance, is to build a test database filled with 30K random "junk"
records and run a few queries by hand from the psql monitor.  But with
30K tuples, 128MB RAM, and the proper indexes, you will do OK.
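As a concrete sketch of that test, junk rows can be generated entirely in SQL. This assumes a modern PostgreSQL (generate_series and md5() were not available in 2000-era releases), and the table and column names are made up for illustration:

```sql
-- Hypothetical dictionary table; names are illustrative only.
CREATE TABLE entries (
    id      integer PRIMARY KEY,
    word    text,
    meaning text
);

-- Fill it with 30,000 rows of random "junk" text.
INSERT INTO entries (id, word, meaning)
SELECT n,
       'word_' || n,
       md5(random()::text)   -- 32 characters of pseudo-random filler
FROM generate_series(1, 30000) AS n;

-- The "proper index": index the column you will search on.
CREATE INDEX entries_word_idx ON entries (word);
```

From the psql monitor, `\timing` then a few `SELECT ... WHERE word = '...'` queries by hand will show whether lookups stay fast at realistic table sizes.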

The "top" program is useful when tuning Postgres.  You can see how
much RAM is being used for what purpose, and the CPU utilization.  Your
goal is to adjust things so that CPU utilization is close to 100%.
Anything less means you are waiting on your disk.
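One way to check from psql whether a query will hit the disk with a full table scan (rather than using an index) is EXPLAIN; the table and column names here are hypothetical:

```sql
-- An "Index Scan" in the plan means cheap lookups; a "Seq Scan" over a
-- large table is what leaves the CPU idle waiting on the disk.
EXPLAIN SELECT * FROM entries WHERE word = 'word_12345';

-- Modern PostgreSQL can also report actual run times:
EXPLAIN ANALYZE SELECT * FROM entries WHERE word = 'word_12345';
```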
> End of pgsql-admin-digest V1 #452
> *********************************************

  Chris Albertson

  calbertson(at)logicon(dot)com                  Voice: 626-351-0089  X127
  Logicon, Pasadena California            Fax:   626-351-0699
