Re: Setting up of a large database

From: "Scott Marlowe" <scott(dot)marlowe(at)gmail(dot)com>
To: "Roberto Edwins" <robertoedwins(at)gmail(dot)com>
Cc: pgsql-admin(at)postgresql(dot)org
Date: 2008-05-16 20:18:02
Lists: pgsql-admin
On Fri, May 16, 2008 at 11:50 AM, Roberto Edwins
<robertoedwins(at)gmail(dot)com> wrote:
> What are the values that should go in postgresql.conf for a large
> database, a couple of whose tables will contain over 100 million
> records?

Well, that really kind of depends.  Is this a database handling lots
of users with small, bank-style transactions?  Or is it designed to
hold text content for millions of users, but with only a handful
accessing it at once?  Or maybe it's a reporting database with lots of
statistical data that will be accessed by one or two people at a time,
but with huge, ugly queries.

It really all kind of depends.

A good rule of thumb is to keep max_connections no higher than you
actually need.  Set shared_buffers to around 25% of total memory to
start, and set work_mem to about 8MB or so.  Note that it's easy to
use up a lot of memory fast with higher work_mem settings, because
each sort / hash aggregate etc. can use up to work_mem worth of
memory.  100 users running 2 sorts each would equal 8MB*200, or
1600MB, with a work_mem setting of 8MB.
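The worst-case arithmetic above can be sketched in a few lines.  Note
the 100 users and 2 sorts per query are illustrative figures from the
example, not measurements from any real workload:

```python
# Rough worst-case memory estimate for work_mem, per the reasoning above.
# The client and sort counts are illustrative assumptions, not measurements.
def worst_case_work_mem_mb(clients: int, sorts_per_query: int, work_mem_mb: int) -> int:
    """Each sort or hash aggregate can use up to work_mem, so the worst
    case is clients * concurrent sorts * work_mem."""
    return clients * sorts_per_query * work_mem_mb

print(worst_case_work_mem_mb(100, 2, 8))  # 100 users * 2 sorts * 8MB = 1600
```

This is why raising work_mem globally is risky: the per-operation cost
multiplies across every concurrent connection.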

Most importantly, make small incremental changes and benchmark with a
realistic load; it's the only way to be sure.
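One way to generate a repeatable load for that kind of benchmarking is
pgbench, which ships with PostgreSQL.  This is only a sketch against a
hypothetical database named mydb; the scale factor and client count
are placeholders to adjust toward your own workload:

```shell
# Initialize a pgbench test database at scale factor 100.
pgbench -i -s 100 mydb

# Run 10 concurrent clients, 1000 transactions each, and report tps.
pgbench -c 10 -t 1000 mydb
```

Change one setting in postgresql.conf, re-run the same benchmark, and
compare the transactions-per-second numbers before moving on.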

