Tuning Postgres for single user manipulating large amounts of data

From: Paul Taylor <paul_t100(at)fastmail(dot)fm>
To: pgsql-general(at)postgresql(dot)org
Subject: Tuning Postgres for single user manipulating large amounts of data
Date: 2010-12-09 11:13:58
Message-ID: 4D00B9F6.2060409@fastmail.fm
Lists: pgsql-general

Hi, I'm using Postgres 8.3 on a MacBook Pro laptop.
I'm using the database with just one DB connection to build a Lucene
search index from some of the data, and I'm trying to improve
performance. The key thing is that I'm only a single user but
manipulating large amounts of data, i.e. processing tables with up to 10
million rows in them, so I think I want to configure Postgres so that it
can create large temporary tables in memory.
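For reference, the settings involved all live in postgresql.conf. A sketch of the kind of single-user configuration being described (the values are illustrative assumptions for a 2GB machine, not settled recommendations from this thread) might be:

```
# postgresql.conf -- illustrative single-user sketch for a 2GB machine
shared_buffers = 512MB        # capped in practice by the kernel's SHMMAX limit
work_mem = 256MB              # per sort/hash operation; safer to raise with one connection
maintenance_work_mem = 256MB  # speeds up CREATE INDEX and VACUUM
temp_buffers = 256MB          # per-session memory for temporary tables
checkpoint_segments = 32      # fewer, larger checkpoints during bulk work
```

With only one connection, per-operation settings like work_mem can be set far higher than the multi-user defaults assume, since only one query at a time can consume that memory.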

I've tried changing various parameters such as shared_buffers, work_mem and
checkpoint_segments, but I don't really understand what the values mean,
and the documentation seems to be aimed towards configuring for multiple
users, and my changes made things worse. For example, my machine has 2GB
of memory and I read that if using it as a dedicated server you should set
shared memory to 40% of total memory, but when I increase it to more than
30MB Postgres will not start, complaining about my SHMMAX limit.
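The SHMMAX limit itself is a kernel setting, not a Postgres one. On Mac OS X of that era it can be raised via sysctl; a sketch (assuming roughly 1GB of shared memory is wanted; the exact values are illustrative) might be:

```
# /etc/sysctl.conf on Mac OS X -- applied at boot, or immediately
# with `sudo sysctl -w kern.sysv.shmmax=...` etc.
kern.sysv.shmmax=1073741824   # max segment size in bytes (1GB); must be a multiple of 4096
kern.sysv.shmall=262144       # total shared memory in 4kB pages (shmmax / 4096)
```

Once SHMMAX exceeds the requested shared_buffers (plus a little overhead), Postgres should start with the larger setting.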

Paul
