Tuning Postgres for single user manipulating large amounts of data

From: Paul Taylor <ijabz(at)fastmail(dot)fm>
To: pgsql-general(at)postgresql(dot)org
Subject: Tuning Postgres for single user manipulating large amounts of data
Date: 2010-12-09 12:25:56
Message-ID: 4D00CAD4.6090307@fastmail.fm
Lists: pgsql-general

Hi, I'm using Postgres 8.3 on a MacBook Pro laptop.
I'm using the database with just one connection to build a Lucene
search index from some of the data, and I'm trying to improve
performance. The key thing is that I'm only a single user but
manipulating large amounts of data, i.e. processing tables with up to 10
million rows in them, so I think I want to configure Postgres so that it
can create large temporary tables in memory.
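
For example, I'm wondering whether settings along these lines are the
right direction - the parameters and values below are just guesses on my
part for a single-user bulk job:

  # postgresql.conf - guessed values, not something I've verified
  temp_buffers = 256MB            # per-session memory for temporary tables
  work_mem = 128MB                # memory per sort/hash operation
  maintenance_work_mem = 512MB    # used by CREATE INDEX and similar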

I've tried changing various parameters such as shared_buffers, work_mem
and checkpoint_segments, but I don't really understand what these values
should be, and the documentation seems to be aimed at configuring for
multiple users, so my changes have made things worse. For example, my
machine has 2GB of memory, and I read that if using it as a dedicated
server you should set shared_buffers to 40% of total memory, but when I
increase it to more than 30MB Postgres will not start, complaining about
my SHMMAX limit.
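
From what I've read, getting past that means raising the kernel's shared
memory limits, something like the following on OS X (I haven't confirmed
these numbers are appropriate for a 2GB machine):

  # /etc/sysctl.conf on Mac OS X - example values only
  kern.sysv.shmmax=536870912    # max shared memory segment, 512MB
  kern.sysv.shmall=131072       # total shared memory in 4kB pages (shmmax/4096)

Is that the right approach for a single-user workload like this, or am I
better off leaving shared_buffers small and raising the per-session settings
instead?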

Paul
