Out of Memory and Configuration Problems (Big Computer)

From: Tom Wilcox <hungrytom(at)googlemail(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: Out of Memory and Configuration Problems (Big Computer)
Date: 2010-05-28 12:59:40
Message-ID: 4BFFBE3C.9020204@gmail.com
Lists: pgsql-general


Hi,

I am fairly new to Postgres and I have been using it with Python to
develop a set of functions that operate on some moderately large tables
(30 million rows with 13-15 columns).

I have spent some time messing with the configuration file to get PG to
use indexes when planning queries, etc.

At the moment, I have a fairly fundamental, recurring problem: a query
has run out of memory. This particular query is a sequential-scan
UPDATE operating on a table with approximately 27,778,000 rows:

UPDATE tbl SET f1 = COALESCE(f2,'') || ' ' || COALESCE(f3,'');

ERROR: out of memory
DETAIL: Failed on request of size 36
Run time (est): 6,000,000ms

EXPLAIN shows the query plan as:

Seq Scan on tbl (cost=0.00..2088542.83 rows=59868855 width=128)

I have noticed that the row estimate (rows=59868855) is roughly twice
the number of rows actually in the table. Perhaps I am writing poor SQL?
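I am guessing the estimate comes from stale planner statistics rather
than the SQL itself, so one thing I intend to try is refreshing the
statistics and re-checking the plan, roughly like this (tbl is just the
placeholder name I am using throughout):

VACUUM ANALYZE tbl;   -- refresh the planner's row estimates for the table

EXPLAIN
UPDATE tbl SET f1 = COALESCE(f2,'') || ' ' || COALESCE(f3,'');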

Can anyone suggest reasons why I might be running out of memory on such
a simple query? Is it possible that, because it is executed as a
transaction, the changes are not being committed until the query is
complete and therefore the whole table is being stored in memory?
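One workaround I am considering, in case the single large transaction
is part of the problem, is splitting the UPDATE into batches that each
commit separately (driven from the Python side). A rough sketch of what
I mean, assuming the table has an integer primary key that I am calling
id here, with an arbitrary batch size:

-- first batch
UPDATE tbl SET f1 = COALESCE(f2,'') || ' ' || COALESCE(f3,'')
WHERE id >= 1 AND id < 1000001;
-- commit, then the next batch
UPDATE tbl SET f1 = COALESCE(f2,'') || ' ' || COALESCE(f3,'')
WHERE id >= 1000001 AND id < 2000001;
-- ...and so on over the whole key range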

Also, can anyone give me any pointers for configuring Postgres to use
ALL 96GB of RAM in my new machine? I would like to know that it is
using everything available, especially since it should be possible to
hold an entire 30-million-row table in memory! I am currently using the
default configuration from the standard installation.
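For context, these seem to be the main memory-related parameters in
postgresql.conf; the values below are only my guesses for a 96GB
machine, not settings I have tested:

shared_buffers = 8GB            # cache managed by Postgres itself
effective_cache_size = 64GB     # planner hint: how much the OS is likely to cache
work_mem = 64MB                 # per sort/hash operation, so it multiplies
maintenance_work_mem = 1GB      # used by VACUUM, CREATE INDEX, etc.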

Any help/suggestions are very much appreciated.

Cheers,
Tom
