Re: INSERT performance deteriorates quickly during a large import

From: Márcio Geovani Jasinski <marciogj(at)gmail(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: Re: INSERT performance deteriorates quickly during a large import
Date: 2007-11-09 11:52:14
Message-ID: fc73093f0711090352i241b42e0x9a99a84c32b0bd00@mail.gmail.com
Lists: pgsql-general

Hello Krasimir,

You got a lot of good advice above, and I would like to add one more point:

d) Make sure your PHP code is not recursive. Since you said memory usage is
stable, I assume your method is iterative.
A recursive method would certainly add a little time to each insert and use
more memory.
But even an iterative method must be written correctly so that each row is
processed just once; maybe your code is running much more often than it needs to.
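To illustrate the point, here is a minimal sketch of an iterative batch import. It is in Python with the standard-library sqlite3 module purely for illustration (not your PHP/Postgres setup); the table name `items` and its columns are made up for the example. Each row is inserted exactly once inside a single transaction, with one commit at the end instead of one per row:

```python
import sqlite3

def import_rows_iterative(conn, rows):
    """Iterative import: one loop pass per row, each row inserted exactly once."""
    cur = conn.cursor()
    for name, value in rows:
        cur.execute("INSERT INTO items (name, value) VALUES (?, ?)", (name, value))
    conn.commit()  # one commit for the whole batch, not one per row

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (name TEXT, value INTEGER)")
rows = [("a", 1), ("b", 2), ("c", 3)]
import_rows_iterative(conn, rows)
count = conn.execute("SELECT count(*) FROM items").fetchone()[0]
print(count)  # 3 -- each input row appears once
```

A recursive version of the same loop would add a stack frame per row, and it is also easy to write it so that the same rows get re-processed on each call - which is the kind of bug that makes import time grow as the run proceeds.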

Pay attention to Tomas's advice, and after that (I agree with Cris) "there
should be no reason for loading data to get more costly as
the size of the table increases" - please check your code.

I did some experiments a long time ago importing 40,000 rows with a lot of
BLOBs. I used PHP code doing SELECT/INSERT from Postgres to Postgres, and the
time wasn't constant, but it wasn't as bad as in your case. (And I didn't apply
Tomas's advice in a, b and c.)

Good Luck
--
Márcio Geovani Jasinski
