Re: Commit every processed record

From: "Bart Degryse" <Bart(dot)Degryse(at)indicator(dot)be>
To: <pgsql-sql(at)postgresql(dot)org>, "A(dot) Kretschmer" <andreas(dot)kretschmer(at)schollglas(dot)com>
Subject: Re: Commit every processed record
Date: 2008-04-07 14:22:25
Message-ID: 47FA4A40.A3DD.0030.0@indicator.be
Lists: pgsql-sql

Well, actually there is.
Do the processing in a plperlu function that uses its own connection to the database. Then every invocation of the function will run in its own transaction.
Try to open that Perl connection outside the function rather than on every call, or your performance will drop too much.
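Roughly, the idea looks like this (a minimal sketch, not my actual code: the connection string, the credentials and the processed_log table are just placeholders, and it assumes DBI/DBD::Pg are installed):

CREATE OR REPLACE FUNCTION process_and_commit(integer, text)
RETURNS void AS $func$
    use DBI;
    my ($rec_id, $payload) = @_;

    # Open the connection only once per backend and cache it in %_SHARED,
    # so we don't pay the connect cost for every single record.
    $_SHARED{dbh} ||= DBI->connect(
        'dbi:Pg:dbname=mydb;host=localhost', 'myuser', 'mypassword',
        { AutoCommit => 1, RaiseError => 1 }
    );

    # Runs over the separate DBI connection, so it commits immediately,
    # independently of the transaction of the query that called the function.
    $_SHARED{dbh}->do(
        'INSERT INTO processed_log (rec_id, payload) VALUES (?, ?)',
        undef, $rec_id, $payload
    );
    return;
$func$ LANGUAGE plperlu;

Call that from your cursor loop and every record gets committed on its own, no matter what happens to the outer transaction afterwards.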
I use this technique to fetch, process and store records from an Oracle DB (ERP) into a PostgreSQL DB (data warehousing/reporting/external data).
It gets me some 500,000 records in a little over 50 minutes.
Real Perl addicts would probably have a lot to say about my coding, but being a Perl illiterate (does one write it like that?) I must say that I'm quite happy with the result.
All I used was the Perl chapter in the PostgreSQL manual (big hurray for the manual), Perl Express for testing and debugging a lot of very
basic things like hashes (yes, illiterate !-) and EMS SQL Manager.
Good luck

>>> "A. Kretschmer" <andreas(dot)kretschmer(at)schollglas(dot)com> 2008-04-07 15:01 >>>
On Mon, 07.04.2008 at 14:46:50 +0200, luke(dot)78(at)libero(dot)it wrote the following:
> Hi,
> I have to execute a commit for every record that I process during a cursor fetch in a function.
> Is there a way to do it?

No.

Regards, Andreas
--
Andreas Kretschmer
Contact: Heynitz: 035242/47150, D1: 0160/7141639 (more: -> header)
GnuPG-ID: 0x3FFF606C, private 0x7F4584DA http://wwwkeys.de.pgp.net
