From: Jonathan Daugherty <jdaugherty(at)commandprompt(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: Bulk data insertion
Date: 2004-11-26 22:31:42
Message-ID: 41A7AECE.5020706@commandprompt.com
Lists: pgsql-general
Hello,
I have a PL/pgSQL function that I need to call with some ARRAY
parameters. These array values are very large -- typically thousands of
elements, where each element is itself a 4-element array. The function
does some sanity checking on the array data and uses the individual
elements to do inserts where appropriate.
The problem is that I don't want to spend a lot of time and memory
building such a query (in C). I would like to know if there is a way to
take this huge chunk of data and get it into the database in a less
memory-intensive way. I suppose I could use COPY to put the data into a
table with triggers that would check the data, but that seems inelegant
and I'd like to know if there's a better way.
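For reference, the COPY-based alternative I have in mind would look something like the following sketch. The staging table, trigger, and check are hypothetical placeholders; the point is just that COPY streams rows without building one giant query string, and a BEFORE INSERT trigger can reject bad rows as they arrive:

```sql
-- Hypothetical staging table: one column per element of the
-- 4-element sub-arrays.
CREATE TABLE staging (a text, b text, c text, d text);

-- Trigger function performing the per-row sanity check.
CREATE OR REPLACE FUNCTION check_staging_row() RETURNS trigger AS '
BEGIN
    IF NEW.a IS NULL THEN
        RAISE EXCEPTION ''staging row is missing its first element'';
    END IF;
    RETURN NEW;
END;
' LANGUAGE plpgsql;

CREATE TRIGGER staging_check BEFORE INSERT ON staging
    FOR EACH ROW EXECUTE PROCEDURE check_staging_row();

-- The C client would then issue:
--   COPY staging FROM STDIN;
-- and stream the rows with libpq (e.g. PQputCopyData), one row per
-- line, rather than materializing a single huge array literal.
```

This avoids the memory cost of building the query in C, at the price of an extra table and trigger, which is the inelegance I mentioned.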
Thoughts? Thanks for your time.
--
Jonathan Daugherty
Command Prompt, Inc. - http://www.commandprompt.com/
PostgreSQL Replication & Support Services, (503) 667-4564