Optimization required for multiple insertions in PostgreSQL

From: siva palanisamy <psivait(at)gmail(dot)com>
To: pgsql-performance(at)postgresql(dot)org
Cc: Siva Palanisamy <siva_p(at)hcl(dot)com>
Subject: Optimization required for multiple insertions in PostgreSQL
Date: 2011-11-03 15:52:42
Message-ID: CA+AN_OYTUtY107A6gz6WV4ULDuMP1XHs5C6vhdYvnRjCXZzDOg@mail.gmail.com
Lists: pgsql-performance

I basically have 3 tables: one core table and two others that depend on it. I need to insert up to 70,000 records into these tables, which have constraints defined (primary and foreign keys, indexes, unique constraints, etc.). I can't use bulk import (the COPY command) because there is no standard .csv file involved; the column mapping must be done explicitly, and a few validations are applied externally in a C program. Each record (up to 70,000 of them) is passed from a .pgc file (an ECPG-based C program) to PostgreSQL. The first few records insert quickly, but performance degrades badly for the later ones: it is taking days to get through even 20,000! What performance measures could I take here? Please guide me.
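For context, the insert loop has roughly the shape sketched below (table, column, and function names here are illustrative placeholders, not my actual schema; the real code also runs the C-side validations before each insert). One thing worth noting in the sketch: wrapping the whole batch in a single explicit transaction, since without it each INSERT commits individually, which is a common cause of progressive slowdown on large loads.

```c
/* sketch.pgc -- illustrative ECPG shape of the insert loop.
 * Placeholder names: core_table, insert_records. Requires the ecpg
 * preprocessor and a live database, so this is a sketch, not a
 * drop-in program. */
#include <stdio.h>

EXEC SQL INCLUDE sqlca;

int insert_records(int nrecs)
{
    EXEC SQL BEGIN DECLARE SECTION;
    int  id;
    char name[64];
    EXEC SQL END DECLARE SECTION;

    /* One transaction for the whole batch instead of one per row. */
    EXEC SQL BEGIN;

    for (id = 0; id < nrecs; id++) {
        snprintf(name, sizeof name, "rec-%d", id);
        /* C-based validations would run here before the insert */
        EXEC SQL INSERT INTO core_table (id, name) VALUES (:id, :name);
        if (sqlca.sqlcode < 0) {
            EXEC SQL ROLLBACK;
            return -1;
        }
    }

    EXEC SQL COMMIT;
    return 0;
}
```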
