I want to use a database to store lots of records, quite a few million. I'm looking for advice on the best way to add and retrieve many thousands of records at one go from within a C program.
Each record is a scientific observation, comprising a few double-precision floating point numbers along with a few other fields. I'll have 2 C programs: one populates the DB in real-time with data as it becomes available, and the other analyses/processes the data, attempting to complete analysis before more data comes in.
In detail, I want to:
1. Write an array of C structures as new records in the DB.
2. Send a query to retrieve some subset of DB records, and translate them into C structures.
Sounds like pretty standard stuff, but given that both steps may be dealing with on the order of 100,000 records at a time, and there is some pressure to do this quickly (so analysis doesn't lag behind data acquisition), what's the best way to code up interaction with the database?

I know a little about ECPG (but that seems very inefficient) and have a little experience from a couple of years ago of writing a backend function that made lots of calls like 'DatumGetFloat8' (but presumably that approach has the benefit of directly accessing the database files, being a backend call). In the manual I see mention of 'prepared statements' and the like which I know nothing about, so please treat me like the ignorant newbie that I am.
PS. If the mechanism works in postgres 7.4 then even better, as I'm working with a legacy DB that there is much resistance to upgrading. Don't ask!