Thanks for the reply.
I have one more question. While extracting data with the COPY TO command (in TEXT mode), we have an API, PQunescapeBytea, to convert the string representation of binary data (bytea) into binary. Similarly, we need to convert binary data into its string representation while loading data into PostgreSQL using the COPY FROM command in TEXT mode with PQputCopyData. But there is no API to convert binary data into its string representation. Could you point me to such an API, or am I misinterpreting something?
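For what it's worth, libpq does provide PQescapeBytea, but that targets bytea inside SQL string literals rather than COPY TEXT data, which applies its own backslash rules on top of the bytea escape format. One workaround is a hand-rolled helper like the hypothetical escape_bytea_for_copy_text below (a sketch only; it assumes escape-format bytea output, the name is my own and not a libpq API):

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical helper (not a libpq API): escape a binary buffer so it can
 * be sent as a bytea column in a COPY ... FROM STDIN (TEXT mode) stream.
 * A backslash byte is doubled by the bytea escape format and doubled again
 * by COPY's own escaping (4 backslashes total); any non-printable byte
 * becomes \ooo in bytea escape, whose backslash COPY doubles to \\ooo.
 * Caller frees the returned string. */
char *escape_bytea_for_copy_text(const unsigned char *data, size_t len)
{
    /* Worst case per input byte: "\\ooo" = 5 output chars. */
    char *out = malloc(len * 5 + 1);
    char *p = out;

    if (out == NULL)
        return NULL;

    for (size_t i = 0; i < len; i++)
    {
        unsigned char b = data[i];

        if (b == '\\')
        {
            memcpy(p, "\\\\\\\\", 4);   /* four backslashes in the stream */
            p += 4;
        }
        else if (b < 0x20 || b > 0x7e)
        {
            p += sprintf(p, "\\\\%03o", b);  /* \\ooo in the stream */
        }
        else
        {
            *p++ = (char) b;            /* printable: pass through */
        }
    }
    *p = '\0';
    return out;
}
```

So the byte sequence {0x61, 0x5c, 0x09} would be sent as `a` followed by four backslashes followed by `\\011`. This is only my reading of the two escaping layers; testing a round trip against a real server before relying on it would be prudent.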
Thanks in advance,
----- Original Message ----
From: Sean Davis <sdavis2(at)mail(dot)nih(dot)gov>
Cc: Sandeep Khandelwal <sandeep_khandelwal27(at)yahoo(dot)com>
Sent: Monday, October 16, 2006 4:04:45 PM
Subject: Re: [INTERFACES] Bulk Load and Extract from PostgreSQL
On Monday 16 October 2006 03:07, Sandeep Khandelwal wrote:
> Hi All.
> I want to extract and load data from PostgreSQL using the libpq C API. Please
> let me know which approach will be good to load a large number of rows into
> PostgreSQL (INSERT or COPY FROM) and which approach will be good to extract
> a large number of rows from PostgreSQL (COPY TO or SELECT). I want to handle
> all the data types supported in PostgreSQL.
copy is the faster way to go for a single table.
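In case it helps, the COPY-based load suggested above looks roughly like the following with libpq (a sketch only; the connection string and the table name "mytable" are placeholders, not part of the original advice). It cannot run without a live server, so treat it as an outline:

```c
#include <stdio.h>
#include <string.h>
#include <libpq-fe.h>

int main(void)
{
    /* "dbname=test" is a placeholder connection string. */
    PGconn *conn = PQconnectdb("dbname=test");
    if (PQstatus(conn) != CONNECTION_OK)
    {
        fprintf(stderr, "connection failed: %s", PQerrorMessage(conn));
        PQfinish(conn);
        return 1;
    }

    /* Enter COPY IN mode; "mytable" is a placeholder table. */
    PGresult *res = PQexec(conn, "COPY mytable FROM STDIN");
    if (PQresultStatus(res) != PGRES_COPY_IN)
    {
        fprintf(stderr, "COPY failed: %s", PQerrorMessage(conn));
        PQclear(res);
        PQfinish(conn);
        return 1;
    }
    PQclear(res);

    /* TEXT-mode rows: tab-separated columns, newline-terminated. */
    const char *row = "1\tfirst row\n";
    if (PQputCopyData(conn, row, (int) strlen(row)) != 1 ||
        PQputCopyEnd(conn, NULL) != 1)
        fprintf(stderr, "send failed: %s", PQerrorMessage(conn));

    /* Drain the final command result(s). */
    while ((res = PQgetResult(conn)) != NULL)
        PQclear(res);

    PQfinish(conn);
    return 0;
}
```

The main performance win over per-row INSERT is that the rows travel in one streamed protocol exchange instead of one statement round trip each.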