Re: PQescapeStringConn

From: Scott Frankel <frankel(at)circlesfx(dot)com>
To: Richard Huxton <dev(at)archonet(dot)com>
Cc: PostgreSQL List <pgsql-general(at)postgresql(dot)org>
Subject: Re: PQescapeStringConn
Date: 2010-07-30 20:58:22
Message-ID: C1FD2C3D-4EA9-4C42-B4A8-F8C182ED760E@circlesfx.com
Lists: pgsql-general


On Jul 30, 2010, at 9:11 AM, Richard Huxton wrote:

> On 30/07/10 16:57, Scott Frankel wrote:
>>
>> On Jul 30, 2010, at 1:13 AM, Richard Huxton wrote:
>>
>>> On 30/07/10 07:52, Scott Frankel wrote:
>>>> I have a number of very long strings that each contain many
>>>> instances of semi-colons, single quotes, forward and back
>>>> slashes, etc. I'm looking for an efficient and safe way to
>>>> write them to my db using a prepared statement.
>
> You're going to have to pre-process the strings in some way, or
> there will always be the chance of problems. Probably the best way
> to handle a bulk insert is through the COPY command:

Excellent! Thanks for the lead. I see from the docs:
COPY weather FROM '/home/user/weather.txt';

I am using Python to create the strings, and Python is the ultimate
consumer of the strings once they're stored in the db. Thus I have a
fair degree of control over the strings' formatting. COPY from a
plain text file on my server looks very promising.
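
Here's a minimal sketch of what I have in mind on the Python side,
assuming psycopg2 as the client library; the connection string, the
table name (foo) and the column names are placeholders of mine, not
anything from this thread:

import io
import psycopg2

def escape_copy_text(value):
    # Escape the characters COPY's text format treats specially:
    # backslash itself, plus tab, carriage return and newline.
    return (value.replace('\\', '\\\\')
                 .replace('\t', '\\t')
                 .replace('\r', '\\r')
                 .replace('\n', '\\n'))

rows = [
    ("n1", "a long body; with 'quotes', \\slashes\\ and\nnewlines"),
    ("n2", "another long string; 'fer sher'"),
]

# Build tab-delimited COPY text in memory (a temp file works just as well).
buf = io.StringIO()
for name, body in rows:
    buf.write(escape_copy_text(name) + "\t" + escape_copy_text(body) + "\n")
buf.seek(0)

conn = psycopg2.connect("dbname=mydb")
try:
    with conn.cursor() as cur:
        # copy_from streams the buffer through COPY ... FROM STDIN.
        cur.copy_from(buf, "foo", sep="\t", columns=("name", "body"))
    conn.commit()
finally:
    conn.close()

Note that backslashes need doubling as well, since COPY's text format
uses backslash as its escape character.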

Thanks!
Scott

> BEGIN;
>
> COPY foo (name, body) FROM stdin;
> n1 b1
> n2 b2
> foo this won't fail 'fer sher;' on the characters inside the string
> \.
>
> COMMIT;
>
> By default COPY expects one line per row, with columns separated by
> tab characters. You can also have '/path/to/file/name' instead of
> stdin, but the file will need to be accessible from the backend
> process. If that's not the case (and it probably isn't) then you
> want to use psql's "\copy" variant which views the world from the
> client end of things.
>
> COPY is faster than separate inserts and the only characters you
> need to worry about are tab, carriage-return and newline. These
> would be replaced by the sequences "\t", "\r", "\n".
>
> I don't know what format your strings are in initially, but a bit of
> perl/python/ruby can easily tidy them up.
>
> Finally, more recent versions of PG have a COPY that supports CSV
> formatting too. See the manuals for more details on this.
>
> --
> Richard Huxton
> Archonet Ltd
>
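
P.S. If the data file can't sit where the backend can read it, the
psql client-side variant Richard mentions would look something like
this (file paths and the table are placeholders on my part):

\copy foo (name, body) from '/home/user/strings.tsv'

and, with the CSV formatting mentioned above:

\copy foo (name, body) from '/home/user/strings.csv' with csv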
