Re: Bad Data back Door

From: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>
To: "David E(dot) Wheeler" <david(at)justatheory(dot)com>
Cc: Pg Hackers <pgsql-hackers(at)postgresql(dot)org>
Subject: Re: Bad Data back Door
Date: 2012-10-06 01:12:01
Message-ID: 20303.1349485921@sss.pgh.pa.us
Lists: pgsql-hackers

"David E. Wheeler" <david(at)justatheory(dot)com> writes:
> I've discovered something a bit disturbing at $work. We're migrating (slowly) from Oracle to PostgreSQL, and in some cases are using oracle_fdw to copy data over. Alas, there are a fair number of text values in the Oracle database that, although the database is UTF-8, are actually something else (CP1252 or Latin1). When we copy from an oracle_fdw foreign table into a PostgreSQL table, PostgreSQL does not complain, but ends up storing the mis-encoded strings, even though the database is UTF-8.

> I assume that this is because the foreign table, as a table, is assumed by the system to have valid data, and therefore additional character encoding validation is skipped, yes?

Probably not so much "assumed" as "nobody thought about it". In
e.g. plperl we expend the cycles to do encoding validity checking on
*every* string entering the system from Perl. I'm not sure why foreign
tables ought to get a pass on that, especially when you consider the
communication overhead that the encoding check would be amortized
against.
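
A minimal sketch of that kind of check, assuming the server-side API in
mb/pg_wchar.h; the wrapper function here is invented for illustration,
but pg_verifymbstr() is a real validation entry point, and with
noError = false it reports an error on any byte sequence that is
invalid in the database encoding:

#include "postgres.h"

#include "mb/pg_wchar.h"

/*
 * Validate a NUL-terminated string arriving from outside the server
 * against the current database encoding.  With noError = false,
 * pg_verifymbstr() raises ereport(ERROR) on the first invalid byte
 * sequence instead of returning.
 */
static void
check_incoming_string(const char *str)
{
	(void) pg_verifymbstr(str, strlen(str), false);
}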

Now, having said that, I think it has to be the responsibility of the FDW
to apply any required check ... which makes this a bug report against
oracle_fdw, not the core system. (FWIW, contrib/file_fdw depends on the
COPY code, which will check encoding.)
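
To make that division of labor concrete, a hypothetical sketch of where
such a check could sit in an FDW's IterateForeignScan callback;
fetch_next_remote_value() is a made-up placeholder for whatever the
remote driver returns, and the tuple-forming code is elided:

#include "postgres.h"

#include "executor/executor.h"
#include "foreign/fdwapi.h"
#include "mb/pg_wchar.h"

/* placeholder for the FDW's own driver-level fetch routine */
extern char *fetch_next_remote_value(ForeignScanState *node);

static TupleTableSlot *
example_IterateForeignScan(ForeignScanState *node)
{
	TupleTableSlot *slot = node->ss.ss_ScanTupleSlot;
	char	   *value = fetch_next_remote_value(node);

	if (value == NULL)
		return ExecClearTuple(slot);	/* end of scan */

	/* reject bytes that are not valid in the server encoding */
	(void) pg_verifymbstr(value, strlen(value), false);

	/* ... convert the value to a Datum, store the tuple as usual ... */
	return slot;
}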

regards, tom lane
