Re: JSON for PG 9.2

From: Joey Adams <joeyadams3(dot)14159(at)gmail(dot)com>
To: Robert Haas <robertmhaas(at)gmail(dot)com>
Cc: Andrew Dunstan <andrew(at)dunslane(dot)net>, Simon Riggs <simon(at)2ndquadrant(dot)com>, Bruce Momjian <bruce(at)momjian(dot)us>, PostgreSQL-development <pgsql-hackers(at)postgresql(dot)org>
Subject: Re: JSON for PG 9.2
Date: 2011-12-16 17:13:56
Message-ID: CAARyMpC_rsMgR2=39ExLkN36d8sozA_7TQHK1bJSA5PCDDBeuQ@mail.gmail.com
Lists: pgsql-hackers

On Fri, Dec 16, 2011 at 8:52 AM, Robert Haas <robertmhaas(at)gmail(dot)com> wrote:
> But I think the important point is that this is an obscure corner case. Let me
> say that one more time: obscure corner case!

+1

> The only reason JSON needs to care about this at all is that it allows
> \u1234 to mean Unicode code point 0x1234.  But for that detail, JSON
> would be encoding-agnostic.  So I think it's sufficient for us to
> simply decide that that particular feature may not work (or even, will
> not work) for non-ASCII characters if you use a non-UTF8 encoding.
> There's still plenty of useful things that can be done with JSON even
> if that particular feature is not available; and that way we don't
> have to completely disable the data type just because someone wants to
> use EUC-JP or something.

So, if the server encoding is not UTF-8, should we ban Unicode escapes:

"\u00FCber"

or non-ASCII characters?

"über"

Also:

* What if the server encoding is SQL_ASCII?

* What if the server encoding is UTF-8, but the client encoding is
something else (e.g. SQL_ASCII)?

- Joey
