
Re: JSON for PG 9.2

From: Andrew Dunstan <andrew(at)dunslane(dot)net>
To: Pavel Stehule <pavel(dot)stehule(at)gmail(dot)com>
Cc: Joey Adams <joeyadams3(dot)14159(at)gmail(dot)com>, Robert Haas <robertmhaas(at)gmail(dot)com>, "David E(dot) Wheeler" <david(at)kineticode(dot)com>, Claes Jakobsson <claes(at)surfar(dot)nu>, Dimitri Fontaine <dimitri(at)2ndquadrant(dot)fr>, Merlin Moncure <mmoncure(at)gmail(dot)com>, Magnus Hagander <magnus(at)hagander(dot)net>, Jan Urbański <wulczer(at)wulczer(dot)org>, Simon Riggs <simon(at)2ndquadrant(dot)com>, Bruce Momjian <bruce(at)momjian(dot)us>, PostgreSQL-development Hackers <pgsql-hackers(at)postgresql(dot)org>, Jan Wieck <janwieck(at)yahoo(dot)com>
Subject: Re: JSON for PG 9.2
Date: 2012-01-14 20:06:27
Message-ID: 4F11E043.1040107@dunslane.net
Lists: pgsql-hackers

On 01/12/2012 10:51 AM, Andrew Dunstan wrote:
>
>
> On 01/12/2012 10:44 AM, Pavel Stehule wrote:
>> 2012/1/12 Andrew Dunstan<andrew(at)dunslane(dot)net>:
>>>
>>> On 01/12/2012 09:00 AM, Joey Adams wrote:
>>>> I wrote an array_to_json function during GSoC 2010:
>>>>
>>>> http://git.postgresql.org/gitweb/?p=json-datatype.git;a=blob;f=json_io.c#l289
>>>>
>>>> It's not exposed as a procedure called array_to_json: it's part of the
>>>> to_json function, which decides what to do based on the argument type.
>>>>
>>>
>>> Excellent, this is just the point at which I stopped work last night,
>>> so with your permission I'll steal this and it will save me a good
>>> chunk of time.
>>>
>> this should be little bit more enhanced to support a row arrays - it
>> can be merged with some routines from pst tool
>> http://okbob.blogspot.com/2010/11/new-version-of-pst-collection-is.html
>>
>>
>
> I will be covering composites.
>
>

OK, here's a patch that does both query_to_json and array_to_json, along
with docs and regression tests. It includes Robert's original patch,
although I can produce a differential patch if required. It can also be
pulled from <https://bitbucket.org/adunstan/pgdevel>
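
For the simplest cases, array_to_json just renders a Postgres array as a
JSON array; a rough sketch of what that looks like (the exact output
formatting here is approximate):

    andrew=# select array_to_json(array[1,2,3]);
     array_to_json
    ---------------
     [1,2,3]
    (1 row)

    andrew=# select array_to_json(array['a','b']);
     array_to_json
    ---------------
     ["a","b"]
    (1 row)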

A couple of things to note. First, this exercises the problem of losing
column names that I noted a couple of months ago and that Tom did a bit
of work on. We really need to fix it. Example:

    andrew=#  select array_to_json(array_agg(row(z.*)))
        from (select $$a$$ || x as b,
                 y as c,
                 array[row(x.*,array[1,2,3]),
                       row(y.*,array[4,5,6])] as z
              from generate_series(1,1) x,
                   generate_series(4,4) y) z;
                                   array_to_json
    -------------------------------------------------------------------------
      [{"f1":"a1","f2":4,"f3":[{"f1":1,"f2":[1,2,3]},{"f1":4,"f2":[4,5,6]}]}]
    (1 row)


Here we've lost b, c and z as column names.
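
If the column aliases survived, you'd expect output along these lines
instead (a sketch only; the inner keys are elided):

    [{"b":"a1","c":4,"z":[...]}]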

Second, what should we do when the database encoding isn't UTF8? I'm
inclined to emit a \unnnn escape for any non-ASCII character (assuming
it has a Unicode code point - are there any code points in the
non-Unicode encodings that don't have Unicode equivalents?). The
alternative would be to fail on non-ASCII characters, which might be
ugly. Of course, anyone wanting to deal with JSON should be using UTF8
anyway, but we still have to deal with these things. What about
SQL_ASCII? If there's a non-ASCII sequence there we really have no way
of telling what it should be. There, at least, I think we should
probably error out.
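
To make that concrete: under the \unnnn approach a non-ASCII character
would be converted to its Unicode code point and escaped, so in, say, a
LATIN1 database you might get something like this (hypothetical output):

    andrew=# select array_to_json(array['café']);
     array_to_json
    ---------------
     ["caf\u00e9"]
    (1 row)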

cheers

andrew




Attachment: adjson.patch
Description: text/x-patch (53.1 KB)

