Re: JSON in 9.2 - Could we have just one to_json() function instead of two separate versions ?

From: Hannu Krosing <hannu(at)2ndQuadrant(dot)com>
To: Merlin Moncure <mmoncure(at)gmail(dot)com>
Cc: Andrew Dunstan <andrew(at)dunslane(dot)net>, Pavel Stehule <pavel(dot)stehule(at)gmail(dot)com>, pgsql-hackers(at)postgresql(dot)org
Subject: Re: JSON in 9.2 - Could we have just one to_json() function instead of two separate versions ?
Date: 2012-05-01 15:02:27
Message-ID: 1335884547.3106.126.camel@hvost
Lists: pgsql-hackers

On Tue, 2012-05-01 at 08:18 -0500, Merlin Moncure wrote:
> On Tue, May 1, 2012 at 7:02 AM, Hannu Krosing <hannu(at)2ndquadrant(dot)com> wrote:
> > Hi hackers
> >
> > After playing around with the array_to_json() and row_to_json() functions a
> > bit, I have a question - why do we even have two variants of *_to_json()?
> >
> > Collapsing array_to_json() and row_to_json() into just to_json()
>
> I asked the same question. It was noted that the xml functions aren't
> overloaded like that and that it's cleaner to introduce datum specific
> behaviors if you don't overload.

XML, being an "enterprise" thing, is a large and complex beast.

Javascript - and by extension json - comes from the other end, being
lightweight and elegant at core.

Also, the *_to_xml functions that already exist don't match what is there
for json; they don't even overlap!

Thus I see no reason why deciding how the to_json() functions (or a cast
to json) should work needs to be based on how xml works.

We currently don't have any "database_to_json()" or
"querystring_to_json()", and we don't need those either.

I'd be much happier with just a working cast to json from all
types, not a myriad of functions for all possible types -
int4_to_json(), text_to_json(), bool_to_json(), record_to_json(),
array_to_json(), pg_user_to_json(), etc. etc. etc.
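
In other words, something along these lines (purely hypothetical syntax -
none of these casts exist in 9.2, this is just what I am arguing for):

hannu=# select 1::json, now()::json, array[1,2,3]::json, row(1,'a')::json;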

What we currently have exposed to userspace are two arbitrarily chosen
"complex type" functions -

array_to_json() for converting arrays of ANY element type to json,
including arrays consisting of records which may again contain arrays
and records.

and

row_to_json() for converting "rows", again potentially consisting of ANY
TYPE, including arrays of any type and any complex type. It even handles
the row() type :)

hannu=# select row_to_json(row(1,2,3));
row_to_json
------------------------
{"f1":1,"f2":2,"f3":3}
(1 row)
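
Just to illustrate the nesting, both of them should also cope with
something like this (untested sketch - the exact output formatting may
differ):

hannu=# select array_to_json(array_agg(t))
        from (values (1, array[2,3]), (4, array[5,6])) as t(a, b);

which ought to come out roughly as [{"a":1,"b":[2,3]},{"a":4,"b":[5,6]}].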

What we currently lack is a direct conversion for simple types, though
it is easily achieved by converting the value to a single-element array
and then stripping the outer [] from the result.
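
For example (untested sketch, using only the functions already there):

hannu=# select substring(j from 2 for length(j) - 2) as scalar_json
          from (select array_to_json(array['a "quoted" text'::text])::text as j) s;

which should give the properly escaped json string "a \"quoted\" text".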

It would be really nice to also have casts from json to any type,
including records, though.

And perhaps a function for converting schema elements to some json
representation, so that a json_dump could easily be constructed :)

We really do not need footguns similar to database_to_xml() or
schema_to_xml(), which just consume all memory in the server on any
real database.

> I don't really agree with that or any of the naming styles that are in
> the form inputtype_func() but I think most people are on the other
> side of the argument.

I think that most people have not given this any thought yet, so they
simply lack any reasoned opinion ;)

> merlin

--
-------
Hannu Krosing
PostgreSQL Unlimited Scalability and Performance Consultant
2ndQuadrant Nordic
PG Admin Book: http://www.2ndQuadrant.com/books/
