diff --git a/doc/src/sgml/biblio.sgml b/doc/src/sgml/biblio.sgml
index 4953024..a7771dc 100644
--- a/doc/src/sgml/biblio.sgml
+++ b/doc/src/sgml/biblio.sgml
@@ -136,6 +136,17 @@
1988
+
+ SQL Technical Report
+ Part 6: SQL support for JavaScript Object
+ Notation (JSON)
+ First Edition.
+
+
+ 2017.
+
+
diff --git a/doc/src/sgml/datatype.sgml b/doc/src/sgml/datatype.sgml
index 3d36cca..f30baa7 100644
--- a/doc/src/sgml/datatype.sgml
+++ b/doc/src/sgml/datatype.sgml
@@ -149,6 +149,12 @@
+ jsonpath
+
+ binary JSON path
+
+
+ line
+
+ infinite line on a plane
diff --git a/doc/src/sgml/filelist.sgml b/doc/src/sgml/filelist.sgml
index f010cd4..a60de6c 100644
--- a/doc/src/sgml/filelist.sgml
+++ b/doc/src/sgml/filelist.sgml
@@ -18,6 +18,7 @@
+
diff --git a/doc/src/sgml/func-sqljson.sgml b/doc/src/sgml/func-sqljson.sgml
new file mode 100644
index 0000000..1f2881d
--- /dev/null
+++ b/doc/src/sgml/func-sqljson.sgml
@@ -0,0 +1,3562 @@
+
+
+
+ JSON Functions, Operators and Expressions
+
+
+ The functions, operators, and expressions described in this section
+ operate on JSON data:
+
+
+
+
+
+ SQL/JSON functions and expressions conforming to the
+ SQL/JSON standard (see ).
+
+
+
+
+ PostgreSQL-specific functions and operators for JSON
+ data types (see ).
+
+
+
+
+
+ To learn more about the SQL/JSON standard, see .
+ For details on JSON types supported in PostgreSQL, see .
+
+
+
+ SQL/JSON Functions and Expressions
+
+ SQL/JSON
+ functions and expressions
+
+
+
+ To provide native support for JSON data types within the SQL environment,
+ PostgreSQL implements the SQL/JSON data model.
+ This model comprises sequences of items. Each item can hold an SQL scalar
+ value, an SQL/JSON null value, or a composite data structure built from
+ JSON arrays and objects.
+
+
+
+ SQL/JSON enables you to handle JSON data alongside regular SQL data,
+ with transaction support:
+
+
+
+
+
+ Upload JSON data into a relational database and store it in
+ regular SQL columns as character or binary strings.
+
+
+
+
+ Generate JSON objects and arrays from relational data.
+
+
+
+
+ Query JSON data using SQL/JSON query functions and SQL/JSON path
+ language expressions.
+
+
+
+
+
+ All SQL/JSON functions fall into one of two groups.
+ Constructor functions
+ generate JSON data from values of SQL types. Query functions
+ evaluate SQL/JSON path language expressions against JSON values
+ and produce values of SQL/JSON types, which are converted to SQL types.
+
+
+
+ Producing JSON Content
+
+
+ PostgreSQL provides several functions
+ that generate JSON data. Taking values of SQL types as input, these
+ functions construct JSON objects or JSON arrays represented as
+ SQL character or binary strings.
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ JSON_OBJECT
+ create a JSON object
+
+
+
+
+JSON_OBJECT (
+[ { key_expression { VALUE | ':' }
+ value_expression [ FORMAT JSON [ ENCODING UTF8 ] ] }[, ...] ]
+[ { NULL | ABSENT } ON NULL ]
+[ { WITH | WITHOUT } UNIQUE [ KEYS ] ]
+[ RETURNING data_type [ FORMAT JSON [ ENCODING UTF8 ] ] ]
+)
+
+
+
+
+
+ Description
+
+
+ The JSON_OBJECT function generates a JSON
+ object from SQL or JSON data.
+
+
+
+
+ Parameters
+
+
+
+
+ key_expression { VALUE | ':' }
+ value_expression [ FORMAT JSON [ ENCODING UTF8 ] ]
+
+
+
+ The input clause that provides the data for constructing a JSON object:
+
+
+
+
+ key_expression is a scalar expression defining the
+ JSON key, which is implicitly converted
+ to the text type.
+ The provided expression cannot be NULL or belong to a type that has a cast to json.
+
+
+
+
+ value_expression is an expression
+ that provides the input for the JSON value.
+
+
+
+
+ The optional FORMAT clause is provided to conform to the SQL/JSON standard.
+
+
+
+
+ You must use a colon or the VALUE keyword as a delimiter between
+ the key and the value. Multiple key/value pairs are separated by commas.
+
+
+
+
+
+
+ { NULL | ABSENT } ON NULL
+
+
+
+ Defines whether NULL values are allowed in the constructed
+ JSON object:
+
+
+
+ NULL
+
+
+ Default. NULL values are allowed.
+
+
+
+
+ ABSENT
+
+
+ If the value is NULL,
+ the corresponding key/value pair is omitted from the generated
+ JSON object.
+
+
+
+
+
+
+
+
+
+ { WITH | WITHOUT } UNIQUE [ KEYS ]
+
+
+ Defines whether duplicate keys are allowed:
+
+
+
+ WITHOUT
+
+
+ Default. The constructed
+ JSON object can contain duplicate keys.
+
+
+
+
+ WITH
+
+
+ Duplicate keys are not allowed.
+ If the input data contains duplicate keys, an error is returned.
+ This check is performed before removing JSON items with NULL values.
+
+
+
+
+
+ Optionally, you can add the KEYS keyword for semantic clarity.
+
+
+
+
+
+
+ RETURNING data_type [ FORMAT JSON [ ENCODING UTF8 ] ]
+
+
+
+ The output clause that specifies the type of the generated JSON object.
+ For details, see .
+
+
+
+
+
+
+
+
+ Notes
+ Alternatively, you can construct JSON objects by using
+ PostgreSQL-specific json_build_object()/
+ jsonb_build_object() functions.
+ See for details.
+
+
+
+
+ Examples
+
+ Construct a JSON object from the provided key/value pairs of various types:
+
+
+SELECT JSON_OBJECT(
+-- scalar JSON types
+ 'key1': 'string',
+ 'key2': '[1, 2]',
+ 'key3' VALUE 123, -- alternative syntax for key-value delimiter
+ 'key4': NULL,
+-- other types
+ 'key5': ARRAY[1, 2, 3], -- postgres array
+ 'key6': jsonb '{"a": ["b", 1]}', -- composite json/jsonb
+ 'key7': date '2017-09-30', -- datetime type
+ 'key8': row(1, 'a'), -- row type
+ 'key9': '[1, 2]' FORMAT JSON, -- same value as for key2, but with FORMAT
+-- key can be an expression
+ 'key' || 'last' : TRUE
+ABSENT ON NULL) AS json;
+ json
+----------------------------------------------------
+{"key1" : "string", "key2" : "[1, 2]", "key3" : 123,
+ "key5" : [1,2,3], "key6" : {"a": ["b", 1]},
+ "key7" : "2017-09-30", "key8" : {"f1":1,"f2":"a"},
+ "key9" : [1, 2], "keylast" : true}
+(1 row)
+
+
+
+ From the films table, select some data
+ about the films distributed by Paramount Pictures
+ (did = 103) and return JSON objects:
+
+
+SELECT
+JSON_OBJECT(
+ 'code' VALUE f.code,
+ 'title' VALUE f.title,
+ 'did' VALUE f.did
+) AS paramount
+FROM films AS f
+WHERE f.did = 103;
+ paramount
+----------------------------------------------------
+{"code" : "P_301", "title" : "Vertigo", "did" : 103}
+{"code" : "P_302", "title" : "Becket", "did" : 103}
+{"code" : "P_303", "title" : "48 Hrs", "did" : 103}
+(3 rows)
+
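+
+ The following sketch (not verified against a live server) illustrates the
+ WITH UNIQUE KEYS clause, which rejects duplicate keys instead of keeping
+ both pairs:
+
+-- fails with a duplicate-key error because key1 appears twice
+SELECT JSON_OBJECT('key1': 1, 'key1': 2 WITH UNIQUE KEYS);
+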
+
+
+
+
+
+ JSON_OBJECTAGG
+ create a JSON object as an aggregate of the provided data
+
+
+
+JSON_OBJECTAGG (
+[ { key_expression { VALUE | ':' } value_expression } ]
+[ { NULL | ABSENT } ON NULL ]
+[ { WITH | WITHOUT } UNIQUE [ KEYS ] ]
+[ RETURNING data_type [ FORMAT JSON [ ENCODING UTF8 ] ] ]
+)
+
+
+
+
+
+ Description
+
+
+ The JSON_OBJECTAGG function aggregates the provided data
+ into a JSON object. You can use this function to combine values
+ stored in different table columns into key/value pairs. If you specify
+ a GROUP BY clause, this function returns a separate JSON object
+ for each group.
+
+
+
+
+ Parameters
+
+
+
+
+ key_expression { VALUE | ':' } value_expression
+
+
+
+
+ The input clause that provides the data to be aggregated as a JSON object:
+
+
+
+
+ key_expression is a scalar expression defining the
+ JSON key, which is implicitly converted
+ to the text type.
+ The provided expression cannot be NULL or belong to a type that has a cast to json.
+
+
+
+
+ value_expression is an expression
+ that provides the input for the JSON value preceded by its type.
+ For JSON scalar types, you can omit the type.
+
+
+
+ The input value of the bytea type must be stored in UTF8
+ and contain a valid UTF8 string. Otherwise, an error occurs.
+ PostgreSQL currently supports only UTF8.
+
+
+
+
+
+ You must use a colon or the VALUE keyword as a delimiter between
+ keys and values. Multiple key/value pairs are separated by commas.
+
+
+
+
+
+
+ { NULL | ABSENT } ON NULL
+
+
+
+ Defines whether NULL values are allowed in the constructed
+ JSON object:
+
+
+
+ NULL
+
+
+ Default. NULL values are allowed.
+
+
+
+
+ ABSENT
+
+
+ If the value is NULL,
+ the corresponding key/value pair is omitted from the generated
+ JSON object.
+
+
+
+
+
+
+
+
+
+ { WITH | WITHOUT } UNIQUE [ KEYS ]
+
+
+ Defines whether duplicate keys are allowed:
+
+
+
+ WITHOUT
+
+
+ Default. The constructed
+ JSON object can contain duplicate keys.
+
+
+
+
+ WITH
+
+
+ Duplicate keys are not allowed.
+ If the input data contains duplicate keys, an error is returned.
+ This check is performed before removing JSON items with NULL values.
+
+
+
+
+
+ Optionally, you can add the KEYS keyword for semantic clarity.
+
+
+
+
+
+
+ RETURNING data_type [ FORMAT JSON [ ENCODING UTF8 ] ]
+
+
+
+ The output clause that specifies the type of the generated JSON object.
+ For details, see .
+
+
+
+
+
+
+
+
+ Notes
+ Alternatively, you can create JSON objects by using
+ PostgreSQL-specific json_object_agg()/
+ jsonb_object_agg() aggregate functions.
+ See for details.
+
+
+
+
+ Examples
+
+
+ For films with did = 103, aggregate key/value pairs
+ of film genre (f.kind) and title (f.title)
+ into a single object:
+
+
+SELECT
+JSON_OBJECTAGG(
+ f.kind VALUE f.title)
+ AS films_list
+FROM films AS f
+WHERE f.did = 103;
+ films_list
+----------------------------------------------------
+{ "Action" : "Vertigo", "Drama" : "Becket", "Action" : "48 Hrs" }
+(1 row)
+
+
+
+ Return the same object as jsonb. Note that only a single film of
+ the action genre is included as the jsonb type does not allow duplicate keys.
+
+
+SELECT
+JSON_OBJECTAGG(
+ f.kind VALUE f.title
+ RETURNING jsonb)
+AS films_list
+FROM films AS f
+WHERE f.did = 103;
+ films_list
+----------------------------------------------------
+{"Drama": "Becket", "Action": "48 Hrs"}
+(1 row)
+
+
+
+ Return objects of film titles and length, grouped by the film genre:
+
+
+SELECT
+ f.kind,
+ JSON_OBJECTAGG(
+ f.title VALUE f.len
+) AS films_list
+FROM films AS f
+GROUP BY f.kind;
+
+ kind | films_list
+-------------+----------------------------------
+Musical | { "West Side Story" : "02:32:00", "The King and I" : "02:13:00", "Bed Knobs and Broomsticks" : "01:57:00" }
+Romantic | { "The African Queen" : "01:43:00", "Une Femme est une Femme" : "01:25:00", "Storia di una donna" : "01:30:00" }
+Comedy | { "Bananas" : "01:22:00", "There's a Girl in my Soup" : "01:36:00" }
+Drama | { "The Third Man" : "01:44:00", "Becket" : "02:28:00", "War and Peace" : "05:57:00", "Yojimbo" : "01:50:00", "Das Boot" : "02:29:00" }
+Action | { "Vertigo" : "02:08:00", "48 Hrs" : "01:37:00", "Taxi Driver" : "01:54:00", "Absence of Malice" : "01:55:00" }
+(5 rows)
+
+
+
+
+
+
+ JSON_ARRAY
+ create a JSON array
+
+
+
+JSON_ARRAY (
+[ { value_expression [ FORMAT JSON ] } [, ...] ]
+[ { NULL | ABSENT } ON NULL ]
+[ RETURNING data_type [ FORMAT JSON [ ENCODING UTF8 ] ] ]
+)
+
+JSON_ARRAY (
+[ query_expression ]
+[ RETURNING data_type [ FORMAT JSON [ ENCODING UTF8 ] ] ]
+)
+
+
+
+
+ Description
+
+
+ The JSON_ARRAY function constructs a JSON array from
+ the provided SQL or JSON data.
+
+
+
+
+ Parameters
+
+
+
+
+ value_expression
+
+
+
+
+ The input clause that provides the data for constructing a JSON array.
+ The value_expression is an expression
+ that provides the input for the JSON value preceded by its type.
+ For JSON scalar types, you can omit the type.
+
+
+
+ The input value of the bytea type must be stored in UTF8
+ and contain a valid UTF8 string. Otherwise, an error occurs.
+ PostgreSQL currently supports only UTF8.
+
+
+
+
+
+
+
+
+ query_expression
+
+
+
+ An SQL query that provides the data for constructing a JSON array.
+ The query must return a single column that holds the values to be
+ used in the array.
+
+
+
+
+
+
+ { NULL | ABSENT } ON NULL
+
+
+
+ Defines whether NULL values are allowed in the generated JSON array:
+
+
+
+ NULL
+
+
+ NULL values are allowed.
+
+
+
+
+ ABSENT
+
+
+ Default. If the value is NULL,
+ the corresponding element is omitted from the generated
+ JSON array.
+
+
+
+
+
+ This clause is only supported for arrays built from an explicit list of values.
+ If you are using an SQL query to generate an array, NULL values are always
+ omitted.
+
+
+
+
+
+
+ RETURNING data_type [ FORMAT JSON [ ENCODING UTF8 ] ]
+
+
+
+ The output clause that specifies the return type of the constructed JSON array.
+ For details, see .
+
+
+
+
+
+
+
+
+ Notes
+ Alternatively, you can create JSON arrays by using
+ PostgreSQL-specific json_build_array()/
+ jsonb_build_array() functions.
+ See for details.
+
+
+
+
+ Examples
+
+ From the films table, select some data
+ about the films distributed by Paramount Pictures
+ (did = 103) and return JSON arrays:
+
+
+SELECT
+JSON_ARRAY(
+ f.code,
+ f.title,
+ f.did
+) AS films
+FROM films AS f
+WHERE f.did = 103;
+ films
+----------------------------------------------------
+["P_301", "Vertigo", 103]
+["P_302", "Becket", 103]
+["P_303", "48 Hrs", 103]
+(3 rows)
+
+
+ Construct a JSON array from the list of film titles returned from the
+ films table by a subquery:
+
+
+SELECT
+JSON_ARRAY(
+ SELECT
+ f.title
+FROM films AS f
+WHERE f.did = 103)
+AS film_titles;
+ film_titles
+----------------------------------------------------
+["Vertigo", "Becket", "48 Hrs"]
+(1 row)
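+
+ As an illustrative sketch, the ON NULL clause changes how NULL input
+ values are handled. By default they are dropped, while NULL ON NULL
+ keeps them as JSON null elements:
+
+-- with the default ABSENT ON NULL this would yield [1, 2];
+-- NULL ON NULL keeps the null element
+SELECT JSON_ARRAY(1, NULL, 2 NULL ON NULL);
+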
+
+
+
+
+
+
+ JSON_ARRAYAGG
+ aggregate a JSON array
+
+
+
+JSON_ARRAYAGG (
+[ value_expression ]
+[ ORDER BY sort_expression ]
+[ { NULL | ABSENT } ON NULL ]
+[ RETURNING data_type [ FORMAT JSON [ ENCODING UTF8 ] ] ]
+)
+
+
+
+
+
+ Description
+
+
+ The JSON_ARRAYAGG function aggregates the provided SQL
+ or JSON data into a JSON array.
+
+
+
+
+ Parameters
+
+
+
+
+ value_expression
+
+
+
+
+ The input clause that provides the input data to be aggregated as a JSON array.
+ The value_expression
+ can be a value or a query returning the values to be used as input in array construction.
+ You can provide multiple input values separated by commas.
+
+
+
+
+
+
+ ORDER BY
+
+
+
+ Sorts the input data to be aggregated as a JSON array.
+ For details on the exact syntax of the ORDER BY clause, see .
+
+
+
+
+
+
+ { NULL | ABSENT } ON NULL
+
+
+
+ Defines whether NULL values are allowed in the constructed array:
+
+
+
+ NULL — NULL values are allowed.
+
+
+
+
+ ABSENT (default) — NULL
+ values are omitted from the generated array.
+
+
+
+
+
+
+
+
+
+ RETURNING data_type [ FORMAT JSON [ ENCODING UTF8 ] ]
+
+
+
+ The output clause that specifies the return type of the constructed JSON array.
+ For details, see .
+
+
+
+
+
+
+
+
+ Notes
+ Alternatively, you can create JSON arrays by using
+ PostgreSQL-specific json_agg()/
+ jsonb_agg() functions.
+ See for details.
+
+
+
+
+ Examples
+
+ Construct an array of film titles sorted in alphabetical order:
+
+
+SELECT
+JSON_ARRAYAGG(
+ f.title
+ORDER BY f.title ASC) AS film_titles
+FROM films AS f;
+ film_titles
+----------------------------------------------------
+["48 Hrs", "Absence of Malice", "Bananas", "Becket", "Bed Knobs and Broomsticks", "Das Boot", "Storia di una donna", "Taxi Driver", "The African Queen", "The King and I", "There's a Girl in my Soup", "The Third Man", "Une Femme est une Femme", "Vertigo", "War and Peace", "West Side Story", "Yojimbo"]
+(1 row)
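+
+ A minimal sketch (using an inline VALUES list rather than the
+ films table) of aggregating a column while keeping
+ NULL values, which JSON_ARRAYAGG omits by default:
+
+-- produces a JSON array with three elements, including a null
+SELECT JSON_ARRAYAGG(x NULL ON NULL) AS vals
+FROM (VALUES (1), (NULL), (2)) AS t(x);
+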
+
+
+
+
+
+
+ Querying JSON
+
+
+ SQL/JSON query functions evaluate SQL/JSON path language expressions
+ against JSON values, producing values of SQL/JSON types, which are
+ converted to SQL types. All SQL/JSON query functions accept several
+ common clauses described in .
+ For details on the SQL/JSON path language,
+ see .
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ In some usage examples for these functions,
+ the following small table storing some JSON data will be used:
+
+CREATE TABLE my_films (
+ js text );
+
+INSERT INTO my_films VALUES (
+'{ "favorites" : [
+ { "kind" : "comedy", "films" : [
+ { "title" : "Bananas",
+ "director" : "Woody Allen"},
+ { "title" : "The Dinner Game",
+ "director" : "Francis Veber" } ] },
+ { "kind" : "horror", "films" : [
+ { "title" : "Psycho",
+ "director" : "Alfred Hitchcock" } ] },
+ { "kind" : "thriller", "films" : [
+ { "title" : "Vertigo",
+ "director" : "Alfred Hitchcock" } ] },
+ { "kind" : "drama", "films" : [
+ { "title" : "Yojimbo",
+ "director" : "Akira Kurosawa" } ] }
+ ] }');
+
+
+
+
+
+ JSON_EXISTS
+ check whether a JSON path expression can return any SQL/JSON items
+
+
+
+JSON_EXISTS (
+ json_api_common_syntax
+[ { TRUE | FALSE | UNKNOWN | ERROR } ON ERROR ]
+)
+
+
+
+
+ Description
+
+
+ The JSON_EXISTS function checks whether the provided
+ JSON path expression can return any SQL/JSON items.
+
+
+
+
+ Parameters
+
+
+
+ json_api_common_syntax
+
+
+
+
+ The input data to query, the JSON path expression defining the query, and an optional PASSING clause.
+ See for details.
+
+
+
+
+
+
+ { TRUE | FALSE | UNKNOWN | ERROR } ON ERROR
+
+
+
+ Defines the return value if an error occurs. The default value is FALSE.
+
+
+
+
+
+
+
+
+ Examples
+
+
+ Check whether the provided jsonb data contains a
+ key/value pair with the key1 key, and its value
+ contains an array with one or more elements greater than 2:
+
+
+SELECT JSON_EXISTS(jsonb '{"key1": [1,2,3]}', 'strict $.key1[*] ? (@ > 2)');
+ json_exists
+-------------
+ t
+(1 row)
+
+
+
+ Note the difference between strict and lax modes
+ if the required item does not exist:
+
+
+-- Strict mode with ERROR on ERROR clause
+SELECT JSON_EXISTS(jsonb '{"a": [1,2,3]}', 'strict $.a[5]' ERROR ON ERROR);
+ERROR: Invalid SQL/JSON subscript
+
+
+
+-- Lax mode
+SELECT JSON_EXISTS(jsonb '{"a": [1,2,3]}', 'lax $.a[5]' ERROR ON ERROR);
+ json_exists
+-------------
+ f
+(1 row)
+
+
+
+-- Strict mode using the default value for the ON ERROR clause
+SELECT JSON_EXISTS(jsonb '{"a": [1,2,3]}', 'strict $.a[5]');
+ json_exists
+-------------
+ f
+(1 row)
+
+
+
+
+
+
+
+ JSON_VALUE
+ extract a value from JSON data and convert
+ it to an SQL scalar
+
+
+
+JSON_VALUE (
+ json_api_common_syntax
+[ RETURNING data_type ]
+[ { ERROR | NULL | DEFAULT expression } ON EMPTY ]
+[ { ERROR | NULL | DEFAULT expression } ON ERROR ]
+)
+
+
+
+
+ Description
+
+
+ The JSON_VALUE function extracts a value from the provided
+ JSON data and converts it to an SQL scalar.
+ If the specified JSON path expression returns more than one
+ SQL/JSON item, an error occurs. To extract
+ an SQL/JSON array or object, use .
+
+
+
+
+ Parameters
+
+
+
+
+
+ json_api_common_syntax
+
+
+
+
+ The input data to query, the JSON path expression defining the query, and an optional PASSING clause.
+ For details, see .
+
+
+
+
+
+
+ RETURNING data_type
+
+
+
+ The output clause that specifies the data type of the returned value.
+ Out of the box, PostgreSQL
+ supports the following types: json, jsonb,
+ bytea, and character string types (text, char,
+ varchar, and nchar).
+ The extracted value must be a single SQL/JSON scalar item
+ and have a cast to the specified type. Otherwise, an error occurs.
+ By default, JSON_VALUE returns a string
+ of the text type.
+
+
+
+
+
+
+ { ERROR | NULL | DEFAULT expression } ON EMPTY
+
+
+
+ Defines the return value if no JSON value is found. The default is NULL.
+ If you use DEFAULT expression, the provided
+ expression is evaluated and cast to the type specified in the
+ RETURNING clause.
+
+
+
+
+
+
+ { ERROR | NULL | DEFAULT expression } ON ERROR
+
+
+
+ Defines the return value if an unhandled error occurs. The default is NULL.
+ If you use DEFAULT expression, the provided
+ expression is evaluated and cast to the type specified in the
+ RETURNING clause.
+
+
+
+
+
+
+
+
+ Examples
+
+
+ Extract an SQL/JSON value and return it as an SQL
+ scalar of the specified type. Note that
+ JSON_VALUE can only return a
+ single scalar, and the returned value must have a
+ cast to the specified return type:
+
+
+
+SELECT JSON_VALUE('"123.45"', '$' RETURNING float);
+ json_value
+------------
+ 123.45
+(1 row)
+
+SELECT JSON_VALUE('123.45', '$' RETURNING int ERROR ON ERROR);
+ json_value
+------------
+ 123
+(1 row)
+
+SELECT JSON_VALUE('"03:04 2015-02-01"', '$.datetime("HH24:MI YYYY-MM-DD")' RETURNING date);
+ json_value
+------------
+ 2015-02-01
+(1 row)
+
+SELECT JSON_VALUE('"123.45"', '$' RETURNING int ERROR ON ERROR);
+ERROR: invalid input syntax for integer: "123.45"
+
+SELECT JSON_VALUE(jsonb '[1]', 'strict $' ERROR ON ERROR);
+ERROR: SQL/JSON scalar required
+
+SELECT JSON_VALUE(jsonb '[1,2]', 'strict $[*]' ERROR ON ERROR);
+ERROR: more than one SQL/JSON item
+
+
+
+ If the path expression returns an array, an object, or
+ multiple SQL/JSON items, an error is returned, as specified
+ in the ON ERROR clause:
+
+
+SELECT JSON_VALUE(jsonb '[1]', 'strict $' ERROR ON ERROR);
+ERROR: SQL/JSON scalar required
+
+SELECT JSON_VALUE(jsonb '{"a": 1}', 'strict $' ERROR ON ERROR);
+ERROR: SQL/JSON scalar required
+
+SELECT JSON_VALUE(jsonb '[1,2]', 'strict $[*]' ERROR ON ERROR);
+ERROR: more than one SQL/JSON item
+
+SELECT JSON_VALUE(jsonb '[1,2]', 'strict $[*]' DEFAULT 1 ON ERROR);
+ json_value
+------------
+ 1
+(1 row)
+
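+
+ A sketch of the ON EMPTY clause, which supplies a fallback value when the
+ path expression returns no items (the key b here is
+ hypothetical):
+
+-- the path matches nothing, so the DEFAULT expression is returned
+-- as text instead of NULL
+SELECT JSON_VALUE(jsonb '{"a": 1}', '$.b' DEFAULT 'missing' ON EMPTY);
+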
+
+
+
+
+
+
+ JSON_QUERY
+ extract an SQL/JSON array or object from JSON data
+ and return a JSON string
+
+
+
+JSON_QUERY (
+ json_api_common_syntax
+[ RETURNING data_type [ FORMAT JSON [ ENCODING UTF8 ] ] ]
+[ { WITHOUT | WITH { CONDITIONAL | [UNCONDITIONAL] } } [ ARRAY ] WRAPPER ]
+[ { KEEP | OMIT } QUOTES [ ON SCALAR STRING ] ]
+[ { ERROR | NULL | EMPTY { ARRAY | OBJECT } } ON EMPTY ]
+[ { ERROR | NULL | EMPTY { ARRAY | OBJECT } } ON ERROR ]
+)
+
+
+
+
+ Description
+
+
+ The JSON_QUERY function extracts an SQL/JSON
+ array or object from JSON data. This function must return
+ a JSON string, so if the path expression returns a scalar or multiple SQL/JSON
+ items, you must wrap the result using the WITH WRAPPER clause
+ or enclose the path expression in square brackets for automatic wrapping.
+ To extract a single SQL/JSON value, you can use .
+
+
+
+
+ Parameters
+
+
+
+
+
+ json_api_common_syntax
+
+
+
+
+ The input data to query, the JSON path expression defining the query, and an optional PASSING clause.
+ For details, see .
+
+
+
+
+
+
+ RETURNING data_type [ FORMAT JSON [ ENCODING UTF8 ] ]
+
+
+
+ The output clause that specifies the data type of the returned value.
+ For details, see .
+
+
+
+
+
+
+ WITHOUT | WITH { CONDITIONAL | [UNCONDITIONAL] } [ ARRAY ] WRAPPER
+
+
+
+ Defines whether to wrap a returned result as an array.
+
+
+
+ WITH CONDITIONAL WRAPPER
+
+
+ Wrap the results if the path
+ expression returns anything other than a singleton SQL/JSON array or object.
+
+
+
+
+ WITH UNCONDITIONAL WRAPPER
+
+
+ Always wrap the result.
+ This is the default behavior if WITH WRAPPER is
+ specified.
+
+
+
+
+ WITHOUT WRAPPER
+
+
+ Do not wrap the result.
+ This is the default behavior if the WRAPPER
+ clause is omitted.
+
+
+
+
+
+ Optionally, you can add the ARRAY keyword for semantic clarity.
+
+
+ You cannot use this clause together with the ON EMPTY clause.
+
+
+
+
+
+
+
+ { KEEP | OMIT } QUOTES [ ON SCALAR STRING ]
+
+
+
+ Defines whether to keep or omit quotes if a scalar string is returned.
+ By default, scalar strings are returned with quotes. Using this
+ clause together with the WITH WRAPPER clause is not allowed.
+
+
+ Optionally, you can add the ON SCALAR STRING keywords for semantic clarity.
+
+
+
+
+
+
+ { ERROR | NULL | EMPTY { ARRAY | OBJECT } } ON EMPTY
+
+
+
+ Defines the return value if no JSON value is found. The default is NULL.
+ If you use EMPTY ARRAY or EMPTY OBJECT,
+ an empty JSON array [] or object {} is returned, respectively.
+ You cannot use this clause together with the WRAPPER clause.
+
+
+
+
+
+
+ { ERROR | NULL | EMPTY { ARRAY | OBJECT } } ON ERROR
+
+
+
+ Defines the return value if an unhandled error occurs. The default is NULL.
+ If you use EMPTY ARRAY or EMPTY OBJECT,
+ an empty JSON array [] or object {} are returned, respectively.
+
+
+
+
+
+
+
+
+ Examples
+
+
+ Extract all film genres listed in the my_films table:
+
+
+SELECT
+ JSON_QUERY(js, '$.favorites[*].kind' WITH WRAPPER ERROR ON ERROR)
+FROM my_films;
+ json_query
+------------
+ ["comedy", "horror", "thriller", "drama"]
+(1 row)
+
+
+
+ Note that the same query will result in an error if you omit the
+ WITH WRAPPER clause, as it returns multiple SQL/JSON items:
+
+
+SELECT
+ JSON_QUERY(js, '$.favorites[*].kind' ERROR ON ERROR)
+FROM my_films;
+ERROR: more than one SQL/JSON item
+
+
+
+ Compare the effect of different WRAPPER clauses:
+
+
+SELECT
+ js,
+ JSON_QUERY(js, 'lax $[*]') AS "without",
+ JSON_QUERY(js, 'lax $[*]' WITH WRAPPER) AS "with uncond",
+ JSON_QUERY(js, 'lax $[*]' WITH CONDITIONAL WRAPPER) AS "with cond"
+FROM
+ (VALUES (jsonb '[]'), ('[1]'), ('[[1,2,3]]'), ('[{"a": 1}]'), ('[1, null, "2"]')) foo(js);
+ js | without | with uncond | with cond
+----------------+-----------+----------------+----------------
+ [] | (null) | (null) | (null)
+ [1] | 1 | [1] | [1]
+ [[1, 2, 3]] | [1, 2, 3] | [[1, 2, 3]] | [1, 2, 3]
+ [{"a": 1}] | {"a": 1} | [{"a": 1}] | {"a": 1}
+ [1, null, "2"] | (null) | [1, null, "2"] | [1, null, "2"]
+(5 rows)
+
+
+Compare quote handling for scalar types with and without the OMIT QUOTES clause:
+
+
+SELECT JSON_QUERY(jsonb '"aaa"', '$' RETURNING text);
+ json_query
+------------
+ "aaa"
+(1 row)
+
+SELECT JSON_QUERY(jsonb '"aaa"', '$' RETURNING text OMIT QUOTES);
+ json_query
+------------
+ aaa
+(1 row)
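+
+ A sketch of the ON EMPTY clause returning an empty JSON array when the
+ path matches nothing (the key b here is hypothetical):
+
+-- lax member access on a missing key yields an empty result,
+-- so the ON EMPTY clause returns []
+SELECT JSON_QUERY(jsonb '{"a": 1}', 'lax $.b' EMPTY ARRAY ON EMPTY);
+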
+
+
+
+
+
+
+ JSON_TABLE
+ display JSON data as an SQL relation
+
+
+
+
+JSON_TABLE (
+ json_api_common_syntax [ AS path_name ]
+ COLUMNS ( json_table_column [, ...] )
+ [ PLAN ( json_table_plan ) |
+ PLAN DEFAULT ( { INNER | OUTER } [ , { CROSS | UNION } ]
+ | { CROSS | UNION } [ , { INNER | OUTER } ] )
+ ]
+)
+
+where json_table_column is:
+
+ name type [ PATH json_path_specification ]
+ [ { ERROR | NULL | DEFAULT expression } ON EMPTY ]
+ [ { ERROR | NULL | DEFAULT expression } ON ERROR ]
+ | name type FORMAT json_representation
+ [ PATH json_path_specification ]
+ [ { WITHOUT | WITH { CONDITIONAL | [UNCONDITIONAL] } }
+ [ ARRAY ] WRAPPER ]
+ [ { KEEP | OMIT } QUOTES [ ON SCALAR STRING ] ]
+ [ { ERROR | NULL | EMPTY { ARRAY | OBJECT } } ON EMPTY ]
+ [ { ERROR | NULL | EMPTY { ARRAY | OBJECT } } ON ERROR ]
+ | NESTED PATH json_path_specification [ AS path_name ]
+ COLUMNS ( json_table_column [, ...] )
+ | name FOR ORDINALITY
+
+json_table_plan is:
+
+ path_name [ { OUTER | INNER } json_table_plan_primary ]
+ | json_table_plan_primary { UNION json_table_plan_primary } [...]
+ | json_table_plan_primary { CROSS json_table_plan_primary } [...]
+
+json_table_plan_primary is:
+
+ path_name | ( json_table_plan )
+
+
+
+
+
+ Description
+
+
+ The JSON_TABLE function queries JSON data
+ and presents the results as a relational view, which can be accessed as a
+ regular SQL table. You can only use JSON_TABLE inside the
+ FROM clause of a SELECT statement.
+
+
+
+ Taking JSON data as input, JSON_TABLE uses
+ a path expression to extract a part of the provided data that
+ will be used as a row pattern for the
+ constructed view. Each SQL/JSON item at the top level of the row pattern serves
+ as the source for a separate row in the constructed relational view.
+
+
+
+ To split the row pattern into columns, JSON_TABLE
+ provides the COLUMNS clause that defines the
+ schema of the created view. For each column to be constructed,
+ this clause provides a separate path expression that evaluates
+ the row pattern, extracts a JSON item, and returns it as a
+ separate SQL value for the specified column. If the required value
+ is stored in a nested level of the row pattern, it can be extracted
+ using the NESTED PATH subclause. Joining the
+ columns returned by NESTED PATH can add multiple
+ new rows to the constructed view. Such rows are called
+ child rows, as opposed to the parent row
+ that generates them.
+
+
+
+ The rows produced by JSON_TABLE are laterally
+ joined to the row that generated them, so you do not have to explicitly join
+ the constructed view with the original table holding JSON
+ data. Optionally, you can specify how to join the columns returned
+ by NESTED PATH using the PLAN clause.
+
+
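+
+ As a minimal sketch using the my_films table defined
+ above, each element of the favorites array becomes one
+ row of the constructed view:
+
+SELECT jt.* FROM
+ my_films,
+ JSON_TABLE (js, '$.favorites[*]'
+ COLUMNS (
+ id FOR ORDINALITY,
+ kind text PATH '$.kind')) AS jt;
+-- one row per genre: (1, comedy), (2, horror), (3, thriller), (4, drama)
+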
+
+
+
+ Parameters
+
+
+
+
+ json_api_common_syntax
+
+
+
+
+ The input data to query, the JSON path expression defining the query,
+ and an optional PASSING clause, as described in
+ . The result of the input data
+ evaluation is called the row pattern. The row
+ pattern is used as the source for row values in the constructed view.
+
+
+
+
+
+
+ COLUMNS( { json_table_column } [, ...] )
+
+
+
+
+ The COLUMNS clause defining the schema of the
+ constructed view. In this clause, you must specify all the columns
+ to be filled with SQL/JSON items. Only scalar column types are supported.
+ The json_table_column
+ expression has the following syntax variants:
+
+
+
+
+
+ name type
+ [ PATH json_path_specification ]
+
+
+
+
+ Inserts a single SQL/JSON item into each row of
+ the specified column.
+
+
+ The provided PATH expression parses the
+ row pattern defined by json_api_common_syntax
+ and fills the column with produced SQL/JSON items, one for each row.
+ If the PATH expression is omitted,
+ JSON_TABLE uses the
+ $.name path expression,
+ where name is the provided column name.
+ In this case, the column name must correspond to one of the
+ keys within the SQL/JSON item produced by the row pattern.
+
+
+ Optionally, you can add ON EMPTY and
+ ON ERROR clauses to define how to handle
+ missing values or structural errors. These clauses have the same syntax
+ and semantics as in .
+
+
+
+
+
+
+ name type FORMAT json_representation
+ [ PATH json_path_specification ]
+
+
+
+
+ Generates a column and inserts a composite SQL/JSON
+ item into each row of this column.
+
+
+ The provided PATH expression parses the
+ row pattern defined by json_api_common_syntax
+ and fills the column with produced SQL/JSON items, one for each row.
+ If the PATH expression is omitted,
+ JSON_TABLE uses the
+ $.name path expression,
+ where name is the provided column name.
+ In this case, the column name must correspond to one of the
+ keys within the SQL/JSON item produced by the row pattern.
+
+
+ Optionally, you can add WRAPPER, QUOTES,
+ ON EMPTY and ON ERROR clauses
+ to define additional settings for the returned SQL/JSON items.
+ These clauses have the same syntax and semantics as
+ in .
+
+
+
+
+
+
+ NESTED PATH json_path_specification [ AS json_path_name ]
+ COLUMNS ( json_table_column [, ...] )
+
+
+
+
+ Extracts SQL/JSON items from nested levels of the row pattern,
+ generates one or more columns as defined by the COLUMNS
+ subclause, and inserts the extracted SQL/JSON items into each row of these columns.
+ The json_table_column expression in the
+ COLUMNS subclause uses the same syntax as in the
+ parent COLUMNS clause.
+
+
+
+ The NESTED PATH syntax is recursive,
+ so you can go down multiple nested levels by specifying several
+ NESTED PATH subclauses within each other.
+ This allows you to unnest the hierarchy of JSON objects and arrays
+ in a single function invocation rather than chaining several
+ JSON_TABLE expressions in an SQL statement.
+
+
+
+ You can use the PLAN clause to define how
+ to join the columns returned by NESTED PATH clauses.
+
+
+
+
+
+
+ name FOR ORDINALITY
+
+
+
+
+ Adds an ordinality column that provides sequential row numbering.
+ You can have only one ordinality column per table. Row numbering
+ is 1-based. For child rows that result from the NESTED PATH
+ clauses, the parent row number is repeated.
+
+
+
+
+
+
+
+
+
+
+ json_path_name
+
+
+
+
+ The optional json_path_name serves as an
+ identifier of the provided json_path_specification.
+ The path name must be unique and cannot coincide with column names.
+ When using the PLAN clause, you must specify the names
+ for all the paths, including the row pattern. Each path name can appear in
+ the PLAN clause only once.
+
+
+
+
+
+
+ PLAN ( json_table_plan )
+
+
+
+
+ Defines how to join the data returned by NESTED PATH
+ clauses to the constructed view.
+
+
+ Each NESTED PATH clause can generate one or more
+ columns, which are considered to be siblings
+ to each other. In relation to the columns returned directly from the row
+ expression or by the NESTED PATH clause of a
+ higher level, these columns are child columns.
+ Sibling columns are always joined first. Once they are processed,
+ the resulting rows are joined to the parent row.
+
+
+ To join columns with a parent/child relationship, you can use:
+
+
+
+
+ INNER
+
+
+
+
+ Use INNER JOIN, so that the parent row
+ is omitted from the output if it does not have any child rows
+ after joining the data returned by NESTED PATH.
+
+
+
+
+
+
+ OUTER
+
+
+
+
+ Use LEFT OUTER JOIN, so that the parent row
+ is always included in the output even if it does not have any child rows
+ after joining the data returned by NESTED PATH, with NULL values
+ inserted into the child columns if the corresponding
+ values are missing.
+
+
+ This is the default option for joining columns with a parent/child relationship.
+
+
+
+
+
+
+ To join sibling columns, you can use:
+
+
+
+
+
+ UNION
+
+
+
+
+ Use FULL OUTER JOIN ON FALSE, so that both parent and child
+ rows are included in the output, with NULL values inserted
+ into both child and parent columns for all missing values.
+
+
+ This is the default option for joining sibling columns.
+
+
+
+
+
+
+ CROSS
+
+
+
+
+ Use CROSS JOIN, so that the output includes
+ a row for every possible combination of rows from the left-hand
+ and the right-hand columns.
+
+
+
+
+
+
+
+
+
+
+
+ PLAN DEFAULT ( option [, ... ] )
+
+
+
+ Overrides the default joining plans. The INNER and
+ OUTER options define the joining plan for parent/child
+ columns, while UNION and CROSS
+ affect the sibling columns. You can override the default plans for all columns at once.
+ Even though the path names are not included in the PLAN DEFAULT
+ clause, they must be provided for all the paths to conform to
+ the SQL/JSON standard.
+
+
+
+
+
+
+
+ Examples
+
+
+ Query the my_films table holding
+ some JSON data about the films and create a view that
+ distributes the film genre, title, and director between separate columns:
+
+SELECT jt.* FROM
+ my_films,
+ JSON_TABLE ( js, '$.favorites[*]' COLUMNS (
+ id FOR ORDINALITY,
+ kind text PATH '$.kind',
+ NESTED PATH '$.films[*]' COLUMNS (
+ title text PATH '$.title',
+ director text PATH '$.director'))) AS jt;
+----+----------+------------------+-------------------
+ id | kind | title | director
+----+----------+------------------+-------------------
+ 1 | comedy | Bananas | Woody Allen
+ 1 | comedy | The Dinner Game | Francis Veber
+ 2 | horror | Psycho | Alfred Hitchcock
+ 3 | thriller | Vertigo | Hitchcock
+ 4 | drama | Yojimbo | Akira Kurosawa
+ (5 rows)
+
+
+
+
+ Find a director that has done films in two different genres:
+
+SELECT
+ director1 AS director, title1, kind1, title2, kind2
+FROM
+ my_films,
+ JSON_TABLE ( js, '$.favorites' AS favs COLUMNS (
+ NESTED PATH '$[*]' AS films1 COLUMNS (
+ kind1 text PATH '$.kind',
+ NESTED PATH '$.films[*]' AS film1 COLUMNS (
+ title1 text PATH '$.title',
+ director1 text PATH '$.director')
+ ),
+ NESTED PATH '$[*]' AS films2 COLUMNS (
+ kind2 text PATH '$.kind',
+ NESTED PATH '$.films[*]' AS film2 COLUMNS (
+ title2 text PATH '$.title',
+ director2 text PATH '$.director'
+ )
+ )
+ )
+ PLAN (favs OUTER ((films1 INNER film1) CROSS (films2 INNER film2)))
+ ) AS jt
+ WHERE kind1 > kind2 AND director1 = director2;
+
+
+
+
+
+
+
+ IS JSON
+ test whether the provided value is valid JSON data
+
+
+
+
+expression
+ IS [ NOT ] JSON
+ [ { VALUE | SCALAR | ARRAY | OBJECT } ]
+ [ { WITH | WITHOUT } UNIQUE [ KEYS ] ]
+
+
+
+
+ Description
+
+
+ The IS JSON predicate tests whether the provided value is valid
+ JSON data. If you provide a specific JSON data type as a parameter,
+ you can check whether the value belongs to this type.
+ You can also use this predicate in the IS NOT JSON form.
+ The return values are:
+
+
+
+ t if the value satisfies the specified condition.
+
+
+
+
+ f if the value does not satisfy the specified condition.
+
+
+
+
+
+
+
+ Parameters
+
+
+
+
+
+ expression
+
+
+
+
+ The expression that supplies the value to test. You can provide values of
+ the json, jsonb, bytea, or character string types.
+
+
+
+
+
+
+ VALUE | SCALAR | ARRAY | OBJECT
+
+
+
+
+ Specifies the JSON data type to test for:
+
+
+
+ VALUE (default) — any JSON type.
+
+
+
+
+ SCALAR — JSON number, string, or boolean.
+
+
+
+
+ ARRAY — JSON array.
+
+
+
+
+ OBJECT — JSON object.
+
+
+
+
+
+
+
+
+
+ { WITH | WITHOUT } UNIQUE [ KEYS ]
+
+
+ Defines whether duplicate keys are allowed:
+
+
+
+ WITHOUT (default) — the
+ JSON object can contain duplicate keys.
+
+
+
+
+ WITH — duplicate keys are not allowed.
+ If the input data contains duplicate keys, it is considered to be invalid JSON.
+
+
+
+ Optionally, you can add the KEYS keyword for semantic clarity.
+
+
+
+
+
+
+
+
+ Examples
+
+
+ Compare the result returned by the IS JSON
+ predicate for different data types:
+
+
+SELECT
+ js,
+ js IS JSON "is json",
+ js IS NOT JSON "is not json",
+ js IS JSON SCALAR "is scalar",
+ js IS JSON OBJECT "is object",
+ js IS JSON ARRAY "is array"
+FROM
+ (VALUES ('123'), ('"abc"'), ('{"a": "b"}'), ('[1,2]'), ('abc')) foo(js);
+
+ js | is json | is not json | is scalar | is object | is array
+------------+---------+-------------+-----------+-----------+-------------
+ 123 | t | f | t | f | f
+ "abc" | t | f | t | f | f
+ {"a": "b"} | t | f | f | t | f
+ [1,2] | t | f | f | f | t
+ abc | f | t | f | f | f
+(5 rows)
+
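+
+ As an illustration of the WITH UNIQUE KEYS clause,
+ an object that contains duplicate keys is still valid JSON, but it
+ fails the uniqueness check:
+
+SELECT '{"a": 1, "b": 2, "a": 3}' IS JSON "is json",
+ '{"a": 1, "b": 2, "a": 3}' IS JSON WITH UNIQUE KEYS "unique keys";
+
+ is json | unique keys
+---------+-------------
+ t | f
+(1 row)
+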
+
+
+
+
+
+ SQL/JSON Common Clauses
+
+
+ SQL/JSON Input Clause
+
+
+
+
+ context_item, path_expression
+[ PASSING { value AS varname } [, ...]]
+
+
+
+ The input clause specifies the JSON data to query and
+ the exact query path to be passed to SQL/JSON query functions:
+
+
+
+
+ The context_item is the JSON data to query.
+
+
+
+
+ The path_expression is an SQL/JSON path
+ expression that specifies the items to be retrieved from the JSON
+ data. For details on path expression syntax, see
+ .
+
+
+
+
+ The optional PASSING clause provides the values for
+ the named variables used in the SQL/JSON path expression.
+
+
+
+
+ The input clause is common for all SQL/JSON query functions.
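+
+ For example, a named variable supplied through the PASSING
+ clause can be referenced in the path expression as $varname.
+ The following sketch uses the JSON_VALUE query function with a
+ variable named i (a name chosen here for illustration) as the
+ array subscript:
+
+SELECT JSON_VALUE('{"arr": [10, 20, 30]}', '$.arr[$i]' PASSING 2 AS i);
+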
+
+
+
+
+
+
+
+
+ SQL/JSON Output Clause
+
+
+
+
+ RETURNING data_type [ FORMAT JSON [ ENCODING UTF8 ] ]
+
+
+
+ The output clause specifies the return type of the generated
+ JSON object. Out of the box, PostgreSQL
+ supports the following types: json, jsonb,
+ bytea, and character string types (text, char,
+ varchar, and nchar).
+ To use other types, you must create a cast from json to that type.
+ By default, the json type is returned.
+
+
+ The optional FORMAT clause is provided to conform to the SQL/JSON standard.
+
+
+ The output clause is common for both constructor and query SQL/JSON functions.
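+
+ For example, a constructor function such as JSON_OBJECT
+ can be instructed to return jsonb instead of the
+ default json type:
+
+SELECT JSON_OBJECT('code' VALUE 123 RETURNING jsonb);
+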
+
+
+
+
+
+
+
+
+
+
+
+ SQL/JSON Path Expressions
+
+
+ SQL/JSON path expressions specify the items to be retrieved
+ from the JSON data, which are passed to SQL/JSON query functions
+ as one of the parameters. Path expressions belong to a special
+ jsonpath type described in .
+
+
+
+ The SQL/JSON query functions pass the provided path expression to
+ the path engine for evaluation.
+ The path expression is evaluated from left to right.
+ You can use parentheses to change the order of operations.
+ If the evaluation is successful, an SQL/JSON sequence is produced.
+ The evaluation result is returned to the SQL/JSON query function
+ that completes the specified computation. If the query result
+ must be JSON text, you have to use the WITH WRAPPER clause
+ or enclose the path expression in square brackets to ensure
+ the evaluation result is an array.
+
+
+
+ A typical path expression has the following structure:
+
+
+
+'[strict | lax] path_specification [? (filter_expression) ...]'
+
+
+
+ where:
+
+
+
+
+
+ The optional strict or lax mode
+ defines how to handle structural errors, as explained in
+ .
+
+
+
+
+ The path_specification defines the parts
+ of JSON data to be retrieved by the SQL/JSON query functions.
+ To learn the syntax of the path specification, see
+ .
+
+
+
+
+ The optional filter_expression can include
+ one or more filtering conditions to apply to the result of the
+ path evaluation. For details, see .
+
+
+
+
+
+ Strict and Lax Modes
+
+ When you query JSON data, the path expression may not match the
+ actual JSON data structure. An attempt to access a non-existent
+ member of an object or element of an array results in a
+ structural error. SQL/JSON path expressions have two modes
+ of handling structural errors:
+
+
+
+
+ lax (default) — the path engine implicitly adapts
+ the queried data to the specified path.
+ Any remaining structural errors are suppressed and converted
+ to empty SQL/JSON sequences.
+
+
+
+
+ strict — if a structural error occurs,
+ an error is raised.
+
+
+
+
+
+ The lax mode facilitates matching of a JSON document structure and path
+ expression if the JSON data does not conform to the expected schema.
+ If an operand does not match the requirements of a particular
+ operation, it can be automatically wrapped as an SQL/JSON array or unwrapped
+ by converting its elements into an SQL/JSON sequence before performing
+ this operation. In addition, comparison operators automatically unwrap their
+ operands in the lax mode, so you can compare SQL/JSON arrays out of the box.
+ Arrays of size 1 are interchangeable with a singleton.
+
+
+
+
+ In the lax mode, implicit unwrapping only goes one level down.
+ If the arrays are nested, only the outermost array is unwrapped,
+ while all the inner arrays remain unchanged.
+
+
+
+
+ If you prefer using the strict mode, the specified path must exactly match
+ the structure of the queried JSON document. You can still get some
+ error-handling flexibility by using the JSON_EXISTS
+ predicate, which checks whether the element to be accessed is
+ available. This allows you to convert structural errors to empty SQL/JSON
+ sequences on a selective basis, achieving lax semantics in the strict
+ mode as required.
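+
+ The difference between the two modes can be illustrated with the
+ PostgreSQL-specific @* operator described later in this chapter:
+
+-- lax mode suppresses the structural error, producing an empty sequence:
+SELECT '{"a": 1}'::jsonb @* 'lax $.b';
+
+-- strict mode raises an error for the missing key instead:
+SELECT '{"a": 1}'::jsonb @* 'strict $.b';
+
+-- lax mode also unwraps arrays automatically, so the filter is applied
+-- to each element of the array:
+SELECT '{"a": [1, 2, 3]}'::jsonb @* 'lax $.a ? (@ > 1)';
+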
+
+
+
+ In both strict and lax modes, the actual interpretation of the returned value
+ depends on the ON ERROR or ON EMPTY
+ clauses of the SQL/JSON query functions, as explained in .
+
+
+
+
+ Path Specification
+
+
+ A path specification defines the exact path to access one or more items
+ within JSON data using the SQL/JSON path language.
+
+
+
+ The general structure of a path specification is as follows:
+
+
+
+ Each path specification starts with a $ sign,
+ which denotes the JSON text to be queried (the context item).
+
+
+
+
+ The context item can be followed
+ by one or more accessor operators.
+ Going down the JSON structure level by level,
+ these operators return an SQL/JSON sequence if path evaluation is successful.
+
+
+
+
+ Path evaluation results can be further processed by one or more jsonpath
+ operators and methods listed in .
+ Each method must be preceded by a dot, while arithmetic and boolean
+ operators are separated from the operands by spaces.
+
+
+
+
+ If the path specification is enclosed in square brackets [],
+ path evaluation result is automatically wrapped into an array.
+ This is a PostgreSQL extension of the SQL/JSON standard.
+
+
+
+
+
+
+
+
+ Consider the following path specification examples:
+
+'$.floor'
+'($+1)'
+'$+1'
+'($.floor[*].apt[*].area > 10)'
+
+
+
+ Writing the path as an expression is also a valid path specification:
+
+'$' || '.' || 'a'
+
+
+
+
+ If you use any named variables in the path specification, you must define
+ their values in the PASSING clause of the SQL/JSON query functions.
+
+
+
+
+
+ Filter Clause
+
+
+ The optional filter clause is similar to the WHERE
+ clause in SQL. The filter clause can provide one or more
+ filter expressions, each including one or more
+ filtering conditions to apply to the result of the path evaluation.
+ Filter expressions are applied from left to right and can be nested.
+ The @ variable denotes the current item returned
+ by the path evaluation to which the filtering condition should be applied.
+
+
+
+ Filter expressions must be enclosed in parentheses and preceded by
+ a question mark ?. Functions and operators that can be used in
+ filter expressions are listed in .
+ The result of the filter expression may be true, false, or unknown.
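+
+ For example, the following query uses a filter expression to select only
+ the array elements within a given range, shown here with the
+ PostgreSQL-specific @* operator described later in this chapter:
+
+SELECT '{"temp": [1, 15, 27]}'::jsonb @* '$.temp[*] ? (@ > 2 && @ < 20)';
+
+ ?column?
+----------
+ 15
+(1 row)
+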
+
+
+
+
+
+
+
+ SQL/JSON Path Operators and Methods
+
+
+ jsonpath Operators and Methods
+
+
+
+ Operator/Method
+ Description
+ Example JSON
+ Example Query
+ Result
+
+
+
+
+ + (unary)
+ Plus operator that iterates over the SQL/JSON sequence
+ {"x": [2.85, -14.7, -9.4]}
+ + $.x.floor()
+ 2, -15, -10
+
+
+ - (unary)
+ Minus operator that iterates over the SQL/JSON sequence
+ {"x": [2.85, -14.7, -9.4]}
+ - $.x.floor()
+ -2, 15, 10
+
+
+ + (binary)
+ Addition
+ [2]
+ 2 + $[0]
+ 4
+
+
+ - (binary)
+ Subtraction
+ [2]
+ 4 - $[0]
+ 2
+
+
+ *
+ Multiplication
+ [4]
+ 2 * $[0]
+ 8
+
+
+ /
+ Division
+ [8]
+ $[0] / 2
+ 4
+
+
+ %
+ Modulus
+ [32]
+ $[0] % 10
+ 2
+
+
+ type()
+ Type of the SQL/JSON item
+ [1, "2", {}]
+ $[*].type()
+ "number", "string", "object"
+
+
+ size()
+ Size of the SQL/JSON item
+ {"m": [11, 15]}
+ $.m.size()
+ 2
+
+
+ double()
+ Approximate numeric value converted from a string
+ {"len": "1.9"}
+ $.len.double() * 2
+ 3.8
+
+
+ ceiling()
+ Nearest integer greater than or equal to the SQL/JSON number
+ {"h": 1.3}
+ $.h.ceiling()
+ 2
+
+
+ floor()
+ Nearest integer less than or equal to the SQL/JSON number
+ {"h": 1.3}
+ $.h.floor()
+ 1
+
+
+ abs()
+ Absolute value of the SQL/JSON number
+ {"z": -0.3}
+ $.z.abs()
+ 0.3
+
+
+ datetime()
+ Datetime value converted from a string
+ ["2015-8-1", "2015-08-12"]
+ $[*] ? (@.datetime() < "2015-08-2".datetime())
+ 2015-8-1
+
+
+ datetime(template)
+ Datetime value converted from a string with a specified template
+ ["12:30", "18:40"]
+ $[*].datetime("HH24:MI")
+ "12:30:00", "18:40:00"
+
+
+ keyvalue()
+ Array of objects containing two members ("key" and "value" of the SQL/JSON item)
+ {"x": "20", "y": 32}
+ $.keyvalue()
+ {"key": "x", "value": "20"}, {"key": "y", "value": 32}
+
+
+
+
+ Extended jsonpath Methods
+
+
+
+ Method
+ Description
+ Example JSON
+ Example Query
+ Result
+
+
+
+
+ min()
+ Minimum value in the json array
+ [1, 2, 0, 3, 1]
+ $.min()
+ 0
+
+
+ max()
+ Maximum value in the json array
+ [1, 2, 0, 3, 1]
+ $.max()
+ 3
+
+
+ map()
+ Calculate an expression by applying a given function
+ to each element of the json array
+
+ [1, 2, 0]
+ $.map(@ * 2)
+ [2, 4, 0]
+
+
+ reduce()
+ Calculate an aggregate expression by combining elements
+ of the json array using a given function
+ ($1 references the current result, $2 references the current element)
+
+ [3, 5, 9]
+ $.reduce($1 + $2)
+ 17
+
+
+ fold()
+ Calculate an aggregate expression by combining elements
+ of the json array using a given function
+ with the specified initial value
+ ($1 references the current result, $2 references the current element)
+
+ [2, 3, 4]
+ $.fold($1 * $2, 1)
+ 24
+
+
+ foldl()
+ Calculate an aggregate expression by combining elements
+ of the json array using a given function from left to right
+ with the specified initial value
+ ($1 references the current result, $2 references the current element)
+
+ [1, 2, 3]
+ $.foldl([$1, $2], [])
+ [[[[], 1], 2], 3]
+
+
+ foldr()
+ Calculate an aggregate expression by combining elements
+ of the json array using a given function from right to left
+ with the specified initial value
+ ($1 references the current result, $2 references the current element)
+
+ [1, 2, 3]
+ $.foldr([$2, $1], [])
+ [[[[], 3], 2], 1]
+
+
+
+
+
+
+
+
+
+ PostgreSQL-specific JSON Functions and Operators
+
+
+ JSON
+ functions and operators
+
+
+
+ shows the operators that
+ are available for use with JSON data types (see ).
+
+
+
+ json and jsonb Operators
+
+
+
+ Operator
+ Right Operand Type
+ Return type
+ Description
+ Example
+ Example Result
+
+
+
+
+ ->
+ int
+ json or jsonb
+ Get JSON array element (indexed from zero, negative
+ integers count from the end)
+ '[{"a":"foo"},{"b":"bar"},{"c":"baz"}]'::json->2
+ {"c":"baz"}
+
+
+ ->
+ text
+ json or jsonb
+ Get JSON object field by key
+ '{"a": {"b":"foo"}}'::json->'a'
+ {"b":"foo"}
+
+
+ ->>
+ int
+ text
+ Get JSON array element as text
+ '[1,2,3]'::json->>2
+ 3
+
+
+ ->>
+ text
+ text
+ Get JSON object field as text
+ '{"a":1,"b":2}'::json->>'b'
+ 2
+
+
+ #>
+ text[]
+ json or jsonb
+ Get JSON object at the specified path
+ '{"a": {"b":{"c": "foo"}}}'::json#>'{a,b}'
+ {"c": "foo"}
+
+
+ #>>
+ text[]
+ text
+ Get JSON object at the specified path as text
+ '{"a":[1,2,3],"b":[4,5,6]}'::json#>>'{a,2}'
+ 3
+
+
+ @*
+ jsonpath
+ setof json or setof jsonb
+ Get all JSON items returned by JSON path for the specified JSON value
+ '{"a":[1,2,3,4,5]}'::json @* '$.a[*] ? (@ > 2)'
+
+3
+4
+5
+
+
+
+ @#
+ jsonpath
+ json or jsonb
+ Get all JSON items returned by JSON path for the specified JSON value. If there is more than one item, they will be wrapped into an array.
+ '{"a":[1,2,3,4,5]}'::json @# '$.a[*] ? (@ > 2)'
+ [3, 4, 5]
+
+
+ @?
+ jsonpath
+ boolean
+ Check whether JSON path returns any item for the specified JSON value
+ '{"a":[1,2,3,4,5]}'::json @? '$.a[*] ? (@ > 2)'
+ true
+
+
+ @~
+ jsonpath
+ boolean
+ Get JSON path predicate result for the specified JSON value
+ '{"a":[1,2,3,4,5]}'::json @~ '$.a[*] > 2'
+ true
+
+
+
+
+
+
+
+ There are parallel variants of these operators for both the
+ json and jsonb types.
+ The field/element/path extraction operators
+ return the same type as their left-hand input (either json
+ or jsonb), except for those specified as
+ returning text, which coerce the value to text.
+ The field/element/path extraction operators return NULL, rather than
+ failing, if the JSON input does not have the right structure to match
+ the request; for example if no such element exists. The
+ field/element/path extraction operators that accept integer JSON
+ array subscripts all support negative subscripting from the end of
+ arrays.
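+
+ For example, negative subscripts address elements from the end of the array:
+
+SELECT '[1,2,3]'::json -> -1 AS "last", '[1,2,3]'::json ->> -2 AS "second to last";
+
+ last | second to last
+------+----------------
+ 3 | 2
+(1 row)
+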
+
+
+
+ The standard comparison operators shown in are available for
+ jsonb, but not for json. They follow the
+ ordering rules for B-tree operations outlined at .
+
+
+ Some further operators also exist only for jsonb, as shown
+ in .
+ Many of these operators can be indexed by
+ jsonb operator classes. For a full description of
+ jsonb containment and existence semantics, see .
+ describes how these operators can be used to effectively index
+ jsonb data.
+
+
+ Additional jsonb Operators
+
+
+
+ Operator
+ Right Operand Type
+ Description
+ Example
+
+
+
+
+ @>
+ jsonb
+ Does the left JSON value contain the right JSON
+ path/value entries at the top level?
+ '{"a":1, "b":2}'::jsonb @> '{"b":2}'::jsonb
+
+
+ <@
+ jsonb
+ Are the left JSON path/value entries contained at the top level within
+ the right JSON value?
+ '{"b":2}'::jsonb <@ '{"a":1, "b":2}'::jsonb
+
+
+ ?
+ text
+ Does the string exist as a top-level
+ key within the JSON value?
+ '{"a":1, "b":2}'::jsonb ? 'b'
+
+
+ ?|
+ text[]
+ Do any of these array strings
+ exist as top-level keys?
+ '{"a":1, "b":2, "c":3}'::jsonb ?| array['b', 'c']
+
+
+ ?&
+ text[]
+ Do all of these array strings exist
+ as top-level keys?
+ '["a", "b"]'::jsonb ?& array['a', 'b']
+
+
+ ||
+ jsonb
+ Concatenate two jsonb values into a new jsonb value
+ '["a", "b"]'::jsonb || '["c", "d"]'::jsonb
+
+
+ -
+ text
+ Delete key/value pair or string
+ element from left operand. Key/value pairs are matched based
+ on their key value.
+ '{"a": "b"}'::jsonb - 'a'
+
+
+ -
+ text[]
+ Delete multiple key/value pairs or string
+ elements from left operand. Key/value pairs are matched based
+ on their key value.
+ '{"a": "b", "c": "d"}'::jsonb - '{a,c}'::text[]
+
+
+ -
+ integer
+ Delete the array element with specified index (negative
+ integers count from the end). Throws an error if the top-level
+ container is not an array.
+ '["a", "b"]'::jsonb - 1
+
+
+ #-
+ text[]
+ Delete the field or element with specified path (for
+ JSON arrays, negative integers count from the end)
+ '["a", {"b":1}]'::jsonb #- '{1,b}'
+
+
+
+
+
+
+
+ The || operator concatenates the elements at the top level of
+ each of its operands. It does not operate recursively. For example, if
+ both operands are objects with a common key field name, the value of the
+ field in the result will just be the value from the right-hand operand.
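+
+ For example, a nested object on the left-hand side is replaced wholesale
+ rather than merged:
+
+SELECT '{"a": {"x": 1}, "b": 1}'::jsonb || '{"a": {"y": 2}}'::jsonb;
+
+ ?column?
+--------------------------
+ {"a": {"y": 2}, "b": 1}
+(1 row)
+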
+
+
+
+
+ shows the functions that are
+ available for creating json and jsonb values.
+ (There are no jsonb equivalents of the row_to_json
+ and array_to_json functions. However, the to_jsonb
+ function supplies much the same functionality as these functions would.)
+
+
+
+ to_json
+
+
+ array_to_json
+
+
+ row_to_json
+
+
+ json_build_array
+
+
+ json_build_object
+
+
+ json_object
+
+
+ to_jsonb
+
+
+ jsonb_build_array
+
+
+ jsonb_build_object
+
+
+ jsonb_object
+
+
+
+ JSON Creation Functions
+
+
+
+ Function
+ Description
+ Example
+ Example Result
+
+
+
+
+ to_json(anyelement)
+ to_jsonb(anyelement)
+
+
+ Returns the value as json or jsonb.
+ Arrays and composites are converted
+ (recursively) to arrays and objects; otherwise, if there is a cast
+ from the type to json, the cast function will be used to
+ perform the conversion; otherwise, a scalar value is produced.
+ For any scalar type other than a number, a Boolean, or a null value,
+ the text representation will be used, in such a fashion that it is a
+ valid json or jsonb value.
+
+ to_json('Fred said "Hi."'::text)
+ "Fred said \"Hi.\""
+
+
+
+ array_to_json(anyarray [, pretty_bool])
+
+
+ Returns the array as a JSON array. A PostgreSQL multidimensional array
+ becomes a JSON array of arrays. Line feeds will be added between
+ dimension-1 elements if pretty_bool is true.
+
+ array_to_json('{{1,5},{99,100}}'::int[])
+ [[1,5],[99,100]]
+
+
+
+ row_to_json(record [, pretty_bool])
+
+
+ Returns the row as a JSON object. Line feeds will be added between
+ level-1 elements if pretty_bool is true.
+
+ row_to_json(row(1,'foo'))
+ {"f1":1,"f2":"foo"}
+
+
+ json_build_array(VARIADIC "any")
+ jsonb_build_array(VARIADIC "any")
+
+
+ Builds a possibly-heterogeneously-typed JSON array out of a variadic
+ argument list.
+
+ json_build_array(1,2,'3',4,5)
+ [1, 2, "3", 4, 5]
+
+
+ json_build_object(VARIADIC "any")
+ jsonb_build_object(VARIADIC "any")
+
+
+ Builds a JSON object out of a variadic argument list. By
+ convention, the argument list consists of alternating
+ keys and values.
+
+ json_build_object('foo',1,'bar',2)
+ {"foo": 1, "bar": 2}
+
+
+ json_object(text[])
+ jsonb_object(text[])
+
+
+ Builds a JSON object out of a text array. The array must have either
+ exactly one dimension with an even number of members, in which case
+ they are taken as alternating key/value pairs, or two dimensions
+ such that each inner array has exactly two elements, which
+ are taken as a key/value pair.
+
+ json_object('{a, 1, b, "def", c, 3.5}')
+ json_object('{{a, 1},{b, "def"},{c, 3.5}}')
+ {"a": "1", "b": "def", "c": "3.5"}
+
+
+ json_object(keys text[], values text[])
+ jsonb_object(keys text[], values text[])
+
+
+ This form of json_object takes keys and values pairwise from two separate
+ arrays. In all other respects it is identical to the one-argument form.
+
+ json_object('{a, b}', '{1,2}')
+ {"a": "1", "b": "2"}
+
+
+
+
+
+
+
+ array_to_json and row_to_json have the same
+ behavior as to_json except for offering a pretty-printing
+ option. The behavior described for to_json likewise applies
+ to each individual value converted by the other JSON creation functions.
+
+
+
+
+
+ The extension has a cast
+ from hstore to json, so that
+ hstore values converted via the JSON creation functions
+ will be represented as JSON objects, not as primitive string values.
+
+
+
+
+ shows the functions that
+ are available for processing json and jsonb values.
+
+
+
+ json_array_length
+
+
+ jsonb_array_length
+
+
+ json_each
+
+
+ jsonb_each
+
+
+ json_each_text
+
+
+ jsonb_each_text
+
+
+ json_extract_path
+
+
+ jsonb_extract_path
+
+
+ json_extract_path_text
+
+
+ jsonb_extract_path_text
+
+
+ json_object_keys
+
+
+ jsonb_object_keys
+
+
+ json_populate_record
+
+
+ jsonb_populate_record
+
+
+ json_populate_recordset
+
+
+ jsonb_populate_recordset
+
+
+ json_array_elements
+
+
+ jsonb_array_elements
+
+
+ json_array_elements_text
+
+
+ jsonb_array_elements_text
+
+
+ json_typeof
+
+
+ jsonb_typeof
+
+
+ json_to_record
+
+
+ jsonb_to_record
+
+
+ json_to_recordset
+
+
+ jsonb_to_recordset
+
+
+ json_strip_nulls
+
+
+ jsonb_strip_nulls
+
+
+ jsonb_set
+
+
+ jsonb_insert
+
+
+ jsonb_pretty
+
+
+
+ JSON Processing Functions
+
+
+
+ Function
+ Return Type
+ Description
+ Example
+ Example Result
+
+
+
+
+ json_array_length(json)
+ jsonb_array_length(jsonb)
+
+ int
+
+ Returns the number of elements in the outermost JSON array.
+
+ json_array_length('[1,2,3,{"f1":1,"f2":[5,6]},4]')
+ 5
+
+
+ json_each(json)
+ jsonb_each(jsonb)
+
+ setof key text, value json
+ setof key text, value jsonb
+
+
+ Expands the outermost JSON object into a set of key/value pairs.
+
+ select * from json_each('{"a":"foo", "b":"bar"}')
+
+
+ key | value
+-----+-------
+ a | "foo"
+ b | "bar"
+
+
+
+
+ json_each_text(json)
+ jsonb_each_text(jsonb)
+
+ setof key text, value text
+
+ Expands the outermost JSON object into a set of key/value pairs. The
+ returned values will be of type text.
+
+ select * from json_each_text('{"a":"foo", "b":"bar"}')
+
+
+ key | value
+-----+-------
+ a | foo
+ b | bar
+
+
+
+
+ json_extract_path(from_json json, VARIADIC path_elems text[])
+ jsonb_extract_path(from_json jsonb, VARIADIC path_elems text[])
+
+ json or jsonb
+
+
+ Returns JSON value pointed to by path_elems
+ (equivalent to #> operator).
+
+ json_extract_path('{"f2":{"f3":1},"f4":{"f5":99,"f6":"foo"}}','f4')
+ {"f5":99,"f6":"foo"}
+
+
+ json_extract_path_text(from_json json, VARIADIC path_elems text[])
+ jsonb_extract_path_text(from_json jsonb, VARIADIC path_elems text[])
+
+ text
+
+ Returns JSON value pointed to by path_elems
+ as text
+ (equivalent to #>> operator).
+
+ json_extract_path_text('{"f2":{"f3":1},"f4":{"f5":99,"f6":"foo"}}','f4', 'f6')
+ foo
+
+
+ json_object_keys(json)
+ jsonb_object_keys(jsonb)
+
+ setof text
+
+ Returns set of keys in the outermost JSON object.
+
+ json_object_keys('{"f1":"abc","f2":{"f3":"a", "f4":"b"}}')
+
+
+ json_object_keys
+------------------
+ f1
+ f2
+
+
+
+
+ json_populate_record(base anyelement, from_json json)
+ jsonb_populate_record(base anyelement, from_json jsonb)
+
+ anyelement
+
+ Expands the object in from_json to a row
+ whose columns match the record type defined by base
+ (see note below).
+
+ select * from json_populate_record(null::myrowtype, '{"a": 1, "b": ["2", "a b"], "c": {"d": 4, "e": "a b c"}}')
+
+
+ a | b | c
+---+-----------+-------------
+ 1 | {2,"a b"} | (4,"a b c")
+
+
+
+
+ json_populate_recordset(base anyelement, from_json json)
+ jsonb_populate_recordset(base anyelement, from_json jsonb)
+
+ setof anyelement
+
+ Expands the outermost array of objects
+ in from_json to a set of rows whose
+ columns match the record type defined by base (see
+ note below).
+
+ select * from json_populate_recordset(null::myrowtype, '[{"a":1,"b":2},{"a":3,"b":4}]')
+
+
+ a | b
+---+---
+ 1 | 2
+ 3 | 4
+
+
+
+
+ json_array_elements(json)
+ jsonb_array_elements(jsonb)
+
+ setof json
+ setof jsonb
+
+
+ Expands a JSON array to a set of JSON values.
+
+ select * from json_array_elements('[1,true, [2,false]]')
+
+
+ value
+-----------
+ 1
+ true
+ [2,false]
+
+
+
+
+ json_array_elements_text(json)
+ jsonb_array_elements_text(jsonb)
+
+ setof text
+
+ Expands a JSON array to a set of text values.
+
+ select * from json_array_elements_text('["foo", "bar"]')
+
+
+ value
+-----------
+ foo
+ bar
+
+
+
+
+ json_typeof(json)
+ jsonb_typeof(jsonb)
+
+ text
+
+ Returns the type of the outermost JSON value as a text string.
+ Possible types are
+ object, array, string, number,
+ boolean, and null.
+
+ json_typeof('-123.4')
+ number
+
+
+ json_to_record(json)
+ jsonb_to_record(jsonb)
+
+ record
+
+ Builds an arbitrary record from a JSON object (see note below). As
+ with all functions returning record, the caller must
+ explicitly define the structure of the record with an AS
+ clause.
+
+ select * from json_to_record('{"a":1,"b":[1,2,3],"c":[1,2,3],"e":"bar","r": {"a": 123, "b": "a b c"}}') as x(a int, b text, c int[], d text, r myrowtype)
+
+
+ a | b | c | d | r
+---+---------+---------+---+---------------
+ 1 | [1,2,3] | {1,2,3} | | (123,"a b c")
+
+
+
+
+ json_to_recordset(json)
+ jsonb_to_recordset(jsonb)
+
+ setof record
+
+ Builds an arbitrary set of records from a JSON array of objects (see
+ note below). As with all functions returning record, the
+ caller must explicitly define the structure of the record with
+ an AS clause.
+
+ select * from json_to_recordset('[{"a":1,"b":"foo"},{"a":"2","c":"bar"}]') as x(a int, b text);
+
+
+ a | b
+---+-----
+ 1 | foo
+ 2 |
+
+
+
+
+ json_strip_nulls(from_json json)
+ jsonb_strip_nulls(from_json jsonb)
+
+ json or jsonb
+
+ Returns from_json
+ with all object fields that have null values omitted. Other null values
+ are untouched.
+
+ json_strip_nulls('[{"f1":1,"f2":null},2,null,3]')
+ [{"f1":1},2,null,3]
+
+
+ jsonb_set(target jsonb, path text[], new_value jsonb, create_missing boolean)
+
+ jsonb
+
+ Returns target
+ with the section designated by path
+ replaced by new_value, or with
+ new_value added if
+ create_missing is true (default is
+ true) and the item
+ designated by path does not exist.
+ As with the path-oriented operators, negative integers that
+ appear in path count from the end
+ of JSON arrays.
+
+ jsonb_set('[{"f1":1,"f2":null},2,null,3]', '{0,f1}','[2,3,4]', false)
+ jsonb_set('[{"f1":1,"f2":null},2]', '{0,f3}','[2,3,4]')
+
+ [{"f1":[2,3,4],"f2":null},2,null,3]
+ [{"f1": 1, "f2": null, "f3": [2, 3, 4]}, 2]
+
+
+
+
+
+ jsonb_insert(target jsonb, path text[], new_value jsonb, insert_after boolean)
+
+
+ jsonb
+
+ Returns target with
+ new_value inserted. If
+ target section designated by
+ path is in a JSONB array,
+ new_value will be inserted before target or
+ after if insert_after is true (default is
+ false). If target section
+ designated by path is in JSONB object,
+ new_value will be inserted only if
+ target does not exist. As with the path-oriented
+ operators, negative integers that appear in
+ path count from the end of JSON arrays.
+
+
+
+ jsonb_insert('{"a": [0,1,2]}', '{a, 1}', '"new_value"')
+
+
+ jsonb_insert('{"a": [0,1,2]}', '{a, 1}', '"new_value"', true)
+
+
+ {"a": [0, "new_value", 1, 2]}
+ {"a": [0, 1, "new_value", 2]}
+
+
+
+ jsonb_pretty(from_json jsonb)
+
+ text
+
+ Returns from_json
+ as indented JSON text.
+
+ jsonb_pretty('[{"f1":1,"f2":null},2,null,3]')
+
+
+[
+ {
+ "f1": 1,
+ "f2": null
+ },
+ 2,
+ null,
+ 3
+]
+
+
+
+
+
+
+
+
+
+ Many of these functions and operators will convert Unicode escapes in
+ JSON strings to the appropriate single character. This is a non-issue
+ if the input is type jsonb, because the conversion was already
+ done; but for json input, this may result in throwing an error,
+ as noted in .
+
+
+
+
+
+ While the examples for the functions
+ json_populate_record,
+ json_populate_recordset,
+ json_to_record and
+ json_to_recordset use constants, the typical use
+ would be to reference a table in the FROM clause
+ and use one of its json or jsonb columns
+ as an argument to the function. Extracted key values can then be
+ referenced in other parts of the query, like WHERE
+ clauses and target lists. Extracting multiple values in this
+ way can improve performance over extracting them separately with
+ per-key operators.
+
+
+
+ JSON keys are matched to identical column names in the target
+ row type. JSON type coercion for these functions is best
+ effort and may not result in desired values for some types.
+ JSON fields that do not appear in the target row type will be
+ omitted from the output, and target columns that do not match any
+ JSON field will simply be NULL.
+
+
+
+
+
+ All the items of the path parameter of jsonb_set
+ and jsonb_insert, except the last item, must be present
+ in the target. If create_missing is false, all
+ items of the path parameter of jsonb_set,
+ including the last, must be present. If these conditions are not
+ met, the target is returned unchanged.
+
+
+ If the last path item is an object key, it will be created if it
+ is absent and given the new value. If the last path item is an array
+ index, positive values count from the left and negative values
+ from the right: -1 designates the rightmost element, and so on.
+ If the index is outside the range -array_length .. array_length - 1
+ and create_missing is true, the new value is added at the beginning
+ of the array if the index is negative, or at the end of the array if
+ it is positive.
+
+
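+ A minimal sketch of the out-of-range behavior described above, with
+ create_missing left at its default of true:
+
+-- An out-of-range negative index prepends the new value;
+-- an out-of-range positive index appends it.
+SELECT jsonb_set('[0, 1, 2]', '{-10}', '"head"');
+-- ["head", 0, 1, 2]
+SELECT jsonb_set('[0, 1, 2]', '{10}', '"tail"');
+-- [0, 1, 2, "tail"]
+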
+
+
+
+ The json_typeof function's null return value
+ should not be confused with a SQL NULL. While
+ calling json_typeof('null'::json) will
+ return null, calling json_typeof(NULL::json)
+ will return a SQL NULL.
+
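+ A short sketch of the distinction above:
+
+-- The JSON null value yields the text 'null'; a SQL NULL input
+-- yields a SQL NULL result.
+SELECT json_typeof('null'::json);        -- null  (a text value)
+SELECT json_typeof(NULL::json) IS NULL;  -- true
+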
+
+
+
+
+ If the argument to json_strip_nulls contains duplicate
+ field names in any object, the result could be semantically somewhat
+ different, depending on the order in which they occur. This is not an
+ issue for jsonb_strip_nulls since jsonb values never have
+ duplicate object field names.
+
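+ A sketch of the duplicate-key caveat above: json preserves
+ duplicate keys, so stripping the null-valued one changes what a
+ "last key wins" reader sees.
+
+SELECT json_strip_nulls('{"a": 1, "a": null}');
+-- {"a":1}   (before stripping, "last key wins" readers saw a = null)
+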
+
+
+
+ See also for the aggregate
+ function json_agg which aggregates record
+ values as JSON, and the aggregate function
+ json_object_agg which aggregates pairs of values
+ into a JSON object, and their jsonb equivalents,
+ jsonb_agg and jsonb_object_agg.
+
+
+
+
+
+
diff --git a/doc/src/sgml/func.sgml b/doc/src/sgml/func.sgml
index 5dce8ef..414dc22 100644
--- a/doc/src/sgml/func.sgml
+++ b/doc/src/sgml/func.sgml
@@ -11183,896 +11183,7 @@ table2-mapping
-
- JSON Functions and Operators
-
-
- JSON
- functions and operators
-
-
-
- shows the operators that
- are available for use with the two JSON data types (see ).
-
-
-
- json and jsonb Operators
-
-
-
- Operator
- Right Operand Type
- Description
- Example
- Example Result
-
-
-
-
- ->
- int
- Get JSON array element (indexed from zero, negative
- integers count from the end)
- '[{"a":"foo"},{"b":"bar"},{"c":"baz"}]'::json->2
- {"c":"baz"}
-
-
- ->
- text
- Get JSON object field by key
- '{"a": {"b":"foo"}}'::json->'a'
- {"b":"foo"}
-
-
- ->>
- int
- Get JSON array element as text
- '[1,2,3]'::json->>2
- 3
-
-
- ->>
- text
- Get JSON object field as text
- '{"a":1,"b":2}'::json->>'b'
- 2
-
-
- #>
- text[]
- Get JSON object at specified path
- '{"a": {"b":{"c": "foo"}}}'::json#>'{a,b}'
- {"c": "foo"}
-
-
- #>>
- text[]
- Get JSON object at specified path as text
- '{"a":[1,2,3],"b":[4,5,6]}'::json#>>'{a,2}'
- 3
-
-
-
-
-
-
-
- There are parallel variants of these operators for both the
- json and jsonb types.
- The field/element/path extraction operators
- return the same type as their left-hand input (either json
- or jsonb), except for those specified as
- returning text, which coerce the value to text.
- The field/element/path extraction operators return NULL, rather than
- failing, if the JSON input does not have the right structure to match
- the request; for example if no such element exists. The
- field/element/path extraction operators that accept integer JSON
- array subscripts all support negative subscripting from the end of
- arrays.
-
-
-
- The standard comparison operators shown in are available for
- jsonb, but not for json. They follow the
- ordering rules for B-tree operations outlined at .
-
-
- Some further operators also exist only for jsonb, as shown
- in .
- Many of these operators can be indexed by
- jsonb operator classes. For a full description of
- jsonb containment and existence semantics, see .
- describes how these operators can be used to effectively index
- jsonb data.
-
-
- Additional jsonb Operators
-
-
-
- Operator
- Right Operand Type
- Description
- Example
-
-
-
-
- @>
- jsonb
- Does the left JSON value contain the right JSON
- path/value entries at the top level?
- '{"a":1, "b":2}'::jsonb @> '{"b":2}'::jsonb
-
-
- <@
- jsonb
- Are the left JSON path/value entries contained at the top level within
- the right JSON value?
- '{"b":2}'::jsonb <@ '{"a":1, "b":2}'::jsonb
-
-
- ?
- text
- Does the string exist as a top-level
- key within the JSON value?
- '{"a":1, "b":2}'::jsonb ? 'b'
-
-
- ?|
- text[]
- Do any of these array strings
- exist as top-level keys?
- '{"a":1, "b":2, "c":3}'::jsonb ?| array['b', 'c']
-
-
- ?&
- text[]
- Do all of these array strings exist
- as top-level keys?
- '["a", "b"]'::jsonb ?& array['a', 'b']
-
-
- ||
- jsonb
- Concatenate two jsonb values into a new jsonb value
- '["a", "b"]'::jsonb || '["c", "d"]'::jsonb
-
-
- -
- text
- Delete key/value pair or string
- element from left operand. Key/value pairs are matched based
- on their key value.
- '{"a": "b"}'::jsonb - 'a'
-
-
- -
- text[]
- Delete multiple key/value pairs or string
- elements from left operand. Key/value pairs are matched based
- on their key value.
- '{"a": "b", "c": "d"}'::jsonb - '{a,c}'::text[]
-
-
- -
- integer
- Delete the array element with specified index (Negative
- integers count from the end). Throws an error if top level
- container is not an array.
- '["a", "b"]'::jsonb - 1
-
-
- #-
- text[]
- Delete the field or element with specified path (for
- JSON arrays, negative integers count from the end)
- '["a", {"b":1}]'::jsonb #- '{1,b}'
-
-
-
-
-
-
-
- The || operator concatenates the elements at the top level of
- each of its operands. It does not operate recursively. For example, if
- both operands are objects with a common key field name, the value of the
- field in the result will just be the value from the right hand operand.
-
-
-
-
- shows the functions that are
- available for creating json and jsonb values.
- (There are no equivalent functions for jsonb, of the row_to_json
- and array_to_json functions. However, the to_jsonb
- function supplies much the same functionality as these functions would.)
-
-
-
- to_json
-
-
- array_to_json
-
-
- row_to_json
-
-
- json_build_array
-
-
- json_build_object
-
-
- json_object
-
-
- to_jsonb
-
-
- jsonb_build_array
-
-
- jsonb_build_object
-
-
- jsonb_object
-
-
-
- JSON Creation Functions
-
-
-
- Function
- Description
- Example
- Example Result
-
-
-
-
- to_json(anyelement)
- to_jsonb(anyelement)
-
-
- Returns the value as json or jsonb.
- Arrays and composites are converted
- (recursively) to arrays and objects; otherwise, if there is a cast
- from the type to json, the cast function will be used to
- perform the conversion; otherwise, a scalar value is produced.
- For any scalar type other than a number, a Boolean, or a null value,
- the text representation will be used, in such a fashion that it is a
- valid json or jsonb value.
-
- to_json('Fred said "Hi."'::text)
- "Fred said \"Hi.\""
-
-
-
- array_to_json(anyarray [, pretty_bool])
-
-
- Returns the array as a JSON array. A PostgreSQL multidimensional array
- becomes a JSON array of arrays. Line feeds will be added between
- dimension-1 elements if pretty_bool is true.
-
- array_to_json('{{1,5},{99,100}}'::int[])
- [[1,5],[99,100]]
-
-
-
- row_to_json(record [, pretty_bool])
-
-
- Returns the row as a JSON object. Line feeds will be added between
- level-1 elements if pretty_bool is true.
-
- row_to_json(row(1,'foo'))
- {"f1":1,"f2":"foo"}
-
-
- json_build_array(VARIADIC "any")
- jsonb_build_array(VARIADIC "any")
-
-
- Builds a possibly-heterogeneously-typed JSON array out of a variadic
- argument list.
-
- json_build_array(1,2,'3',4,5)
- [1, 2, "3", 4, 5]
-
-
- json_build_object(VARIADIC "any")
- jsonb_build_object(VARIADIC "any")
-
-
- Builds a JSON object out of a variadic argument list. By
- convention, the argument list consists of alternating
- keys and values.
-
- json_build_object('foo',1,'bar',2)
- {"foo": 1, "bar": 2}
-
-
- json_object(text[])
- jsonb_object(text[])
-
-
- Builds a JSON object out of a text array. The array must have either
- exactly one dimension with an even number of members, in which case
- they are taken as alternating key/value pairs, or two dimensions
- such that each inner array has exactly two elements, which
- are taken as a key/value pair.
-
- json_object('{a, 1, b, "def", c, 3.5}')
- json_object('{{a, 1},{b, "def"},{c, 3.5}}')
- {"a": "1", "b": "def", "c": "3.5"}
-
-
- json_object(keys text[], values text[])
- jsonb_object(keys text[], values text[])
-
-
- This form of json_object takes keys and values pairwise from two separate
- arrays. In all other respects it is identical to the one-argument form.
-
- json_object('{a, b}', '{1,2}')
- {"a": "1", "b": "2"}
-
-
-
-
-
-
-
- array_to_json and row_to_json have the same
- behavior as to_json except for offering a pretty-printing
- option. The behavior described for to_json likewise applies
- to each individual value converted by the other JSON creation functions.
-
-
-
-
-
- The extension has a cast
- from hstore to json, so that
- hstore values converted via the JSON creation functions
- will be represented as JSON objects, not as primitive string values.
-
-
-
-
- shows the functions that
- are available for processing json and jsonb values.
-
-
-
- json_array_length
-
-
- jsonb_array_length
-
-
- json_each
-
-
- jsonb_each
-
-
- json_each_text
-
-
- jsonb_each_text
-
-
- json_extract_path
-
-
- jsonb_extract_path
-
-
- json_extract_path_text
-
-
- jsonb_extract_path_text
-
-
- json_object_keys
-
-
- jsonb_object_keys
-
-
- json_populate_record
-
-
- jsonb_populate_record
-
-
- json_populate_recordset
-
-
- jsonb_populate_recordset
-
-
- json_array_elements
-
-
- jsonb_array_elements
-
-
- json_array_elements_text
-
-
- jsonb_array_elements_text
-
-
- json_typeof
-
-
- jsonb_typeof
-
-
- json_to_record
-
-
- jsonb_to_record
-
-
- json_to_recordset
-
-
- jsonb_to_recordset
-
-
- json_strip_nulls
-
-
- jsonb_strip_nulls
-
-
- jsonb_set
-
-
- jsonb_insert
-
-
- jsonb_pretty
-
-
-
- JSON Processing Functions
-
-
-
- Function
- Return Type
- Description
- Example
- Example Result
-
-
-
-
- json_array_length(json)
- jsonb_array_length(jsonb)
-
- int
-
- Returns the number of elements in the outermost JSON array.
-
- json_array_length('[1,2,3,{"f1":1,"f2":[5,6]},4]')
- 5
-
-
- json_each(json)
- jsonb_each(jsonb)
-
- setof key text, value json
- setof key text, value jsonb
-
-
- Expands the outermost JSON object into a set of key/value pairs.
-
- select * from json_each('{"a":"foo", "b":"bar"}')
-
-
- key | value
------+-------
- a | "foo"
- b | "bar"
-
-
-
-
- json_each_text(json)
- jsonb_each_text(jsonb)
-
- setof key text, value text
-
- Expands the outermost JSON object into a set of key/value pairs. The
- returned values will be of type text.
-
- select * from json_each_text('{"a":"foo", "b":"bar"}')
-
-
- key | value
------+-------
- a | foo
- b | bar
-
-
-
-
- json_extract_path(from_json json, VARIADIC path_elems text[])
- jsonb_extract_path(from_json jsonb, VARIADIC path_elems text[])
-
- jsonjsonb
-
-
- Returns JSON value pointed to by path_elems
- (equivalent to #> operator).
-
- json_extract_path('{"f2":{"f3":1},"f4":{"f5":99,"f6":"foo"}}','f4')
- {"f5":99,"f6":"foo"}
-
-
- json_extract_path_text(from_json json, VARIADIC path_elems text[])
- jsonb_extract_path_text(from_json jsonb, VARIADIC path_elems text[])
-
- text
-
- Returns JSON value pointed to by path_elems
- as text
- (equivalent to #>> operator).
-
- json_extract_path_text('{"f2":{"f3":1},"f4":{"f5":99,"f6":"foo"}}','f4', 'f6')
- foo
-
-
- json_object_keys(json)
- jsonb_object_keys(jsonb)
-
- setof text
-
- Returns set of keys in the outermost JSON object.
-
- json_object_keys('{"f1":"abc","f2":{"f3":"a", "f4":"b"}}')
-
-
- json_object_keys
-------------------
- f1
- f2
-
-
-
-
- json_populate_record(base anyelement, from_json json)
- jsonb_populate_record(base anyelement, from_json jsonb)
-
- anyelement
-
- Expands the object in from_json to a row
- whose columns match the record type defined by base
- (see note below).
-
- select * from json_populate_record(null::myrowtype, '{"a": 1, "b": ["2", "a b"], "c": {"d": 4, "e": "a b c"}}')
-
-
- a | b | c
----+-----------+-------------
- 1 | {2,"a b"} | (4,"a b c")
-
-
-
-
- json_populate_recordset(base anyelement, from_json json)
- jsonb_populate_recordset(base anyelement, from_json jsonb)
-
- setof anyelement
-
- Expands the outermost array of objects
- in from_json to a set of rows whose
- columns match the record type defined by base (see
- note below).
-
- select * from json_populate_recordset(null::myrowtype, '[{"a":1,"b":2},{"a":3,"b":4}]')
-
-
- a | b
----+---
- 1 | 2
- 3 | 4
-
-
-
-
- json_array_elements(json)
- jsonb_array_elements(jsonb)
-
- setof json
- setof jsonb
-
-
- Expands a JSON array to a set of JSON values.
-
- select * from json_array_elements('[1,true, [2,false]]')
-
-
- value
------------
- 1
- true
- [2,false]
-
-
-
-
- json_array_elements_text(json)
- jsonb_array_elements_text(jsonb)
-
- setof text
-
- Expands a JSON array to a set of text values.
-
- select * from json_array_elements_text('["foo", "bar"]')
-
-
- value
------------
- foo
- bar
-
-
-
-
- json_typeof(json)
- jsonb_typeof(jsonb)
-
- text
-
- Returns the type of the outermost JSON value as a text string.
- Possible types are
- object, array, string, number,
- boolean, and null.
-
- json_typeof('-123.4')
- number
-
-
- json_to_record(json)
- jsonb_to_record(jsonb)
-
- record
-
- Builds an arbitrary record from a JSON object (see note below). As
- with all functions returning record, the caller must
- explicitly define the structure of the record with an AS
- clause.
-
- select * from json_to_record('{"a":1,"b":[1,2,3],"c":[1,2,3],"e":"bar","r": {"a": 123, "b": "a b c"}}') as x(a int, b text, c int[], d text, r myrowtype)
-
-
- a | b | c | d | r
----+---------+---------+---+---------------
- 1 | [1,2,3] | {1,2,3} | | (123,"a b c")
-
-
-
-
- json_to_recordset(json)
- jsonb_to_recordset(jsonb)
-
- setof record
-
- Builds an arbitrary set of records from a JSON array of objects (see
- note below). As with all functions returning record, the
- caller must explicitly define the structure of the record with
- an AS clause.
-
- select * from json_to_recordset('[{"a":1,"b":"foo"},{"a":"2","c":"bar"}]') as x(a int, b text);
-
-
- a | b
----+-----
- 1 | foo
- 2 |
-
-
-
-
- json_strip_nulls(from_json json)
- jsonb_strip_nulls(from_json jsonb)
-
- jsonjsonb
-
- Returns from_json
- with all object fields that have null values omitted. Other null values
- are untouched.
-
- json_strip_nulls('[{"f1":1,"f2":null},2,null,3]')
- [{"f1":1},2,null,3]
-
-
- jsonb_set(target jsonb, path text[], new_value jsonb, create_missingboolean)
-
- jsonb
-
- Returns target
- with the section designated by path
- replaced by new_value, or with
- new_value added if
- create_missing is true ( default is
- true) and the item
- designated by path does not exist.
- As with the path orientated operators, negative integers that
- appear in path count from the end
- of JSON arrays.
-
- jsonb_set('[{"f1":1,"f2":null},2,null,3]', '{0,f1}','[2,3,4]', false)
- jsonb_set('[{"f1":1,"f2":null},2]', '{0,f3}','[2,3,4]')
-
- [{"f1":[2,3,4],"f2":null},2,null,3]
- [{"f1": 1, "f2": null, "f3": [2, 3, 4]}, 2]
-
-
-
-
-
- jsonb_insert(target jsonb, path text[], new_value jsonb, insert_afterboolean)
-
-
- jsonb
-
- Returns target with
- new_value inserted. If
- target section designated by
- path is in a JSONB array,
- new_value will be inserted before target or
- after if insert_after is true (default is
- false). If target section
- designated by path is in JSONB object,
- new_value will be inserted only if
- target does not exist. As with the path
- orientated operators, negative integers that appear in
- path count from the end of JSON arrays.
-
-
-
- jsonb_insert('{"a": [0,1,2]}', '{a, 1}', '"new_value"')
-
-
- jsonb_insert('{"a": [0,1,2]}', '{a, 1}', '"new_value"', true)
-
-
- {"a": [0, "new_value", 1, 2]}
- {"a": [0, 1, "new_value", 2]}
-
-
-
- jsonb_pretty(from_json jsonb)
-
- text
-
- Returns from_json
- as indented JSON text.
-
- jsonb_pretty('[{"f1":1,"f2":null},2,null,3]')
-
-
-[
- {
- "f1": 1,
- "f2": null
- },
- 2,
- null,
- 3
-]
-
-
-
-
-
-
-
-
-
- Many of these functions and operators will convert Unicode escapes in
- JSON strings to the appropriate single character. This is a non-issue
- if the input is type jsonb, because the conversion was already
- done; but for json input, this may result in throwing an error,
- as noted in .
-
-
-
-
-
- While the examples for the functions
- json_populate_record,
- json_populate_recordset,
- json_to_record and
- json_to_recordset use constants, the typical use
- would be to reference a table in the FROM clause
- and use one of its json or jsonb columns
- as an argument to the function. Extracted key values can then be
- referenced in other parts of the query, like WHERE
- clauses and target lists. Extracting multiple values in this
- way can improve performance over extracting them separately with
- per-key operators.
-
-
-
- JSON keys are matched to identical column names in the target
- row type. JSON type coercion for these functions is best
- effort and may not result in desired values for some types.
- JSON fields that do not appear in the target row type will be
- omitted from the output, and target columns that do not match any
- JSON field will simply be NULL.
-
-
-
-
-
- All the items of the path parameter of jsonb_set
- as well as jsonb_insert except the last item must be present
- in the target. If create_missing is false, all
- items of the path parameter of jsonb_set must be
- present. If these conditions are not met the target is
- returned unchanged.
-
-
- If the last path item is an object key, it will be created if it
- is absent and given the new value. If the last path item is an array
- index, if it is positive the item to set is found by counting from
- the left, and if negative by counting from the right - -1
- designates the rightmost element, and so on.
- If the item is out of the range -array_length .. array_length -1,
- and create_missing is true, the new value is added at the beginning
- of the array if the item is negative, and at the end of the array if
- it is positive.
-
-
-
-
-
- The json_typeof function's null return value
- should not be confused with a SQL NULL. While
- calling json_typeof('null'::json) will
- return null, calling json_typeof(NULL::json)
- will return a SQL NULL.
-
-
-
-
-
- If the argument to json_strip_nulls contains duplicate
- field names in any object, the result could be semantically somewhat
- different, depending on the order in which they occur. This is not an
- issue for jsonb_strip_nulls since jsonb values never have
- duplicate object field names.
-
-
-
-
- See also for the aggregate
- function json_agg which aggregates record
- values as JSON, and the aggregate function
- json_object_agg which aggregates pairs of values
- into a JSON object, and their jsonb equivalents,
- jsonb_agg and jsonb_object_agg.
-
-
-
+ &func-sqljson;
Sequence Manipulation Functions
diff --git a/doc/src/sgml/gin.sgml b/doc/src/sgml/gin.sgml
index cc7cd1e..8c51e4e 100644
--- a/doc/src/sgml/gin.sgml
+++ b/doc/src/sgml/gin.sgml
@@ -102,6 +102,8 @@
?&?|@>
+ @?
+ @~
@@ -109,6 +111,8 @@
jsonb@>
+ @?
+ @~
diff --git a/doc/src/sgml/json.sgml b/doc/src/sgml/json.sgml
index e7b68fa..9c896c5 100644
--- a/doc/src/sgml/json.sgml
+++ b/doc/src/sgml/json.sgml
@@ -22,8 +22,16 @@
- There are two JSON data types: json and jsonb.
- They accept almost identical sets of values as
+ PostgreSQL offers two types for storing JSON
+ data: json and jsonb. To implement effective query
+ mechanisms for these data types, PostgreSQL
+ also provides the jsonpath data type described in
+ .
+
+
+
+ The json and jsonb data types
+ accept almost identical sets of values as
input. The major practical difference is one of efficiency. The
json data type stores an exact copy of the input text,
which processing functions must reparse on each execution; while
@@ -217,6 +225,11 @@ SELECT '{"reading": 1.230e-5}'::json, '{"reading": 1.230e-5}'::jsonb;
in this example, even though those are semantically insignificant for
purposes such as equality checks.
+
+
+ For the list of built-in functions and operators available for
+ constructing and processing JSON values, see .
+
@@ -536,6 +549,19 @@ SELECT jdoc->'guid', jdoc->'name' FROM api WHERE jdoc @> '{"tags": ["qu
+ jsonb_ops and jsonb_path_ops also
+ support queries with the jsonpath operators @?
+ and @~. The previous example for the @>
+ operator can be rewritten as follows:
+
+-- Find documents in which the key "tags" contains array element "qui"
+SELECT jdoc->'guid', jdoc->'name' FROM api WHERE jdoc @? '$.tags[*] ? (@ == "qui")';
+SELECT jdoc->'guid', jdoc->'name' FROM api WHERE jdoc @~ '$.tags[*] == "qui"';
+
+
+
+
+ jsonb also supports btree and hash
indexes. These are usually useful only if it's important to check
equality of complete JSON documents.
@@ -593,4 +619,185 @@ SELECT jdoc->'guid', jdoc->'name' FROM api WHERE jdoc @> '{"tags": ["qu
lists, and scalars, as appropriate.
+
+
+ jsonpath Type
+
+
+ jsonpath
+
+
+
+ The jsonpath type implements support for the SQL/JSON path language
+ in PostgreSQL to effectively query JSON data.
+ It provides a binary representation of the parsed SQL/JSON path
+ expression that specifies the items to be retrieved by the path
+ engine from the JSON data for further processing with the
+ SQL/JSON query functions.
+
+
+
+ The SQL/JSON path language is fully integrated into the SQL engine:
+ the semantics of its predicates and operators generally follow SQL.
+ At the same time, to provide a natural way of working with JSON data,
+ SQL/JSON path syntax uses some JavaScript conventions:
+
+
+
+
+
+ Dot . is used for member access.
+
+
+
+
+ Square brackets [] are used for array access.
+
+
+
+
+ SQL/JSON arrays are 0-relative, unlike regular SQL arrays, which start from 1.
+
+
+
+
+
+ An SQL/JSON path expression is an SQL character string literal,
+ so it must be enclosed in single quotes. Following the JavaScript
+ conventions, character string literals within the path expression
+ must be enclosed in double quotes. Any single quotes within the
+ path expression must be doubled, per the standard SQL convention
+ for string literals.
+
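+ A sketch of these quoting rules (the comparison itself is
+ illustrative):
+
+-- The whole path is an SQL string literal in single quotes; the JSON
+-- string inside it uses double quotes; the embedded single quote is
+-- doubled per the SQL convention.
+SELECT '{"name": "O''Brien"}'::jsonb @~ '$.name == "O''Brien"';
+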
+
+
+ A path expression consists of a sequence of path elements,
+ which can be the following:
+
+
+
+ Path literals of JSON primitive types:
+ Unicode text, numeric, true, false, or null.
+
+
+
+
+ Path variables listed in .
+
+
+
+
+ Accessor operators listed in .
+
+
+
+
+ jsonpath operators
+ and methods listed in
+
+
+
+
+ Parentheses, which can be used to provide filter expressions
+ or define the order of path evaluation.
+
+
+
+
+
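+ The path elements above can be combined; a sketch using the
+ @? operator introduced by this patch:
+
+-- Context item $, member and wildcard array accessors, a filter
+-- expression in parentheses using the @ variable, and a numeric literal.
+SELECT '{"items": [{"price": 5}, {"price": 15}]}'::jsonb
+       @? '$.items[*] ? (@.price > 10)';
+-- true: at least one element satisfies the filter
+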
+
+ For details on using jsonpath expressions with SQL/JSON
+ query functions, see .
+
+
+
+ jsonpath Variables
+
+
+
+ Variable
+ Description
+
+
+
+
+ $
+ A variable representing the JSON text to be queried
+ (the context item).
+
+
+
+ $varname
+ A named variable. Its value must be set in the
+ PASSING clause. See
+ for details.
+
+
+
+ @
+ A variable representing the result of path evaluation
+ in filter expressions.
+
+
+
+
+
+
+
+ jsonpath Accessors
+
+
+
+ Accessor Operator
+ Description
+
+
+
+
+ .key
+ .$"varname"
+
+ Member accessor that returns an object
+ member with the specified key. If the key name is a named
+ variable starting with $ or does not
+ meet the JavaScript rules of an identifier, it must be enclosed in
+ double quotes as a character string literal.
+
+
+ .*
+ Wildcard member accessor that returns the values of all
+ members located at the top level of the current object.
+
+
+ .**
+ Recursive wildcard member accessor that processes
+ all levels of the JSON hierarchy of the current object and
+ returns all the member values, regardless of their nesting
+ level. This is a PostgreSQL
+ extension of the SQL/JSON standard.
+
+
+
+ [subscript, ...]
+ [subscript to last]
+
+ Array element accessor. The provided numeric subscripts return
+ the corresponding array elements. The first element in an array is
+ accessed with [0]. The last keyword denotes the last subscript
+ in an array and can be used to handle arrays of unknown length.
+
+
+ [*]
+ Wildcard array element accessor that returns all array elements.
+
+
+
+
+
+
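+ Sketches of the accessors above, written as path predicates for the
+ @~ operator:
+
+SELECT '{"a": {"b": 7}}'::jsonb @~ '$.a.b == 7';   -- member accessor chain
+SELECT '[10, 20, 30]'::jsonb @~ '$[last] == 30';   -- the "last" subscript
+SELECT '[10, 20, 30]'::jsonb @~ '$[*] > 25';       -- wildcard array accessor
+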
+ For details on using jsonpath expressions with SQL/JSON query
+ functions, see .
+
+
+