Best way to parse complex json string into table columns?

From: Sbob <sbob(at)quadratum-braccas(dot)com>
To: Pgsql-admin <pgsql-admin(at)lists(dot)postgresql(dot)org>
Subject: Best way to parse complex json string into table columns?
Date: 2023-12-13 16:55:15
Message-ID: d9b41f36-2b7f-4457-b8c2-54493ffa46ce@quadratum-braccas.com
Lists: pgsql-admin

All;

I have a client that is using a function to parse a set of rows from a
table with a json column into table columns in a materialized view.

The table has two columns: an id column (the primary key) and a json column
(jsonb data type).
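
For reference, the source table is essentially just the key plus the document.
The sketch below is only my rough guess at the DDL (table and column names are
taken from the view definition further down; the types and constraints are
assumptions):

-- Rough sketch of the assumed source table (names from the view below,
-- types and constraints assumed)
CREATE TABLE base_table (
    id       bigint PRIMARY KEY,
    json_col jsonb NOT NULL
);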

Sample json string:

{"TXregid": "61xxx42d27xxx829g9faf414", "prices": {"MyPrice": "191.344",
"priceSource": "SAMS", "priceStatus": "P", "PriceDate": "2023-12-06",
"PriceType": "ABC     ", "lastUpdatedDate":
"2022-12-07T04:39:49.664+0000"}, "descriptive": {"internal_status":
{"factor": "0.33292", "codename": "XF867", "couponType": "PCTOFF",
"factorDate": "2022-12-01T00:00:00", "factorType": "N", "currentFactor":
"0.292", "accrualDayCount": "30", "paymentDelayDays": 355,
"factoredIndicator": "true", "puttableIndicator": "false",
"mortgageAgencyCode": "93", "tradingFlatIndicator": "false",
"xDefaultIndicator": "false", "originalIssueDiscountCode": "N",
"technicalDefaultIndicator": false}, "StatusCore": {"pxiValue": "1",
"datedDate": "2012-08-01T00:00:00", "issueDate": "2012-08-01T00:00:00",
"issueType": "CST", "legalName": "My National Cash", "maturityDate":
"2092-08-01T00:00:00", "fcgtIndicator": "false", "countryOfIssue": "US",
"couponFreqDesc": "Monthly", "lastCouponDate": "2041-08-01",
"otherIssuerId": "GHT7721841", "firstCouponDate": "2011-09-25T00:00:00",
"issueDescription": "My Mortgag 3.5% AO6867 08/01/2042", "DCxIndicator":
"false", "currentCouponRate": "3.5", "issuerDescription": "My National
Mortgage", "outstandingAmount": 9139371.2105, "principalCurrency":
"USD", "couponFreqTimeUnit": "MO", "firstSettlementDate":
"2012-08-01T00:00:00", "couponFreqUnitQuantity": "1",
"paymentinKindIndicator": false, "exchangeTradedIndicator": "false"},
"assetServicing": {"XCPndicator": false, "BackendStatus": "OPEN",
"STXIndicator": "false", "DTVPIndicator": true, "RCYPIndicator":
"true"}, "classTAG": "DEBT", "classTAGname": "DEBT Level 1",
"BackendLevelCode": "SRO", "OtherLevel": "BOND22", "Factor3Code":
"TRX-VV7", "Factor3Name": "MORTGAGE2", "AccessLevel": "GENERAL",
"ActiveSystemStatus": "WAIT", "BH9Code": "PGx79S", "BH9Name": "Martin",
"BH11Code": "S9a", "BH11Name": "SOUTH", "BH12Code": "USSR", "BH12Name":
"FARM", "BusinessStatus": "ISTX"}, "identifiers": {"csstrx":
"3138LXTZ2", "usstrx": "16475266"}

The parse function exposes most of the json elements as columns, for example:

CREATE MATERIALIZED VIEW new_view_mv AS
SELECT id,
       (base_table.json_col -> 'TXregid')                    AS reg_id,
       (base_table.json_col -> 'descriptive' ->> 'BH9Code')  AS bh9_code,
       (base_table.json_col -> 'identifiers' ->> 'usstrx')   AS uss_trx_code,
       ...

and so on, for almost every json element.

The above approach is painfully slow. Is there a better-performing method of
converting json strings to table columns?
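
For example, would something along these lines, expanding each document once
with jsonb_to_record() instead of repeating -> / ->> for every column, be
expected to do any better? (Rough, untested sketch; column lists abbreviated.)

-- Untested sketch: expand each jsonb document once per row, then pull
-- nested objects apart with further jsonb_to_record() calls
CREATE MATERIALIZED VIEW new_view_mv AS
SELECT b.id,
       t."TXregid"  AS reg_id,
       d."BH9Code"  AS bh9_code,
       i.usstrx     AS uss_trx_code
       -- ... remaining columns listed the same way
FROM base_table b
CROSS JOIN LATERAL jsonb_to_record(b.json_col)
     AS t("TXregid" text, descriptive jsonb, identifiers jsonb)
CROSS JOIN LATERAL jsonb_to_record(t.descriptive)
     AS d("BH9Code" text)
CROSS JOIN LATERAL jsonb_to_record(t.identifiers)
     AS i(usstrx text);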

Thanks in advance
