allocation limit for encoding conversion

From: Alvaro Herrera <alvherre(at)2ndquadrant(dot)com>
To: pgsql-hackers(at)lists(dot)postgresql(dot)org
Subject: allocation limit for encoding conversion
Date: 2019-08-16 18:14:18
Message-ID: 20190816181418.GA898@alvherre.pgsql
Lists: pgsql-hackers

Somebody ran into issues when generating large XML output (upwards of
256 MB) and then sending via a connection with a different
client_encoding. This occurs because we pessimistically allocate 4x as
much memory as the string needs, and so we run into the 1GB palloc
limitation. ISTM we can do better now by using huge allocations, as per
the preliminary attached patch (which probably needs an updated overflow
check rather than removing the check altogether); but at least it is
able to process this query, which it could not without the patch:

select query_to_xml(
'select a, cash_words(a::text::money) from generate_series(0, 2000000) a',
true, false, '');
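For illustration, here is a minimal standalone sketch of the sizing logic
involved: the worst-case destination size is the source length times
MAX_CONVERSION_GROWTH (4), plus a terminating NUL, which blows past the
ordinary ~1GB palloc cap (MaxAllocSize) for inputs over 256 MB.  The helper
name and the exact shape of the overflow check are mine, not the patch's;
the constants mirror PostgreSQL's memutils.h / pg_wchar.h:

```c
#include <assert.h>
#include <stdint.h>
#include <stddef.h>

/* Constants mirroring PostgreSQL's headers (values as of v12): */
#define MAX_CONVERSION_GROWTH 4               /* worst-case output bytes per input byte */
#define MaxAllocSize    ((size_t) 0x3fffffff) /* ~1 GB, ordinary palloc limit */
#define MaxAllocHugeSize (SIZE_MAX / 2)       /* cap for MemoryContextAllocHuge */

/*
 * Hypothetical helper: compute the worst-case destination buffer size
 * for converting srclen bytes, or 0 if even a huge allocation could not
 * hold it.  A caller would use an ordinary palloc when the result fits
 * in MaxAllocSize and a huge allocation otherwise.
 */
static size_t
conversion_dst_size(size_t srclen)
{
    if (srclen > (MaxAllocHugeSize - 1) / MAX_CONVERSION_GROWTH)
        return 0;                             /* overflows even the huge limit */
    return srclen * MAX_CONVERSION_GROWTH + 1; /* +1 for NUL terminator */
}
```

For example, a 256 MB source string needs 4 * 268435456 + 1 = 1073741825
bytes, which is just over MaxAllocSize (1073741823), so a plain palloc
fails while a huge allocation succeeds.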

Álvaro Herrera

Attachment Content-Type Size
v1-0001-Use-huge-pallocs-on-encoding-conversions.patch text/x-diff 1.9 KB
