| From: | Maxim Zibitsker <max(dot)zibitsker(at)gmail(dot)com> |
|---|---|
| To: | pgsql-hackers(at)postgresql(dot)org |
| Subject: | Support allocating memory for large strings |
| Date: | 2025-11-08 02:15:22 |
| Message-ID: | 5B146D62-35AE-4822-9B98-37E386AFB942@gmail.com |
| Lists: | pgsql-hackers |
PostgreSQL's MaxAllocSize limit prevents storing individual variable-length character strings exceeding ~1GB, causing "invalid memory alloc request size" errors during INSERT operations on tables with large text columns. Example reproduction included in artifacts.md.
This limitation also affects pg_dump when exporting a database containing such data. The attached patch demonstrates a proof of concept using palloc_extended with MCXT_ALLOC_HUGE in the write path. For the read path, there are a couple of possible approaches: extending the existing functions to handle huge allocations, or implementing a chunked storage mechanism that avoids single large allocations altogether.
Thoughts?
Maxim
| Attachment | Content-Type | Size |
|---|---|---|
| 0001-Support-allocating-memory-for-large-strings.patch | application/octet-stream | 1001 bytes |
| unknown_filename | text/plain | 1 byte |
| artifacts.md | text/markdown | 6.1 KB |