Re: Trouble Upgrading Postgres

From: "Daniel Verite" <daniel(at)manitou-mail(dot)org>
To: "Charles Martin" <ssappeals(at)gmail(dot)com>
Cc: "Adrian Klaver" <adrian(dot)klaver(at)aklaver(dot)com>,"Tom Lane" <tgl(at)sss(dot)pgh(dot)pa(dot)us>,"pgsql-general" <pgsql-general(at)postgresql(dot)org>
Subject: Re: Trouble Upgrading Postgres
Date: 2018-11-05 17:54:21
Message-ID: d31de2ca-5a8d-468d-8b86-b66c294f9464@manitou-mail.org
Lists: pgsql-general

Charles Martin wrote:

> SELECT max(length(docfilecontents::text)) FROM docfile;
> and after a very long time, got:
> ERROR: invalid memory alloc request size 1636085512 SQL state: XX000

It would mean that at least one row has a "docfilecontents" value
close to 0.5GB in size, or that the size fields in certain rows
are corrupted, although the latter is less plausible if you have
no reason to suspect hardware errors.

Does the following query work:

SELECT max(octet_length(docfilecontents)) FROM docfile;

or maybe a histogram by size, in hundreds of megabytes:

SELECT octet_length(docfilecontents)/(1024*1024*100),
count(*)
FROM docfile
GROUP BY octet_length(docfilecontents)/(1024*1024*100);
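
If either query confirms that one or a few rows are very large, a query
along these lines should pinpoint them (this is only a sketch; it assumes
the table has a primary key column, here called "id", so adjust to your
schema):

-- Hypothetical: "id" stands in for whatever uniquely identifies a row.
-- octet_length() works on the bytea directly, avoiding the ::text
-- conversion that triggered the allocation error.
SELECT id, octet_length(docfilecontents) AS size_in_bytes
FROM docfile
ORDER BY octet_length(docfilecontents) DESC NULLS LAST
LIMIT 10;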

Note that the error message above does not say that there's not enough
free memory; it says that it won't even try to allocate that much, because
1636085512 bytes is over the "varlena" limit of 1GB.
AFAICS, I'm afraid that this table as it is now cannot be exported
by pg_dump, even if you had enough free memory, because no individual
row in COPY output may exceed 1GB in text format.
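
If that turns out to block dumping the table, one possible workaround
(just a sketch, untested against your data) is to read the oversized
bytea values in slices with substring(), so that no single value has to
be turned into more than 1GB of text, and reassemble the slices on the
client side. For a single problem row, something like:

-- Hypothetical: assumes a primary key column "id"; 12345 stands in
-- for the id of the oversized row. Each 100MB slice converts to
-- roughly 200MB of hex text, well under the 1GB limit; trailing
-- empty slices past the end of the value can simply be discarded.
SELECT n AS slice_number,
       substring(docfilecontents FROM (n-1)*100*1024*1024 + 1
                                  FOR 100*1024*1024) AS slice
FROM docfile, generate_series(1, 20) AS n
WHERE id = 12345
ORDER BY n;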

Best regards,
--
Daniel Vérité
PostgreSQL-powered mailer: http://www.manitou-mail.org
Twitter: @DanielVerite
