Re: Out of Memory errors while running pg_dump

From: Erik Jones <erik(at)myemma(dot)com>
To: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>
Cc: pgsql mailing list <pgsql-general(at)postgresql(dot)org>
Subject: Re: Out of Memory errors while running pg_dump
Date: 2008-02-04 20:03:39
Message-ID: B03F50EB-EBF3-456E-9EE6-4A8A3126478F@myemma.com
Lists: pgsql-general


On Feb 4, 2008, at 1:27 PM, Tom Lane wrote:

> We'd need to see more details to really give decent advice. Exactly
> what queries were involved, and exactly what was the error message (in
> particular, I'm wondering how large the failed request was)? Which PG
> version? Can you get the memory context dump out of the postmaster log?

Sure. I've attached an archive with the full memory context dump and
error message for each failure. Note that I'm already 99% sure this is
due to our exorbitantly large relation set, which is why I think
pg_dump's catalog queries are running out of work_mem (currently set
to just over 32MB).
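
For anyone who wants to sanity-check that theory on their own cluster,
this is roughly the psql session behind those numbers (a sketch using
only standard catalog tables and GUCs; nothing below is specific to our
setup, and the relevant thresholds will vary by installation):

    -- Which PG version (per Tom's question) and the current work_mem.
    SELECT version();
    SHOW work_mem;

    -- How big the relation set is, i.e. roughly how much catalog data
    -- pg_dump's schema queries have to pull in.
    SELECT count(*) AS relations FROM pg_class;
    SELECT count(*) AS columns   FROM pg_attribute;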

Attachment: errs.tar.gz (application/x-gzip, 2.2 KB)
