pg_dump on large columns

From: Adam Siegel <adam(at)sycamorehq(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: pg_dump on large columns
Date: 2002-10-16 14:01:04
Message-ID: 1034776871.1053.10.camel@localhost.localdomain
Lists: pgsql-general

I have a table with several columns, including a text column. It has
about 100 records. The text column generally contains about 5MB of ASCII
data per record, escaped before insertion. pg_dump runs fine. However, when
I do a psql -f dumpfile, postgres just spins its wheels and never finishes
the restore. The text was inserted into the table using the libpq API by
building an SQL statement and calling PQexec. Selects from the table work
just fine. Any thoughts would be appreciated. I am using 7.2.
