From: George Robinson II <george(dot)robinson(at)eurekabroadband(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: vacuumdb failed
Date: 2000-08-25 18:46:47
Message-ID: 39A6BF17.9D60C9E@eurekabroadband.com
Lists: pgsql-general
Last night, while my Perl script was doing a huge insert operation, I
got this error...
DBD::Pg::st execute failed: ERROR: copy: line 4857, pg_atoi: error
reading "2244904358": Result too large
Now, I'm not sure if this is related, but while trying to do vacuumdb
<dbname>, I got...
NOTICE: FlushRelationBuffers(all_flows, 500237): block 171439 is
referenced (private 0, global 1)
FATAL 1: VACUUM (vc_repair_frag): FlushRelationBuffers returned -2
pqReadData() -- backend closed the channel unexpectedly.
This probably means the backend terminated abnormally
before or while processing the request.
connection to server was lost
vacuumdb: vacuum failed
Any ideas? I'm trying a couple of other things right now. By the way,
this database has one table that is HUGE. What is the limit on table
size in PostgreSQL 7? The FAQ says unlimited. If that's true, how do
you get around the 2 GB file size limit that (at least) I have on
Solaris 2.6?
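My understanding is that PostgreSQL splits a table's storage into segment
files of roughly 1 GB each, so the individual files stay under the OS 2 GB
limit even when the table itself is much larger. A small Perl sketch for
estimating how big the table already is, using pg_class.relpages (the page
count recorded by the last VACUUM/ANALYZE) and the default 8 kB block size;
the connection details are assumptions, the table name is the one from the
error above:

#!/usr/bin/perl
use strict;
use DBI;

# Hypothetical connection string -- adjust to your setup.
my $dbh = DBI->connect('dbi:Pg:dbname=testdb', '', '',
                       { RaiseError => 1, AutoCommit => 1 });

# relpages * 8192 bytes gives a rough on-disk size for the heap.
my ($pages) = $dbh->selectrow_array(
    "SELECT relpages FROM pg_class WHERE relname = 'all_flows'");
printf "all_flows: %d pages, roughly %.2f GB\n",
       $pages, $pages * 8192 / (1024 ** 3);

$dbh->disconnect;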
Thank you.
-g2