Re: pg_clog error

From: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>
To: terry(at)greatgulfhomes(dot)com
Cc: "Postgres (E-mail)" <pgsql-general(at)postgresql(dot)org>
Subject: Re: pg_clog error
Date: 2002-07-25 15:16:48
Message-ID: 18587.1027610208@sss.pgh.pa.us
Lists: pgsql-general

terry(at)greatgulfhomes(dot)com writes:
> Every night I pull data from a legacy system. Last night for the first time
> I got the error message:
> FATAL 2: open of /usr/local/pgsql/data/pg_clog/0081 failed: No such file or directory

The cases of this that have been seen so far have been traced to
hardware problems, as far as we can tell. (The specific error message
arises from trying to look up the commit status of a transaction
number that is far beyond the range of transaction numbers actually
used so far in your installation --- i.e., some tuple somewhere
contained a garbled transaction-number field.)

Since you later report that the error was not reproducible, I'm
wondering about bad RAM that intermittently drops bits. Might be
time to try some memory diagnostics.

regards, tom lane
