I have been working on external replication on PostgreSQL 9.2 for a while now
(with too many interruptions blocking my progress!)
Does anyone know of a good utility to aggressively analyze
and recover PostgreSQL databases?
The standard reply I see is "Make regular backups", but that still permits a
worst-case data loss equal to the backup interval.
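For what it's worth, PostgreSQL's built-in continuous WAL archiving (the basis for point-in-time recovery) can shrink that loss window well below the base-backup interval. A minimal postgresql.conf sketch for 9.2, where the archive directory path is my assumption:

```
# postgresql.conf -- continuous WAL archiving (a sketch, not a tuned config)
wal_level = archive                            # 9.2 value; enables WAL archiving
archive_mode = on
archive_command = 'cp %p /mnt/wal_archive/%f'  # /mnt/wal_archive is hypothetical
```

With this in place, recovery replays archived WAL on top of the last base backup, so the exposure is roughly one WAL segment's worth of activity rather than a full backup cycle.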
Our MariaDB/MySQL friends have InnoDB/XtraDB crash recovery, aria_chk, and
some other tools to "recover" as much as possible up to the moment of failure.
While full replication is the ultimate safeguard, in a "split brain" scenario a
hardware failure can still cause loss of all data written since the last
replication sync or the last backup.
After a data crash, I want the recovery tool to HELP me get as much data back
as possible and return to operations. What I do not want is a series of manual
command-line file copies and deletes to "guess" my way back to operational mode
(some data loss is inevitable).
I could make a daily snapshot of the system catalog to assist the recovery
tool in restoring the database.
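Along those lines, the catalog snapshot itself is easy to automate with pg_dump's schema-only mode. A hypothetical crontab sketch; the schedule, paths, and database name are my assumptions:

```
# System crontab (m h dom mon dow user command).
# Nightly schema-only dumps so a recovery tool (or a human) has fresh
# catalog/DDL definitions to work from. Paths and db name are hypothetical.
0 2 * * *  postgres  pg_dumpall --globals-only > /backup/catalog/globals-$(date +\%F).sql
30 2 * * * postgres  pg_dump --schema-only mydb > /backup/catalog/schema-$(date +\%F).sql
```

Note the escaped `\%` -- cron treats a bare `%` as a newline in the command field.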
Who has ideas on this?