From: Oliver Kohll <oliver.lists@gtwm.co.uk>
To: pgsql-general@postgresql.org
Subject: Multi master use case?
Date: 2012-01-26 23:38:03
Message-ID: 76D127FC-6CDF-4C5F-AE6D-CA6FF57C0121@gtwm.co.uk
Lists: pgsql-general
Hello,
A client of ours has always had problems with slow internet connectivity; they are in a part of the country where that is common, and a few hundred staff share a couple of asymmetric (ADSL) connections. One issue is accessing their web-based Postgres app, which we host. They don't want to run it internally for the usual reasons, not least that they have many distributed workers, and serving data from an already congested site would be a non-starter.
Do you think this is a case for multi-master replication, i.e. running one master on the internet and one locally?
Looking through the wiki
http://wiki.postgresql.org/wiki/Replication,_Clustering,_and_Connection_Pooling
it seems a few solutions have now gained maturity. Something like rubyrep sounds ideal. It would have to handle:
a) a flaky local connection
b) changing schemas (new tables, fields, views etc.) as well as data
Create/update/delete frequencies are reasonably low, generally individuals updating single records, so on the order of thousands per day at most.
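For what it's worth, a rubyrep setup for this kind of two-site arrangement is just a small Ruby configuration file naming the two databases and the tables to replicate. A minimal sketch follows; the hostnames, database name, and credentials are placeholders for illustration, not anything from the actual deployment:

```ruby
# rubyrep configuration sketch: two-way replication between the hosted
# master ("left") and the on-site master ("right").
RR::Initializer::run do |config|
  # Hosted instance (hypothetical hostname).
  config.left = {
    :adapter  => 'postgresql',
    :database => 'app',
    :username => 'rr_user',
    :password => 'secret',
    :host     => 'hosted.example.com'
  }

  # On-site instance behind the ADSL connection (hypothetical hostname).
  config.right = {
    :adapter  => 'postgresql',
    :database => 'app',
    :username => 'rr_user',
    :password => 'secret',
    :host     => 'local.example.com'
  }

  # Replicate every table; a string or regexp can narrow this down.
  config.include_tables /./
end
```

rubyrep's docs cover configurable conflict handling, which matters for multi-master even at low write volumes; note, though, that it replicates data only, so the schema changes mentioned above (new tables, fields, views) would still need to be applied to both sides by some other means.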
Any experiences/thoughts?
Oliver Kohll
www.gtwm.co.uk