After a long battle with technology, Matthias Blohm <m(dot)blohm(at)digisec(dot)de>, an earthling, wrote:
> A question about a tool, or how something like this could work.
> The situation:
> We have a database full of very sensitive information, and our online website needs that database.
> But now we are moving the website to a server outside our office, and we need to replicate only some of the data to the online database.
> With the tool Slony I found that some tables can be replicated, but some tables contain information that we do not want to replicate.
> So we need a tool, or an idea of how to do this.
> I thought about doing a dump and deleting the sensitive data, but the
> database is about half a gigabyte, and we need changed entries to appear
> on the online database within seconds.
> Can anybody help?
Is there some way you could separate out the "sensitive" material into
a separate table so that you'd have:
- Table "public_stuff" with the replicable data;
- Table "private_stuff" with the data that shouldn't be replicated;
- View "all_stuff", which unifies the data when it needs to be combined.
You might use a rule to decide which data goes where, or perhaps a
stored procedure "create_stuff(elements)".
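A minimal sketch of that schema split (the column names "id", "public_col", and "secret_col", and the arguments to create_stuff(), are hypothetical; adapt them to your actual tables):

```sql
-- Replicable data: Slony subscribes only to this table.
CREATE TABLE public_stuff (
    id         integer PRIMARY KEY,
    public_col text
);

-- Sensitive data: stays local, never added to the replication set.
CREATE TABLE private_stuff (
    id         integer PRIMARY KEY REFERENCES public_stuff(id),
    secret_col text
);

-- Local view unifying both halves when the application needs them together.
CREATE VIEW all_stuff AS
    SELECT p.id, p.public_col, s.secret_col
    FROM public_stuff p
    LEFT JOIN private_stuff s USING (id);

-- Stored procedure that splits an insert across the two tables.
CREATE FUNCTION create_stuff(integer, text, text) RETURNS void AS '
    INSERT INTO public_stuff (id, public_col) VALUES ($1, $2);
    INSERT INTO private_stuff (id, secret_col) VALUES ($1, $3);
' LANGUAGE sql;
```

Then you'd point Slony at public_stuff alone; private_stuff and the view never leave your office.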
select 'cbbrowne' || '@' || 'ntlug.org';
Perhaps the purpose of categorical algebra is to show that that which is
trivial, is trivially trivial.