Multi master use case?

Started by Oliver Kohll - Mailing Lists about 14 years ago · 3 messages · general
#1Oliver Kohll - Mailing Lists
oliver.lists@gtwm.co.uk

Hello,

A client of ours has always had problems with slow internet connectivity - they are in a part of the country where that is common. There are a few hundred staff sharing a couple of asymmetric (ADSL) connections. One issue is with accessing their web-based Postgres app, which we host. They don't want to run it internally for a lot of the usual reasons, not least that they have many distributed workers, and trying to serve data from an already congested spot would be a non-starter.

Is this a case for multi-master, do you think? I.e. running one on the internet, one locally.

Looking through the wiki

http://wiki.postgresql.org/wiki/Replication,_Clustering,_and_Connection_Pooling

it seems there are a few solutions that have now gained maturity. Something like rubyrep sounds ideal. It would have to deal with
a) a flaky local connection
b) changing schemas (new tables, fields, views etc.) as well as data

Create/update/delete frequencies are reasonably low: generally individuals updating single records, so on the order of thousands per day at most.

Any experiences/thoughts?

Oliver Kohll
www.gtwm.co.uk

#2Greg Sabino Mullane
greg@turnstep.com
In reply to: Oliver Kohll - Mailing Lists (#1)
Re: Multi master use case?


> Is this a case for multi master do you think?
> I.e. running one on the internet, one locally.

Yes, could be.

> b) changing schemas (new tables, fields, views etc.) as well as data

That's a tall order; I don't think anything will do that automatically,
although rubyrep claims to at least pick up new tables.

> Any experiences/thoughts?

My experience is with Bucardo, which should do the job admirably
(but with the data only). My advice would be to just set up a test
system and try rubyrep and Bucardo out. For the latter, use the
latest Bucardo5 beta, as Bucardo4 will be deprecated soon:

http://bucardo.org/downloads/Bucardo-4.99.3.tar.gz

--
Greg Sabino Mullane greg@turnstep.com
End Point Corporation http://www.endpoint.com/
PGP Key: 0x14964AC8 201201281026
http://biglumber.com/x/web?pk=2529DF6AB8F79407E94445B4BC9B906714964AC8

#3Oliver Kohll - Mailing Lists
oliver.lists@gtwm.co.uk
In reply to: Greg Sabino Mullane (#2)
Re: Multi master use case?

On 28 Jan 2012, at 15:27, "Greg Sabino Mullane" <greg@turnstep.com> wrote:

>> Is this a case for multi master do you think?
>> I.e. running one on the internet, one locally.
>
> Yes, could be.
>
>> b) changing schemas (new tables, fields, views etc.) as well as data
>
> That's a tall order; I don't think anything will do that automatically,
> although rubyrep claims to at least pick up new tables.

OK, I guess I could treat one as the 'schema master' and pg_dump schema + data across to the other once a night, once all activity has stopped and standard replication has completed.
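A minimal sketch of that nightly dump/restore, assuming one host is the designated schema master. Host and database names are placeholders, and the commands are echoed rather than executed so they can be reviewed before wiring into cron:

```shell
#!/bin/sh
# Hypothetical nightly schema+data sync from the "schema master" to the
# local replica. MASTER, REPLICA and DB are placeholder names, not values
# from the thread. Commands are echoed (dry run) for review.
MASTER=master.example.com
REPLICA=replica.example.com
DB=appdb
DUMP=/tmp/"$DB".dump

# Custom-format (-Fc) dump carries both schema and data.
echo pg_dump -h "$MASTER" -Fc -f "$DUMP" "$DB"

# --clean drops and re-creates objects on the replica before restoring,
# which picks up new tables, fields and views as well as data.
echo pg_restore -h "$REPLICA" --clean -d "$DB" "$DUMP"
```

Dropping the `echo`s (and adding `set -e` plus some locking) would turn this into the actual cron job; it only works as described if, as the thread says, all activity has stopped and replication has caught up first.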

>> Any experiences/thoughts?
>
> My experience is with Bucardo, which should do the job admirably
> (but with the data only). My advice would be to just set up a test
> system and try rubyrep and Bucardo out. For the latter, use the
> latest Bucardo5 beta, as Bucardo4 will be deprecated soon:
>
> http://bucardo.org/downloads/Bucardo-4.99.3.tar.gz

Thanks, I'll do that.

Oliver
www.agilebase.co.uk