Inherited an 18TB DB & need to backup

Started by Suhail Bamzena · almost 6 years ago · 34 messages · general
#1Suhail Bamzena
suhailsalem@gmail.com

Hello All,
I have very recently inherited an 18 TB DB that is running version 9.2.
Apparently this database has never been backed up, and I have been tasked
with setting up a periodic backup routine (weekly full & daily incremental)
and dumping it onto a NAS. What is the best way to go about this? I did some
reading and heard that pgbackrest does a good job with such huge sizes. Your
expert advice is needed.

--
Cheers,

Suhail
Cell# +97150 8194870

#2Rory Campbell-Lange
rory@campbell-lange.net
In reply to: Suhail Bamzena (#1)
Re: Inherited an 18TB DB & need to backup

Incremental backups suggest the need to back up WAL archives. See
https://www.postgresql.org/docs/9.2/continuous-archiving.html
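
As a sketch of what that document describes, the 9.2-era archiving settings
look roughly like this (the NAS path is a placeholder):

```
# postgresql.conf fragment (9.2 syntax; archive path is illustrative)
wal_level = archive
archive_mode = on        # changing this requires a server restart
archive_command = 'test ! -f /mnt/nas/wal/%f && cp %p /mnt/nas/wal/%f'
```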

pgbackrest looks very cool but we haven't used it.
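
For reference, the weekly-full / daily-incremental cycle asked about maps
directly onto pgbackrest's backup types. An illustrative sketch follows; the
stanza name, paths, and retention are invented, and support for 9.2 should be
checked against the pgbackrest release you use:

```
# /etc/pgbackrest/pgbackrest.conf fragment (illustrative)
[global]
repo1-path=/mnt/nas/pgbackrest
repo1-retention-full=4          # keep four weekly fulls

[bigdb]
pg1-path=/var/lib/pgsql/9.2/data

# cron entries: weekly full on Sunday, incremental the other days
# 0 2 * * 0    pgbackrest --stanza=bigdb --type=full backup
# 0 2 * * 1-6  pgbackrest --stanza=bigdb --type=incr backup
```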

A very simple solution could be just to dump the database daily with
pg_dump, if you have the space and machine capacity to do it. Depending
on what you are storing, you can achieve good compression with this, and
it is a great way of having a simple file from which to restore a
database.
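
A minimal sketch of such a nightly dump, assuming a NAS mount and a
placeholder database name (note that `%` must be escaped in crontab):

```
# crontab fragment (illustrative): nightly compressed custom-format dump;
# -Fc produces an archive that pg_restore can restore selectively
30 1 * * * pg_dump -Fc -Z 6 -f /mnt/nas/backups/mydb_$(date +\%F).dump mydb
```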

Our ~200GB cluster resolves to under 10GB of pg_dump files, although
18TB is a whole different order of magnitude.

Rory

#3Christoph Berg
myon@debian.org
In reply to: Rory Campbell-Lange (#2)
Re: Inherited an 18TB DB & need to backup

On 15/05/20, Suhail Bamzena (suhailsalem@gmail.com) wrote:

Hello All,
I have very recently inherited an 18 TB DB that is running version 9.2.

Push hard to get that upgraded to a supported version.

Christoph

#4Suhail Bamzena
suhailsalem@gmail.com
In reply to: Christoph Berg (#3)
Re: Inherited an 18TB DB & need to backup

Hi Christoph
That's very high on my agenda.. but I need to make sure I can back up this
beast to start with..


#5Jan Karremans
jan.karremans@enterprisedb.com
In reply to: Suhail Bamzena (#4)
Re: Inherited an 18TB DB & need to backup

Hi Suhail,

That is not an issue. We have customers backing up Postgres databases up to 80 TB.

Mit freundlichem Gruß, kind regards,

Jan Karremans
Director of Sales Engineering, EMEA
Senior Sales Engineer DACH-Region
EDB Postgres Advanced Server Professional
-- Postgres Everywhere --

Oracle ACE Alumni

- German is not my native language; please forgive any spelling mistakes -

Mobile: +31-(0)6-1638 9607

http://www.enterprisedb.com/

Don't walk behind me; I may not lead.
Don't walk in front of me; I may not follow.
Just walk beside me and be my friend.
+*+ Albert Camus +*+

#6Wolff, Ken L
ken.l.wolff@lmco.com
In reply to: Jan Karremans (#5)
Re: Inherited an 18TB DB & need to backup

Depending on your storage subsystem, perhaps storage-level snapshots might be an option? They often seem to be the best choice for VLDBs.
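
One caveat worth adding: a snapshot of a running cluster is only
crash-consistent if it is atomic across every volume the cluster uses.
Otherwise the 9.2 backup-mode calls should bracket it; a sketch, assuming a
superuser psql connection:

```
# Sketch: bracket a non-atomic storage snapshot with 9.2 backup mode
psql -c "SELECT pg_start_backup('nas-snap', true);"   # true = fast checkpoint
# ... trigger the storage-level snapshot here ...
psql -c "SELECT pg_stop_backup();"
```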


#7Suhail Bamzena
suhailsalem@gmail.com
In reply to: Rory Campbell-Lange (#2)
Re: Inherited an 18TB DB & need to backup

Thanks Rory, the machine has the capacity to pull through pg_dumps, but like
you rightly mentioned, incremental backups mean that we will need to work
with the WALs.. 18TB is the scary part, and even with compression I don't
see it being less than 2TB a day...


#8Suhail Bamzena
suhailsalem@gmail.com
In reply to: Jan Karremans (#5)
Re: Inherited an 18TB DB & need to backup

Thanks Jan.. just to know more, is it using the native pg_dump or another
tool like pgbackrest or barman?

#9Jan Karremans
jan.karremans@enterprisedb.com
In reply to: Suhail Bamzena (#8)
Re: Inherited an 18TB DB & need to backup

Hi Suhail,

I do not know personally, but I can find out and get you set up with someone who can tell you more.


#10Suhail Bamzena
suhailsalem@gmail.com
In reply to: Wolff, Ken L (#6)
Re: Inherited an 18TB DB & need to backup

I can look into that.. not very sure about the storage infra..


#11Suhail Bamzena
suhailsalem@gmail.com
In reply to: Jan Karremans (#9)
Re: Inherited an 18TB DB & need to backup

Thanks Jan.. would appreciate all the info I can get.

#12Rory Campbell-Lange
rory@campbell-lange.net
In reply to: Suhail Bamzena (#7)
Re: Inherited an 18TB DB & need to backup

I suggest you try one immediately, assuming you can dump it somewhere
and the machine has the resources. Then you will at least have a backup
and a sense of what is possible. (Which will help inform your upgrade
strategy too.)

Rory

#13Suhail Bamzena
suhailsalem@gmail.com
In reply to: Rory Campbell-Lange (#12)
Re: Inherited an 18TB DB & need to backup

Yeah Rory, want to pull one ASAP.. hopefully by COB tonight..


#14Ravi Krishna
srkrishna1@comcast.net
In reply to: Suhail Bamzena (#11)
Re: Inherited an 18TB DB & need to backup

IMO a database of this size should only be backed up to S3. pgbackrest
has support for backup to S3.
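
For what it's worth, pointing a pgbackrest repository at S3 is mostly a
configuration switch; an illustrative fragment (bucket, endpoint, and region
are placeholders):

```
# pgbackrest.conf fragment (illustrative)
[global]
repo1-type=s3
repo1-s3-bucket=example-pg-backups
repo1-s3-endpoint=s3.amazonaws.com
repo1-s3-region=us-east-1
# credentials via repo1-s3-key / repo1-s3-key-secret or the environment
```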

#15Gabriele Bartolini
gabriele.bartolini@gmail.com
In reply to: Ravi Krishna (#14)
Re: Inherited an 18TB DB & need to backup

Hi Suhail,

We developed Barman (www.pgbarman.org), which works with older versions of
PostgreSQL too. It works with very large databases and is open source (GPL).

This is an old article in which I could publicly mention the size of the
database:

https://www.2ndquadrant.com/en/blog/incremental-backup-barman-1-4-0/

More articles about Barman from our blog:
https://www.2ndquadrant.com/en/blog/tag/barman/
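
An illustrative Barman configuration in the spirit of that incremental-backup
article (host and user names are invented); `reuse_backup = link` is what
gives file-level incremental backups via rsync hard links:

```
; /etc/barman.conf fragment (illustrative)
[bigdb]
description = "18TB production"
conninfo = host=dbhost user=barman dbname=postgres
ssh_command = ssh postgres@dbhost
backup_method = rsync
reuse_backup = link
```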

Good luck.

Cheers,
Gabriele


#16Suhail Bamzena
suhailsalem@gmail.com
In reply to: Gabriele Bartolini (#15)
Re: Inherited an 18TB DB & need to backup

Thanks Gabriele... I will review this in detail.


#17Andreas 'ads' Scherbaum
adsmail@wars-nicht.de
In reply to: Ravi Krishna (#14)
Re: Inherited an 18TB DB & need to backup

On Fri, May 15, 2020 at 3:49 PM Ravi Krishna <srkrishna1@comcast.net> wrote:

IMO a database of this size should only be backed up in s3. pgbackrest
has support for backup to s3.

Why should the backup land in S3, and not local somewhere?
Any good reason why one should pay for the additional storage and transfer
costs?

--
Andreas 'ads' Scherbaum
German PostgreSQL User Group
European PostgreSQL User Group - Board of Directors
Volunteer Regional Contact, Germany - PostgreSQL Project

#18Ravi Krishna
srkrishna1@comcast.net
In reply to: Andreas 'ads' Scherbaum (#17)
Re: Inherited an 18TB DB & need to backup

Why should the backup land in S3, and not local somewhere?
Any good reason why one should pay for the additional storage and
transfer costs?

Good question. The key point in my statement was "db of this size".

The problem with local backup is that space is not infinite. If your
business requires you to store backups for, say, 7 years, storing them
locally will be a problem. In one large financial company I used to work
for, full backups were used to store old data (except for the last 30 days,
where WAL logs were used for real PITR). We used to store full backups for
about 60 days and then send older backups to off-site storage. Nothing is
free.

I remember a case where business requested that we restore a db of a given
date two years prior, as they had to look at old data. It took us close to
96 hrs to give the users the required database.

S3 storage is ridiculously cheap. Off-site storage companies like Iron
Mountain should find their client base ditching them big time.

#19Andreas 'ads' Scherbaum
adsmail@wars-nicht.de
In reply to: Ravi Krishna (#18)
Re: Inherited an 18TB DB & need to backup


If your database is running somewhere in the cloud, then yes, that might
make sense. If your database runs in your own data center, then usually you
also have disk space available there. Plus, a transfer out of your data
center will take time.

There is no "per se" recommendation to move data to S3. And there might be
additional requirements like data protection laws, encryption requirements,
etc.

--
Andreas 'ads' Scherbaum
German PostgreSQL User Group
European PostgreSQL User Group - Board of Directors
Volunteer Regional Contact, Germany - PostgreSQL Project

#20Scottix
scottix@gmail.com
In reply to: Andreas 'ads' Scherbaum (#19)
Re: Inherited an 18TB DB & need to backup

Also, when you get into multi-TB data storage, the bill gets a little
harder to digest in S3.


--
T: @Thaumion
IG: Thaumion
Scottix@Gmail.com

#21Christophe Pettus
xof@thebuild.com
In reply to: Scottix (#20)
#22Ron
ronljohnsonjr@gmail.com
In reply to: Suhail Bamzena (#7)
#23Michael Nolan
htfoot@gmail.com
In reply to: Ravi Krishna (#18)
#24Suhail Bamzena
suhailsalem@gmail.com
In reply to: Ron (#22)
#25Ron
ronljohnsonjr@gmail.com
In reply to: Michael Nolan (#23)
#26Rob Sargent
robjsargent@gmail.com
In reply to: Ron (#25)
#27Peter J. Holzer
hjp-pgsql@hjp.at
In reply to: Rory Campbell-Lange (#2)
#28Suhail Bamzena
suhailsalem@gmail.com
In reply to: Peter J. Holzer (#27)
#29Ron
ronljohnsonjr@gmail.com
In reply to: Rob Sargent (#26)
#30Gavin Flower
GavinFlower@archidevsys.co.nz
In reply to: Ron (#29)
#31Ron
ronljohnsonjr@gmail.com
In reply to: Gavin Flower (#30)
#32James Sewell
james.sewell@jirotech.com
In reply to: Suhail Bamzena (#1)
#33Steve Litt
slitt@troubleshooters.com
In reply to: Christoph Berg (#3)
#34Suhail Bamzena
suhailsalem@gmail.com
In reply to: James Sewell (#32)