db-connections (application architecture)
So, I am working on some system designs for a web application, and I wonder
if there is any definitive answer on how to best connect to a postgres
database.
I could have it so that each time a query, or set of queries, for a
particular request needs to be run, a new connection is opened, the queries
are run, and then the connection is closed / dropped.
OR, I could create a persistent connection that will remain open as long as
a user is logged in and then any queries are run against the open
connection.
I can see how, for only a few (hundreds to thousands) of users, the latter
might make more sense but if I need to scale up to millions, I might not
want all of those connections open.
Any idea of how much time / overhead is added by opening and closing a
connection every time?
Any and all information is welcome.
Thanks in advance
-- Mark M
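The two patterns in the question can be sketched side by side. A minimal sketch using Python's stdlib sqlite3 purely as a stand-in, since it needs no server — note that a real Postgres connection involves a TCP handshake, authentication, and backend process startup, so its per-connection setup cost is far higher than this stand-in suggests:

```python
import sqlite3

def per_request(db_path, n):
    # Pattern 1: open a fresh connection for every request,
    # run the query, and close the connection again.
    results = []
    for _ in range(n):
        conn = sqlite3.connect(db_path)
        results.append(conn.execute("SELECT 1").fetchone()[0])
        conn.close()
    return results

def persistent(db_path, n):
    # Pattern 2: one long-lived connection reused for every request.
    conn = sqlite3.connect(db_path)
    try:
        return [conn.execute("SELECT 1").fetchone()[0] for _ in range(n)]
    finally:
        conn.close()

print(per_request(":memory:", 3))  # [1, 1, 1]
print(persistent(":memory:", 3))   # [1, 1, 1]
```

Wrapping each function in a timer shows the difference grows linearly with request count under pattern 1, which is why the usual answer for Postgres is a pool that amortizes connection setup across requests.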
On 15.11.2018 at 16:09, Mark Moellering wrote:
I can see how, for only a few (hundreds to thousands) of users, the
latter might make more sense but if I need to scale up to millions, I
might not want all of those connections open.
Consider a connection pooler like phbouncer.
Regards, Andreas
--
2ndQuadrant - The PostgreSQL Support Company.
www.2ndQuadrant.com
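A minimal sketch of what such a pooler's configuration might look like (the database name `appdb`, the paths, and the pool sizes here are illustrative assumptions, not values from this thread):

```ini
[databases]
; clients connect to the pooler on port 6432 and are multiplexed
; onto a small number of real server connections
appdb = host=127.0.0.1 port=5432 dbname=appdb

[pgbouncer]
listen_addr = 127.0.0.1
listen_port = 6432
auth_type = md5
auth_file = /etc/pgbouncer/userlist.txt
; transaction pooling: a server connection is held only for the
; duration of a transaction, so many clients share few connections
pool_mode = transaction
max_client_conn = 10000
default_pool_size = 20
```

This is exactly the shape that matters for the "millions of users" case above: client connections can number in the thousands while the server-side pool stays small.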
On 11/15/18 7:09 AM, Mark Moellering wrote:
So, I am working on some system designs for a web application, and I
wonder if there is any definitive answer on how to best connect to a
postgres database.
I could have it so that each time a query, or set of queries, for a
particular request, needs to be run, a new connection is opened, queries
are run, and then connection is closed / dropped.
OR, I could create a persistent connection that will remain open as long
as a user is logged in and then any queries are run against the open
connection.
I can see how, for only a few (hundreds to thousands) of users, the
latter might make more sense but if I need to scale up to millions, I
might not want all of those connections open.
Any idea of how much time / overhead is added by opening and closing a
connection every time?
Any and all information is welcome.
Connection pooling?
In no particular order:
http://www.pgpool.net/mediawiki/index.php/Main_Page
Thanks in advance
-- Mark M
--
Adrian Klaver
adrian.klaver@aklaver.com
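The idea behind the poolers linked above can be sketched in a few lines: a fixed set of connections is created up front and handed out and returned through a thread-safe queue. This is a hedged illustration of the concept only — the `factory` callable here is a stand-in, where a real pool would call something like `psycopg2.connect`:

```python
import queue

class ConnectionPool:
    """A minimal fixed-size pool: connections are created up front
    and handed out / returned via a thread-safe queue."""

    def __init__(self, factory, size):
        self._pool = queue.Queue(maxsize=size)
        for _ in range(size):
            self._pool.put(factory())

    def acquire(self, timeout=None):
        # Blocks until a connection is free, so at most `size`
        # connections ever exist regardless of client count.
        return self._pool.get(timeout=timeout)

    def release(self, conn):
        self._pool.put(conn)

# Demo with a stand-in "connection" (just an integer id):
made = []
pool = ConnectionPool(lambda: made.append(1) or len(made), size=2)
c1 = pool.acquire()
c2 = pool.acquire()
pool.release(c1)
c3 = pool.acquire()          # reuses c1 instead of opening a new one
print(len(made), c3 == c1)   # 2 True
```

Production poolers (pgbouncer, pgpool) add health checks, timeouts, and transaction-level multiplexing on top of this basic shape.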
Oh, excellent. I knew I was about to reinvent the wheel.
Sometimes, there are just too many new things to keep up on.
Thank you so much!
On Thu, Nov 15, 2018 at 10:16 AM Adrian Klaver <adrian.klaver@aklaver.com>
wrote:
On 11/15/18 7:09 AM, Mark Moellering wrote:
So, I am working on some system designs for a web application, and I
wonder if there is any definitive answer on how to best connect to a
postgres database.
I could have it so that each time a query, or set of queries, for a
particular request, needs to be run, a new connection is opened, queries
are run, and then connection is closed / dropped.
OR, I could create a persistent connection that will remain open as long
as a user is logged in and then any queries are run against the open
connection.
I can see how, for only a few (hundreds to thousands) of users, the
latter might make more sense but if I need to scale up to millions, I
might not want all of those connections open.
Any idea of how much time / overhead is added by opening and closing a
connection every time?
Any and all information is welcome.
Connection pooling?
In no particular order:
http://www.pgpool.net/mediawiki/index.php/Main_Page
Thanks in advance
-- Mark M
--
Adrian Klaver
adrian.klaver@aklaver.com
On 15.11.2018 at 16:14, Andreas Kretschmer wrote:
On 15.11.2018 at 16:09, Mark Moellering wrote:
I can see how, for only a few (hundreds to thousands) of users, the
latter might make more sense but if I need to scale up to millions, I
might not want all of those connections open.
Consider a connection pooler like phbouncer.
typo, should be pgbouncer ;-)
Regards, Andreas
--
2ndQuadrant - The PostgreSQL Support Company.
www.2ndQuadrant.com