Looking for software to 'enqueue' connections
Hi!
I want to find software to 'enqueue' client connections to the database, so
that if I reach the max connection limit, the query is held in a queue until a
connection is released.
I have many devices (100+), each saving its state to the database every
minute, but the table is very large (more than 13,000,000 records) with
many indexes, so inserting one record takes 3 minutes or more.
So there are moments when the connection limit is reached :( and I lose
information.
I tried pgbouncer to 'enqueue' the connections but had no
success; maybe I am missing something...
By the way:
I use Postgres 9.4 with max_connections = 100,
and pgbouncer with max_connections set to 100 and reserve_pool_size = 50.
I hope you can help me...
thanks.
On 8/15/2016 1:30 PM, Edmundo Robles wrote:
I want to find software to 'enqueue' client connections to the
database, so that if I reach the max connection limit, the query is held in a
queue until a connection is released.
pgbouncer is the correct answer; you may need to play around with the
configuration a bit. There are a few modes that might work. Ideally,
write your apps to connect to Postgres, do a transaction, and
disconnect, and limit the pool size so only so many connections can be
active at a time. The other mode is to allow the clients to stay
connected to the pool, but have a limited number of actual database
connections that are allocated on a per-transaction basis.
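A minimal pgbouncer.ini sketch of that second mode (transaction pooling); the database name, host, and pool sizes here are placeholders, not values from this thread:

```ini
[databases]
; clients connect to pgbouncer as "devicedb"; the real server sits behind it
devicedb = host=127.0.0.1 port=5432 dbname=devicedb

[pgbouncer]
listen_addr = 0.0.0.0
listen_port = 6432
auth_type = md5
auth_file = /etc/pgbouncer/userlist.txt

; transaction pooling: a server connection is assigned only for the
; duration of a transaction, then returned to the pool
pool_mode = transaction

; allow many clients to stay connected and wait in pgbouncer's queue...
max_client_conn = 500

; ...but open only a small number of real server connections
default_pool_size = 20
```

With this setup, clients beyond the pool size are not rejected; they wait until a server connection frees up, which is the 'enqueue' behavior being asked about.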
I have many devices (100+), each saving its state to the database every
minute, but the table is very large (more than 13,000,000 records)
with many indexes, so inserting one record takes 3 minutes or more.
That sounds terrible. Single-row inserts shouldn't *ever* take 3
minutes, even with clients inserting a row a minute. You may need
faster disk storage, or you may need to improve your Postgres tuning.
'Many indexes'? How many? Too many indexes would definitely slow
inserts down.
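One way to see how many indexes the table carries, how big they are, and whether they are even being used is to query the statistics views; `device_state` below is a placeholder for the actual table name, which the thread never gives:

```sql
-- List each index on the table, its on-disk size, and how often
-- the planner has actually used it since statistics were reset.
SELECT indexrelname                                AS index_name,
       pg_size_pretty(pg_relation_size(indexrelid)) AS index_size,
       idx_scan                                     AS times_used
FROM pg_stat_user_indexes
WHERE relname = 'device_state'
ORDER BY pg_relation_size(indexrelid) DESC;
```

Indexes with `times_used = 0` over a long period are candidates for dropping, which directly reduces per-insert work.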
--
john r pierce, recycling bits in santa cruz
--
Sent via pgsql-general mailing list (pgsql-general@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-general
On 08/15/2016 01:30 PM, Edmundo Robles wrote:
To really help it would be nice to know the hardware specifications you
are working with:
CPU type and number.
RAM
Storage subsystem
Also some indication of what the load on your system as a whole is. I cannot
remember what your OS is, but information from something like top and
iostat would help. The reasoning being that fooling with connections may not be of
much help if the system is already running at its limits. In other
words, it is possible a hardware upgrade is what is needed.
thanks.
--
Adrian Klaver
adrian.klaver@aklaver.com
Hello.
pgpool-II also has a limit on the maximum number of connections, but extra connections will be queued instead of returning an error immediately.
But your configuration does not look optimal to me. Here are some things you may try:
1) Get rid of indexes. Use this table for OLTP, then denormalize the data and load it into an OLAP table, build indices there, and analyze it.
2) Find the bottleneck using your OS tools (is it I/O or CPU?) and improve the appropriate subsystem.
3) Use several servers (a multimaster configuration like https://wiki.postgresql.org/wiki/Bucardo).
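A sketch of idea 1, using a hypothetical narrow, index-free staging table (`device_state_staging`) to absorb the fast inserts, with rows drained in batches into an assumed indexed reporting table `device_state`; the table and column names are illustrative, not from this thread:

```sql
-- Index-free staging table: inserts here only touch the heap,
-- so they stay fast no matter how big the reporting table gets.
CREATE TABLE device_state_staging (
    device_id   integer     NOT NULL,
    state       text        NOT NULL,
    recorded_at timestamptz NOT NULL DEFAULT now()
);

-- Periodically (e.g. from cron) drain the staging table into the
-- heavily indexed reporting table in a single batch:
WITH moved AS (
    DELETE FROM device_state_staging
    RETURNING device_id, state, recorded_at
)
INSERT INTO device_state (device_id, state, recorded_at)
SELECT device_id, state, recorded_at FROM moved;

ANALYZE device_state;
```

Batching like this pays the index-maintenance cost once per batch instead of once per device per minute, and the data-modifying CTE works on the Postgres 9.4 mentioned in the thread.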
Ilya Kazakevich
JetBrains
http://www.jetbrains.com
The Drive to Develop
Adrian, I have it hosted at Rackspace, on Debian 7 with 2 GB of RAM.
John, the table has 8 constraints and 5 indexes.
Ilya, thanks for the tip, I will read up on OLTP.
On Mon, Aug 15, 2016 at 3:47 PM, Ilya Kazakevich <
Ilya.Kazakevich@jetbrains.com> wrote:
On 08/15/2016 01:59 PM, Edmundo Robles wrote:
Please do not top post:
https://en.wikipedia.org/wiki/Posting_style
The preferred style is bottom or interleaved, as it makes the thread
easier to follow.
Adrian, I have it hosted at Rackspace, on Debian 7 with 2 GB of RAM.
I assume that would be one of their virtual machines. The above is a
start, but what would be helpful is actual system load data from the
machine over time. As a start, something on the order of:
aklaver@panda:~> uptime
15:47pm up 9:30, 3 users, load average: 0.20, 0.35, 0.31
aklaver@panda:~> iostat 5
Linux 3.16.7-35-desktop (panda) 08/15/2016 _i686_ (3 CPU)
avg-cpu: %user %nice %system %iowait %steal %idle
4.79 0.03 3.04 1.52 0.00 90.62
Device: tps kB_read/s kB_wrtn/s kB_read kB_wrtn
sda 16.54 51.80 379.13 1780484 13032770
avg-cpu: %user %nice %system %iowait %steal %idle
2.69 0.00 3.23 0.07 0.00 94.01
Device: tps kB_read/s kB_wrtn/s kB_read kB_wrtn
sda 2.00 0.00 113.60 0 568
avg-cpu: %user %nice %system %iowait %steal %idle
0.74 0.00 2.84 5.67 0.00 90.75
Device: tps kB_read/s kB_wrtn/s kB_read kB_wrtn
sda 71.40 9.60 1786.40 48 8932
This is on my old desktop machine, so it is not strictly representative of
what you would see.
What we are looking for is a choke point. I am fairly certain that
connections are not it and that your problem lies further upstream.
Namely, that your machine (virtual or otherwise) does not have the system
resources (CPU, RAM, disk I/O) to keep up with the load you are placing
on it. Until that is resolved, anything you try to do downstream of the
system resources is not going to solve the problem.
John, the table has 8 constraints and 5 indexes.
Ilya, thanks for the tip, I will read up on OLTP.
--
Adrian Klaver
adrian.klaver@aklaver.com