ERROR: out of shared memory

Started by Sorin N. Ciolofan · about 19 years ago · 7 messages · general
#1 Sorin N. Ciolofan
ciolofan@ics.forth.gr

Hello!

I have to manage an application, written in Java, which calls another module,
also written in Java, that uses the PostgreSQL DBMS in a Linux environment.
I'm new to Postgres. The problem is that for large amounts of data the
application throws the following exception:

org.postgresql.util.PSQLException: ERROR: out of shared memory

Do you have any idea why this error appears and what I can do in order
to fix it?

Are there any Postgres-related parameters I should tune (if so, which ones),
or is this something related to the Linux OS?

Thank you very much

With best regards,

Sorin

#2 Tom Lane
tgl@sss.pgh.pa.us
In reply to: Sorin N. Ciolofan (#1)
Re: [GENERAL] ERROR: out of shared memory

"Sorin N. Ciolofan" <ciolofan@ics.forth.gr> writes:

I have to manage an application written in java which call another module
written in java which uses Postgre DBMS in a Linux environment. I'm new to
Postgres. The problem is that for large amounts of data the application
throws an:
org.postgresql.util.PSQLException: ERROR: out of shared memory

AFAIK the only very likely way to cause that is to touch enough
different tables in one transaction that you run out of lock entries.
While you could postpone the problem by increasing the
max_locks_per_transaction setting, I suspect there may be some basic
application misdesign involved here. How many tables have you got?

regards, tom lane
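For readers hitting the same error: the lock table lives in shared memory and is sized at server start, so max_locks_per_transaction cannot be changed at runtime. A minimal sketch of checking and raising it (the 256 is an illustrative value, not a recommendation):

```sql
-- Check the current setting (the default is 64 lock slots per transaction)
SHOW max_locks_per_transaction;

-- The setting cannot be changed with SET; edit postgresql.conf instead:
--   max_locks_per_transaction = 256   -- illustrative value
-- and then restart the server so the larger lock table can be allocated.
```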

#3 Merlin Moncure
mmoncure@gmail.com
In reply to: Tom Lane (#2)
Re: [GENERAL] ERROR: out of shared memory

On 3/26/07, Tom Lane <tgl@sss.pgh.pa.us> wrote:

"Sorin N. Ciolofan" <ciolofan@ics.forth.gr> writes:

I have to manage an application written in java which call another module
written in java which uses Postgre DBMS in a Linux environment. I'm new to
Postgres. The problem is that for large amounts of data the application
throws an:
org.postgresql.util.PSQLException: ERROR: out of shared memory

AFAIK the only very likely way to cause that is to touch enough
different tables in one transaction that you run out of lock entries.
While you could postpone the problem by increasing the
max_locks_per_transaction setting, I suspect there may be some basic
application misdesign involved here. How many tables have you got?

Or advisory locks... these are easy to spot: query pg_locks and look
for entries of locktype 'advisory'. I've already seen some apps in
the wild that use them; openads is one.

merlin
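The check Merlin describes can be written as a single query against the pg_locks system view:

```sql
-- Count held locks by type; a large 'advisory' count points at
-- application-level advisory locking rather than ordinary table locks
SELECT locktype, count(*)
FROM pg_locks
GROUP BY locktype
ORDER BY count(*) DESC;
```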

#4 Sorin N. Ciolofan
ciolofan@ics.forth.gr
In reply to: Tom Lane (#2)
Re: [GENERAL] ERROR: out of shared memory

Dear Mr. Tom Lane,

Thank you very much for your answer.
It seems that the legacy application creates tables dynamically, and the
number of tables created depends on the size of the application's input.
For the specific input which generated that error, I estimated about 4000
created tables.
Could this be the problem?

With best regards,
Sorin

-----Original Message-----
From: Tom Lane [mailto:tgl@sss.pgh.pa.us]
Sent: Tuesday, March 27, 2007 6:37 AM
To: Sorin N. Ciolofan
Cc: pgsql-general@postgresql.org; pgsql-admin@postgresql.org;
pgsql-performance@postgresql.org
Subject: Re: [GENERAL] ERROR: out of shared memory

"Sorin N. Ciolofan" <ciolofan@ics.forth.gr> writes:

I have to manage an application written in java which call another module
written in java which uses Postgre DBMS in a Linux environment. I'm new to
Postgres. The problem is that for large amounts of data the application
throws an:
org.postgresql.util.PSQLException: ERROR: out of shared memory

AFAIK the only very likely way to cause that is to touch enough
different tables in one transaction that you run out of lock entries.
While you could postpone the problem by increasing the
max_locks_per_transaction setting, I suspect there may be some basic
application misdesign involved here. How many tables have you got?

regards, tom lane

#5 Tom Lane
tgl@sss.pgh.pa.us
In reply to: Sorin N. Ciolofan (#4)
Re: [GENERAL] ERROR: out of shared memory

"Sorin N. Ciolofan" <ciolofan@ics.forth.gr> writes:

It seems that the legacy application creates tables dynamically and the
number of the created tables depends on the size of the input of the
application. For the specific input which generated that error I've
estimated a number of created tables of about 4000.
Could be this the problem?

If you have transactions that touch many of them within one transaction,
then yup, you could be out of locktable space. Try increasing
max_locks_per_transaction.

regards, tom lane
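As a rough capacity check (a sketch; the exact accounting varies by version), the shared lock table holds about max_locks_per_transaction * (max_connections + max_prepared_transactions) entries, shared across all sessions. With the defaults of 64 slots and 100 connections that is roughly 6400 entries, which one transaction touching ~4000 tables (each table, plus each of its indexes, needs a lock entry) can plausibly exhaust:

```sql
-- Rough estimate of total lock-table capacity, shared by all sessions
SELECT current_setting('max_locks_per_transaction')::int *
       (current_setting('max_connections')::int +
        current_setting('max_prepared_transactions')::int)
       AS approx_lock_table_slots;
```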

#6 Sorin N. Ciolofan
ciolofan@ics.forth.gr
In reply to: Tom Lane (#5)
Re: ERROR: out of shared memory

Dear Mr. Tom Lane,

From what I've read in the postgresql.conf file, I understand that each
unit increase of the max_locks_per_transaction parameter also increases the
shared memory used.
But the shared memory already appears to be fully consumed, according to the
error message; or is the error message misleading and inaccurate in this
situation?

With best regards,
Sorin

-----Original Message-----
From: Tom Lane [mailto:tgl@sss.pgh.pa.us]
Sent: Tuesday, March 27, 2007 4:59 PM
To: Sorin N. Ciolofan
Cc: pgsql-general@postgresql.org; pgsql-admin@postgresql.org;
pgsql-performance@postgresql.org
Subject: Re: [GENERAL] ERROR: out of shared memory

"Sorin N. Ciolofan" <ciolofan@ics.forth.gr> writes:

It seems that the legacy application creates tables dynamically and the
number of the created tables depends on the size of the input of the
application. For the specific input which generated that error I've
estimated a number of created tables of about 4000.
Could be this the problem?

If you have transactions that touch many of them within one transaction,
then yup, you could be out of locktable space. Try increasing
max_locks_per_transaction.

regards, tom lane

#7 Joseph Shraibman
jks@selectacast.net
In reply to: Sorin N. Ciolofan (#6)
Re: ERROR: out of shared memory

Try doing select * from pg_locks to see how many locks you have out.
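A slightly more detailed version of that check, aggregating rather than listing every lock (relation names can only be resolved for the current database; a sketch):

```sql
-- Total locks currently held across the cluster
SELECT count(*) AS total_locks FROM pg_locks;

-- Relation locks per table, heaviest first; the ::regclass cast
-- resolves OIDs to table names in the current database
SELECT relation::regclass AS table_name, count(*) AS locks
FROM pg_locks
WHERE locktype = 'relation'
GROUP BY relation
ORDER BY locks DESC
LIMIT 20;
```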