Advice on logging strategy

Started by Mike Martin · over 7 years ago · 6 messages · general
#1Mike Martin
redtux1@gmail.com

I have a question on logging strategy

I have logging set to log_statement = 'all' on a network database, with
logging set to CSV so I can import it into a logging table.

However, the database is populated via a nightly routine that downloads
data via a REST API, using prepared statements.

This results in enormous log files which take ages to import using COPY,
because each EXECUTE statement is logged with the parameters chosen.

Is there any way around this?

I can't find any way to filter DML statements.

thanks
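
[Editor's note: for context, a minimal sketch of the setup described above, assuming the CSV-logging arrangement from the PostgreSQL documentation; the file path and the `postgres_log` table name are illustrative.]

```sql
-- postgresql.conf (relevant settings for CSV logging of all statements)
-- logging_collector = on
-- log_destination = 'csvlog'
-- log_statement = 'all'

-- Nightly import of the CSV log into a logging table
-- (path is illustrative; postgres_log follows the column layout
-- given in the PostgreSQL docs for csvlog output)
COPY postgres_log FROM '/var/log/postgresql/postgresql.csv' WITH csv;
```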

#2Rob Sargent
robjsargent@gmail.com
In reply to: Mike Martin (#1)
Re: Advice on logging strategy


Do you want all the log lines in your logging table?
There was a thread yesterday (10.Oct.2018) on COPY which mentions the possibility of multiple processes COPYing to the same table.

#3Mike Martin
redtux1@gmail.com
In reply to: Rob Sargent (#2)
Re: Advice on logging strategy

I suppose the ideal would be to log the prepared statement once, and the
details only on error, rather than once per execution.


#4Jeff Janes
jeff.janes@gmail.com
In reply to: Mike Martin (#1)
Re: Advice on logging strategy


One option is to convert to using COPY ... FROM STDIN rather than prepared
INSERTs.

Another is to create a user specifically for bulk population, and do
`ALTER USER bulk_load SET log_statement = none` to override the global
log_statement setting.
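
[Editor's note: the second suggestion can be sketched as follows; the role name `bulk_load` comes from the message above, and the password handling is illustrative.]

```sql
-- Create a role used only by the nightly bulk-load job
CREATE USER bulk_load;

-- Per-role setting: statements run by this role are not logged,
-- overriding the global log_statement = 'all'
ALTER USER bulk_load SET log_statement = none;
```

Connections made as `bulk_load` then skip statement logging entirely, while all other sessions keep the global setting.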

Cheers,

Jeff

#5David Steele
david@pgmasters.net
In reply to: Mike Martin (#1)
Re: Advice on logging strategy


pgAudit (https://github.com/pgaudit/pgaudit) gives you fine-grained
control over what is logged by command type, table, or user, as well as a
lot more detail.
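
[Editor's note: a minimal pgAudit configuration sketch; the choice of audit classes here is illustrative, not a recommendation from the message above.]

```sql
-- postgresql.conf: pgAudit must be preloaded
-- shared_preload_libraries = 'pgaudit'

-- In the target database, after restarting:
CREATE EXTENSION pgaudit;

-- Log only DDL and role changes globally, so bulk DML stays out of the log
ALTER SYSTEM SET pgaudit.log = 'ddl, role';

-- Or scope auditing per role, e.g. full auditing for an app role only
ALTER ROLE app_user SET pgaudit.log = 'write, ddl';
```

This is the kind of per-command-class and per-user filtering that plain `log_statement` cannot express.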

--
-David
david@pgmasters.net

#6Mike Martin
redtux1@gmail.com
In reply to: David Steele (#5)
Re: Advice on logging strategy

Thanks!
