Tool for determining field usage of database tables
Does anyone know of a tool (or a way to use the database catalogs) that can analyze function code/queries accessing the database and pull out a list of the fields used in a set of tables? Basically, we are importing a lot of data from another source, and we are trying to determine what percentage of that data we are actually using at this point. We have hundreds of stored procedures, and combing through the code by hand would not be practical.
Thanks.
--Andy
Writing an audit trigger for the operations you'd like to monitor
(and then assigning it to all your application's tables) may be one
easy way of doing so. The trigger would log the operations to some
other table.
Allan.
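A rough sketch of what Allan suggests might look like the following (the table and trigger names are placeholders, not anything from an existing schema). One caveat: triggers only fire on INSERT/UPDATE/DELETE, never on plain SELECTs, so this captures write activity only; read activity would need something like statement logging instead.

```sql
-- Hypothetical audit table and generic trigger function.
CREATE TABLE audit_log (
    logged_at   timestamptz NOT NULL DEFAULT now(),
    table_name  text        NOT NULL,
    operation   text        NOT NULL
);

CREATE OR REPLACE FUNCTION log_table_usage() RETURNS trigger AS $$
BEGIN
    -- TG_TABLE_NAME and TG_OP are supplied automatically to triggers.
    INSERT INTO audit_log (table_name, operation)
    VALUES (TG_TABLE_NAME, TG_OP);
    RETURN NULL;  -- return value is ignored for AFTER triggers
END;
$$ LANGUAGE plpgsql;

-- Would need to be repeated (or scripted) for each monitored table:
CREATE TRIGGER my_table_usage
    AFTER INSERT OR UPDATE OR DELETE ON my_table
    FOR EACH STATEMENT EXECUTE PROCEDURE log_table_usage();
```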
Thanks, Allan, for the input - I guess I didn't specify enough details. I am looking for some type of tool/report that already exists. We have nearly 1000 tables and over 300 functions to look at, and a little over a day to provide the answers (all without dropping any other tasks, of course). I had considered the trigger idea, and may end up doing it anyway and just work late, but I thought I would check for a "ready-made" solution first.
Andy
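On the "use the database catalogs" angle: one crude approach is to grep the stored function bodies in pg_proc for each column name of a table. This is only a sketch ('my_table' is a placeholder) and is prone to false positives whenever a column name also appears as an ordinary word, but it needs no code changes and runs in seconds.

```sql
-- For each column of a given table, count how many function bodies
-- mention the column name anywhere in their source text.
SELECT a.attname,
       count(p.oid) AS functions_mentioning
FROM pg_attribute a
LEFT JOIN pg_proc p
       ON p.prosrc ILIKE '%' || a.attname || '%'
WHERE a.attrelid = 'my_table'::regclass
  AND a.attnum > 0
  AND NOT a.attisdropped
GROUP BY a.attname
ORDER BY functions_mentioning DESC;
```

Columns that come back with a zero count are at least candidates for "never referenced by any stored procedure."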
Nothing as fine-grained as per-field usage, but there are the
pg_stat_user_tables and pg_stat_user_indexes system views, which will
show how much individual tables and indexes are getting used.
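As a starting point, a query along these lines (assuming the statistics collector is enabled) lists the tables that have not been scanned at all since the last stats reset - likely candidates for imported data that nothing is reading:

```sql
-- Tables with no recorded sequential or index scans.
SELECT relname, seq_scan, idx_scan, n_tup_ins, n_tup_upd
FROM pg_stat_user_tables
WHERE coalesce(seq_scan, 0) + coalesce(idx_scan, 0) = 0
ORDER BY relname;
```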