Tool for determining field usage of database tables

Started by Andy Yoder · about 16 years ago · 4 messages · general
#1 Andy Yoder
ayoder@airfacts.com

Does anyone know of a tool (or a way to use the database catalogs) that can analyze function code/queries accessing the database to pull out a list of the fields used in a set of tables? Basically, we are importing a lot of data from another source, and we are trying to determine what percentage of the data we are actually using at this point. We have hundreds of stored procedures, and combing through the code by hand would not be practical.

Thanks.

--Andy
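A crude first pass against the catalogs might look something like the sketch below. It is only a plain-text match of column names against function bodies (pg_proc.prosrc), so substring collisions and commented-out code will produce false positives; treat a hit as a candidate, not a confirmation. The `public` schema is an assumption.

```sql
-- Heuristic sketch: list columns of user tables whose names appear
-- somewhere in the source text of a function in the same schema.
SELECT c.table_name, c.column_name
FROM information_schema.columns AS c
WHERE c.table_schema = 'public'
  AND EXISTS (
        SELECT 1
        FROM pg_proc AS p
        JOIN pg_namespace AS n ON n.oid = p.pronamespace
        WHERE n.nspname = 'public'
          AND p.prosrc ILIKE '%' || c.column_name || '%'  -- text match only
      );
```

Columns that come back with no match at all are the interesting ones: they are at least not referenced by name in any function body.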

#2 Allan Kamau
kamauallan@gmail.com
In reply to: Andy Yoder (#1)
Re: Tool for determining field usage of database tables

One easy way of doing so may be to write an audit trigger for the
operations you'd like to monitor and assign it to all of your
application's tables; this trigger would log the operations to some
other table.

Allan.
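A minimal sketch of such a trigger might look like this (the `audit_log` table, function, and trigger names are illustrative, and `some_table` stands in for each table you attach it to). One caveat: triggers fire only on writes (INSERT/UPDATE/DELETE), so purely read-side column usage would not be captured this way.

```sql
-- Log table the trigger writes into.
CREATE TABLE audit_log (
    logged_at  timestamptz NOT NULL DEFAULT now(),
    table_name text        NOT NULL,
    operation  text        NOT NULL
);

-- Generic trigger function: records which table was touched and how.
CREATE OR REPLACE FUNCTION audit_row_change() RETURNS trigger AS $$
BEGIN
    INSERT INTO audit_log (table_name, operation)
    VALUES (TG_TABLE_NAME, TG_OP);
    RETURN NULL;  -- return value is ignored for AFTER triggers
END;
$$ LANGUAGE plpgsql;

-- Attach to each table of interest:
CREATE TRIGGER audit_changes
    AFTER INSERT OR UPDATE OR DELETE ON some_table
    FOR EACH ROW EXECUTE PROCEDURE audit_row_change();
```

After the application has run for a while, aggregating `audit_log` by `table_name` shows which tables are actually being written to.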


#3 Andy Yoder
ayoder@airfacts.com
In reply to: Allan Kamau (#2)
Re: Tool for determining field usage of database tables

Thanks, Allan, for the input - I guess I didn't specify enough details. I am looking for some type of tool/report that already exists. We have nearly 1000 tables and over 300 functions to look at, with a little over a day to provide the answers (all without dropping any other tasks, of course). I had considered the trigger idea, and may end up doing it anyway and just working late, but thought I would check for a "ready-made" solution first.

Andy


#4 Scott Marlowe
scott.marlowe@gmail.com
In reply to: Andy Yoder (#3)
Re: Tool for determining field usage of database tables


Nothing as fine-grained as per-field, but there are the

pg_stat_user_tables and pg_stat_user_indexes

system views, which will show how much individual tables and
indexes are getting used.