We have a legacy production database. It has been in continuous development and use for 10 years, backing a website that has evolved over that time. The database is accessed by multiple technologies: Perl CGI, PHP, and Java webapps.
Lots of cruft has accumulated, and I wish to undertake a major refactoring. I am sure some parts are completely legacy (the obvious ones being tables with zero rows). To determine which parts are most used, my preferred strategy is to instrument the database itself rather than trawl through the very large number of components that might access it.
Does Oracle (10g) have the capability to put a trigger on each table to report when and how often it is accessed? Can someone point me to how to do this, or to some other mechanism that achieves the same goal?
(Please comment with suggestions for other strategies to aid a database refactoring in this scenario too.)
If you just want to know which tables are accessed (or even if you want more detail), the easiest way is Oracle's built-in AUDIT functionality; the Oracle documentation covers it in depth. Note that triggers won't help for reads — Oracle has no SELECT trigger on tables — which is another reason auditing is the right tool here.
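As a minimal sketch of what that looks like (table and schema names here are placeholders — substitute your own; `audit_trail` must be enabled and the instance restarted before records are written):

```sql
-- One-time setup: enable the audit trail, then restart the instance.
ALTER SYSTEM SET audit_trail = db SCOPE = SPFILE;

-- Audit every read and write on a table of interest, one row per access.
AUDIT SELECT, INSERT, UPDATE, DELETE ON hr.employees BY ACCESS;

-- Later, summarise when and how often each audited object was touched:
SELECT   obj_name,
         action_name,
         COUNT(*)        AS hits,
         MIN(timestamp)  AS first_seen,
         MAX(timestamp)  AS last_seen
FROM     dba_audit_object
GROUP BY obj_name, action_name
ORDER BY hits DESC;
```

You would repeat the `AUDIT` statement for each candidate table (it can be scripted from `dba_tables`). `BY ACCESS` records every statement; `BY SESSION` is cheaper if a per-session count is enough.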
Bear in mind that some database objects may only be used quarterly or even annually, so a short audit window can miss them. To be safe you really need a decent set of documentation or a suite of regression tests with 100% coverage. Of course, if you had either of those you probably wouldn't be asking this question here :)