I have a really big PostgreSQL database containing a lot of tables with complex relations between them (foreign keys, ON DELETE CASCADE and so on). I need to remove some data from a number of tables, but I'm not sure how much data will actually be deleted from the database due to cascading removals.
How can I make sure that I won't delete data that should not be deleted?
I have a test database - just a copy of the real one - where I can do whatever I want :)
The only idea I have is to dump the database before and after the deletes and compare the dumps, but that doesn't look very convenient. Another idea is to dump only the part of the database that I think should not be affected by my DELETE statements, and compare that part before and after the data removal. But I see no simple way to do it (there are hundreds of tables, and the removal should touch only ~10 of them). Is there some way to do it?
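Roughly, the before/after check I have in mind, expressed in plain SQL on the test copy, would look something like this (the table names here are made up; the rollback at the end undoes the deletes, so I can repeat the experiment):

begin;

-- row count of a table that should NOT be affected, before the delete
select count(*) from customers;

delete from orders where created_at < date '2010-01-01';

-- same count after the delete; it should be unchanged
select count(*) from customers;

rollback;  -- undo everything, including any cascaded deletes

Doing this by hand for every table that should stay untouched doesn't scale to hundreds of tables, though.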
Any other ideas on how to solve the problem?
You can query information_schema to get a picture of how the constraints are defined in the database. Then you'll know what is going to happen when you delete. This is useful not only for this case, but in general.
Something like this (for the referential constraints):
select ccu.table_catalog, ccu.table_schema, ccu.table_name, ccu.column_name, rc.*
from information_schema.constraint_column_usage ccu
join information_schema.referential_constraints rc
  -- join on the full constraint key; constraint names alone
  -- are not guaranteed to be unique across schemas
  on rc.constraint_catalog = ccu.constraint_catalog
 and rc.constraint_schema  = ccu.constraint_schema
 and rc.constraint_name    = ccu.constraint_name;
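To narrow that down to the cascading deletes you are worried about, referential_constraints has a delete_rule column, and joining in table_constraints gives you the referencing (child) table as well. A sketch along those lines, using only standard information_schema views:

-- list every ON DELETE CASCADE foreign key; the "child" table is the one
-- whose rows get removed when rows in the referenced "parent" table are deleted
select tc.table_schema   as child_schema,
       tc.table_name     as child_table,
       ccu.table_schema  as parent_schema,
       ccu.table_name    as parent_table,
       rc.constraint_name
from information_schema.referential_constraints rc
join information_schema.table_constraints tc
  on tc.constraint_catalog = rc.constraint_catalog
 and tc.constraint_schema  = rc.constraint_schema
 and tc.constraint_name    = rc.constraint_name
join information_schema.constraint_column_usage ccu
  on ccu.constraint_catalog = rc.constraint_catalog
 and ccu.constraint_schema  = rc.constraint_schema
 and ccu.constraint_name    = rc.constraint_name
where rc.delete_rule = 'CASCADE';

Running this once against the ~10 tables you plan to delete from will tell you exactly which other tables can lose rows as a side effect.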