At the financial company I work at there are various separate deployment environments, including a staging and a production environment. The data exposed by the databases of these environments can be modified using an in-house application. Currently, such data modifications can be applied to staging and production by anyone with the appropriate access rights. Audit compliance, however, requires that key personnel accept changes in the staging environment before they are pushed to the production environment.
To achieve this we will provide this key personnel with a user interface in which they can see the differences between a staging table and its production counterpart and accept them at the press of a button.
                   synch
    |Staging| ----------> |Production|
                    ^
                    |
        Acceptance by key personnel
My question is: how would one best determine and/or store the differences between the two environments' tables such that they can be pushed to production and (optionally) reverted when not accepted? So far I have come up with two solutions: (1) compare the two tables (or views of those tables) using SQL scripts or tooling like Red Gate's SQL Data Compare, or (2) store changes made to the staging table as diffs, similar to the diffs used by version control systems.
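To illustrate approach (1), the diff could be computed with plain set operators, something like the following sketch (the Customer table, its columns, and the linked server PROD pointing at the production database are just placeholders):

    -- Rows that exist in staging but not (or with different values) in production,
    -- i.e. the change set to push on acceptance:
    SELECT CustomerId, Name, CreditLimit
    FROM StagingDb.dbo.Customer
    EXCEPT
    SELECT CustomerId, Name, CreditLimit
    FROM PROD.FinanceDb.dbo.Customer;

    -- Rows that exist in production but not (or with different values) in staging,
    -- which would form the basis of a revert script:
    SELECT CustomerId, Name, CreditLimit
    FROM PROD.FinanceDb.dbo.Customer
    EXCEPT
    SELECT CustomerId, Name, CreditLimit
    FROM StagingDb.dbo.Customer;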
For both solutions I have thought of various implications, which I will not mention in this question, as I do not want to steer your thought process too much.
I would go with Red Gate SQL Data Compare, which I have used successfully in scenarios like the one you describe. With SQL Data Compare you can easily generate SQL scripts that work in both directions (apply and revert). A nice SQL Server feature to use here is database snapshots to compare against (available in the Developer and Enterprise editions).
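As a rough sketch of the snapshot idea (the database name, logical file name, and path are examples):

    -- Take a snapshot of production before the sync, both to diff
    -- against and to fall back on:
    CREATE DATABASE FinanceDb_PreSync
    ON ( NAME = FinanceDb_Data,  -- logical name of the source data file
         FILENAME = 'D:\Snapshots\FinanceDb_PreSync.ss' )
    AS SNAPSHOT OF FinanceDb;

    -- If the change set is rejected, the database can be reverted
    -- wholesale to the snapshot:
    -- RESTORE DATABASE FinanceDb
    --     FROM DATABASE_SNAPSHOT = 'FinanceDb_PreSync';

Note that reverting to a snapshot rolls back the entire database, not just the tables in question, so it suits this workflow only if the sync is the sole change in flight.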
However, there are some issues that I have run into: