Tags: sql, sql-server, sql-server-2005

How to copy schema and some data from SQL Server to another instance?


My product uses a SQL Server database; each client has their own instance deployed on their own intranet. The database has about 200 tables. Most of them are configuration tables with only a few rows, but there are a couple of transaction data tables which may have several million rows. I often need to troubleshoot a customer's configuration problem, so I need a copy of their database to work with locally on my dev system... However, it may be very large due to the transaction data, which makes it very difficult for the customer to send me a backup to work with. I need a way to back up/copy/export everything EXCEPT the large tables.

Ideally, the large tables really belong in a separate database catalog, such that a customer could back up their "data" separately from their "configuration". However, I don't have the authority to make that kind of change, so I'm looking for a good way to export or copy just the configuration parts without having to copy the whole thing.

Currently I have a quick-and-dirty .NET app which uses a SqlDataAdapter and DataSet to blindly SELECT * from all tables except the ones I want to omit, then dumps the result to an XML file (this is my poor man's export feature). And I have a companion app I can use locally to load the XML back into a DataSet and then use SqlBulkCopy to load it into my dev database. I had to do some tricks to disable constraints and so forth, but I made it work. This gets me the data, but not the schema... ideally I'd like a way to get the schema along with it. It's a brute-force approach, and I believe there must be a simpler way. Any suggestions for a better approach?

The solution needs to be automated since I'm dealing with end-users who generally have very little IT support.

(SQL 2005 and higher)


Solution

  • I solved it as follows. I created a custom WinForms app the customer runs to export their data:

    1. Use SMO (the Transfer class) to script the entire schema to a .sql file
    2. Loop through all specified tables (actually I just list the large-growth ones to ignore and process everything else) and run a plain SELECT * FROM against each one (see the sketch after this list)
    3. Capture the records into DataTable objects and dump them to .xml files
    4. Zip the whole folder up to be sent back to me
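
    For reference, here is a minimal sketch of the export side (steps 1-3). It assumes the SMO assemblies (Microsoft.SqlServer.Smo, Microsoft.SqlServer.ConnectionInfo, Microsoft.SqlServer.Management.Sdk.Sfc) are referenced; the server name, database name, and the table names to skip are all placeholders:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;
using Microsoft.SqlServer.Management.Smo;

class Exporter
{
    // Placeholder names for the large-growth tables to skip.
    static readonly string[] SkipTables = { "TransactionDetail", "AuditLog" };

    static void Main()
    {
        var server = new Server(@"localhost\SQLEXPRESS"); // placeholder instance
        Database db = server.Databases["CustomerDb"];     // placeholder database

        // Step 1: script the entire schema in dependency order.
        var transfer = new Transfer(db)
        {
            CopyAllObjects = true,
            CopySchema = true,
            CopyData = false
        };
        transfer.Options.WithDependencies = true; // objects in correct creation order
        transfer.Options.DriAll = true;           // include keys, constraints, indexes

        using (var writer = new StreamWriter("schema.sql"))
        {
            foreach (string statement in transfer.ScriptTransfer())
            {
                writer.WriteLine(statement);
                writer.WriteLine("GO"); // batch separator so the file runs in sqlcmd/SSMS
            }
        }

        // Steps 2-3: dump every remaining table to an .xml file.
        const string connStr =
            @"Server=localhost\SQLEXPRESS;Database=CustomerDb;Integrated Security=SSPI;";
        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();
            foreach (Table table in db.Tables)
            {
                if (table.IsSystemObject) continue;
                if (Array.IndexOf(SkipTables, table.Name) >= 0) continue;

                var data = new DataTable(table.Name);
                string sql = string.Format("SELECT * FROM [{0}].[{1}]",
                                           table.Schema, table.Name);
                new SqlDataAdapter(sql, conn).Fill(data);

                // WriteSchema embeds the column definitions so the importer
                // can rebuild each DataTable exactly.
                data.WriteXml(table.Name + ".xml", XmlWriteMode.WriteSchema);
            }
        }
    }
}
```

    Step 4 (zipping the folder) isn't shown; on the .NET 2.0/3.5 runtimes of that era it typically means a third-party library such as SharpZipLib, since System.IO.Compression.ZipFile only arrived in .NET 4.5.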

    And I have a custom WinForms app I use locally on my dev system to import the data:

    1. Create an empty database
    2. Run the .sql file to build the schema
    3. Disable all constraints on all tables (using sp_MSforeachtable)
    4. Loop through all the .xml files and bulk-import each one using a SqlBulkCopy object (see the sketch after this list)
    5. Re-enable all constraints (using sp_MSforeachtable)
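
    And a matching sketch of the import side. It assumes steps 1-2 have already been done (empty database created and schema.sql applied, e.g. via sqlcmd), that the .xml files were written with an embedded schema as above, and that all tables live in the dbo schema; the connection string and folder path are placeholders:

```csharp
using System.Data;
using System.Data.SqlClient;
using System.IO;

class Importer
{
    static void Main()
    {
        const string connStr =
            @"Server=localhost;Database=CustomerDbCopy;Integrated Security=SSPI;";
        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();

            // Step 3: disable every constraint so load order doesn't matter.
            Exec(conn, "EXEC sp_MSforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL'");

            // Step 4: bulk-load each table's XML dump.
            foreach (string file in Directory.GetFiles(@"C:\export", "*.xml"))
            {
                var table = new DataTable();
                table.ReadXml(file); // embedded schema rebuilds columns and table name

                using (var bulk = new SqlBulkCopy(conn, SqlBulkCopyOptions.KeepIdentity, null))
                {
                    bulk.DestinationTableName = "[dbo].[" + table.TableName + "]";
                    bulk.WriteToServer(table);
                }
            }

            // Step 5: re-enable and re-validate all constraints.
            Exec(conn, "EXEC sp_MSforeachtable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL'");
        }
    }

    static void Exec(SqlConnection conn, string sql)
    {
        using (var cmd = new SqlCommand(sql, conn))
            cmd.ExecuteNonQuery();
    }
}
```

    KeepIdentity preserves identity column values, which matters if the configuration tables reference each other by ID; sp_MSforeachtable is undocumented but has shipped with SQL Server for a long time.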

    This approach works well for my database, but I wouldn't necessarily advise it for every database design. In my case there are many "small" configuration tables and only a few "large" tables, with minimal relationships between them, so I can easily ignore the large ones. If the relationships were more complex, it would probably not be a good solution...

    Seems to be working great so far. I was hoping to find a good way to do this without writing custom code, but I got it working with only a few hours of dev time.

    FYI, I almost got it working with sqlpubwiz (the SQL Server Database Publishing Wizard), but its option to script data is all-or-nothing: you can't exclude specific tables, and that was a deal-breaker for my app. The Transfer class in SMO, on the other hand, is very useful; it takes only a few lines of code to script an entire database schema, including all dependencies in the correct creation order, so the database can be recreated from scratch!

    http://msdn.microsoft.com/en-us/library/microsoft.sqlserver.management.smo.transfer.aspx