Tags: sql, sql-server-2005, sql-update, large-data

How to update one table from another one without specifying column names?


I have two tables with identical structure and a VERY LARGE number of fields (about 1000). I need to perform two operations: 1) insert all rows from the second table into the first. Example:

INSERT INTO [1607348182]
SELECT * 
FROM _tmp_1607348182;

2) Update the first table from the second table, but I can't find the proper SQL syntax for the update.

Queries like:

Update [1607348182]
set [1607348182].* = tmp.*
from [1607348182]
inner join _tmp_1607348182 as tmp on tmp.recordid = [1607348182].recordid

or

Update [1607348182]
from [1607348182]
inner join _tmp_1607348182 as tmp on tmp.recordid = [1607348182].recordid

are invalid.


Solution

  • Not sure if you'll be able to accomplish this without using dynamic SQL to build out the update statement in a variable.
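
    The underlying issue is that T-SQL has no "set *": a set-based update needs every column named explicitly in the SET clause, which is exactly what the dynamic SQL has to generate. Just as a sketch of the target shape (col1, col2 are placeholder column names):

    update [1607348182]
    set col1 = tmp.col1,
        col2 = tmp.col2
        -- ...repeated for all ~1000 columns
    from [1607348182]
    inner join _tmp_1607348182 as tmp on tmp.recordid = [1607348182].recordid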

    This statement will return a list of columns based on the table name you put in:

    select name from syscolumns
    where [id] = (select [id] from sysobjects where name = 'tablename')
    

    Not sure if I can avoid a loop here... you'll need to load the results from above into a cursor and then build a query from it. Pseudo code:

    set @query = 'update [1607348182] set '
    load cursor --(we will use @name to hold the column name)
    while stillrecordsincursor
        set @query = @query + @name + ' = tmp.' + @name + ','
        load next value from cursor
    loop!
    

    When the query is done being built in the loop, use exec sp_executesql @query.

    Just a little warning... building dynamic SQL in a loop like this can get a bit confusing. For troubleshooting, put a select @query in the loop and watch @query get built.
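
    Putting the pieces together, a minimal sketch of the cursor approach might look like this (it assumes recordid is the join key and leaves it out of the SET list; table and column names are taken from the question):

    declare @query nvarchar(max), @name sysname
    set @query = N'update [1607348182] set '

    declare col_cursor cursor for
        select name from syscolumns
        where [id] = (select [id] from sysobjects where name = '1607348182')
          and name <> 'recordid'

    open col_cursor
    fetch next from col_cursor into @name
    while @@fetch_status = 0
    begin
        set @query = @query + quotename(@name) + N' = tmp.' + quotename(@name) + N','
        -- select @query   -- uncomment while troubleshooting to watch it grow
        fetch next from col_cursor into @name
    end
    close col_cursor
    deallocate col_cursor

    -- strip the trailing comma, add the join, and run it
    set @query = left(@query, len(@query) - 1)
        + N' from [1607348182] inner join _tmp_1607348182 as tmp'
        + N' on tmp.recordid = [1607348182].recordid'
    exec sp_executesql @query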

    edit: Not sure if you'll be able to do all 1000 columns in one update at once... there are limits (varchar(8000)?) on the size that @query can grow to. You may have to divide the code so it handles 50 columns at a time. Put the columns from the syscolumns select statement into a temp table with an id and build your dynamic SQL so it updates 20 (or 50?) columns at a time.
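
    If you do end up having to batch it, one way to get that id is to number the columns as you load them into the temp table and derive a batch number from it (the batch size of 50 here is just an example):

    -- assign each column to a batch of 50; then loop over the distinct
    -- batch_no values and build/execute one update statement per batch
    select name,
           (row_number() over (order by colid) - 1) / 50 as batch_no
    into #cols
    from syscolumns
    where [id] = (select [id] from sysobjects where name = '1607348182')
      and name <> 'recordid'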

    Another alternative would be to use Excel to mass-build this. Do the column select and copy the results into column A of a spreadsheet. Put '= in column B, tmp. in column C, copy column A into column D, and a comma into column E. Copy the entire spreadsheet into Notepad, and you should have the columns of the update statement built out for you. Not a bad solution if this is a one-shot event, but I'm not sure I'd rely on it as an ongoing solution.