Tags: c#, .net, connection, datareader, sqlcommand

DataReader and SQLCommand


As they say on the radio - long-time listener, first-time caller....

Here's my issue: VS 2005, SQL Server 2005 database, Windows Forms app, C#. Big table - 780K records - I'll call it the source table. I need to loop through the source table and, for each record, do something with another table, then write back to the source table that the record was completed. I haven't got as far as updating the second table yet...

I loop through all records of the source table with a DataReader on connection object A. For each record I build an UPDATE statement to mark that record as processed in the source table, and run it as a SqlCommand against connection object B. So, different connection objects, because I know the DataReader wants exclusive use of its connection. (A rough sketch of the pattern is below.)
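
Roughly what the loop looks like (table and column names here are simplified placeholders - SourceTable, Id, Processed - not the real schema):

    using System.Data.SqlClient;

    // Connection A feeds the reader; connection B runs the updates,
    // because the reader keeps its own connection busy.
    using (SqlConnection connA = new SqlConnection(connectionString))
    using (SqlConnection connB = new SqlConnection(connectionString))
    {
        connA.Open();
        connB.Open();

        SqlCommand readCmd = new SqlCommand(
            "SELECT Id FROM SourceTable WHERE Processed = 0", connA);

        using (SqlDataReader reader = readCmd.ExecuteReader())
        {
            while (reader.Read())
            {
                int id = reader.GetInt32(0);

                // ...work with the other table goes here...

                SqlCommand updateCmd = new SqlCommand(
                    "UPDATE SourceTable SET Processed = 1 WHERE Id = @id",
                    connB);
                updateCmd.Parameters.AddWithValue("@id", id);
                updateCmd.ExecuteNonQuery(); // times out after ~60 records
            }
        }
    }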

Here's the issue. After processing X records - where X seems to be about 60 - the update times out.

While writing this - funny how this happens, isn't it - my brain tells me this has to do with transaction isolation and/or locking. That is, I'm reading through the source records using a DataReader while changing those same records. I can see this causing problems under various transaction isolation levels, so I'll look into that...

Anyone seen this and know how to solve it?

Cheers

Pete


Solution

  • Without more detail, the possible solutions are numerous. As iivel noted, you could perform all of the activity within the database itself, if that is possible for the kind of operations you must perform. I would just add that it would very likely be best to do such a thing with set-based operations rather than cursors (see the first sketch below).

    If you must perform this operation outside of the database, then your source query can be executed without taking any locks at all, if that is a safe thing to do in your case. You can do that by setting the isolation level, or simply by appending WITH (NOLOCK) after your table name / alias in the query (second sketch below).

    There are several other options as well. For instance, you could operate in batches instead: pull, say, 1000 records at a time from the source table into memory, disconnect, and then perform whatever operations and updates you need. Once those 1000 records are processed, fetch the next 1000, and so on, until every record in the queue has been handled (third sketch below).
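
    A minimal sketch of the set-based route, assuming placeholder names (SourceTable, OtherTable, Id, Processed) standing in for the real schema - one UPDATE statement does the work of the 780K-iteration loop:

        using System.Data.SqlClient;

        // Placeholder schema: SourceTable.Processed flags completion, and
        // OtherTable stands in for whatever the per-record work touches.
        // One set-based UPDATE replaces the row-by-row loop entirely.
        using (SqlConnection conn = new SqlConnection(connectionString))
        {
            conn.Open();
            SqlCommand cmd = new SqlCommand(
                "UPDATE s SET s.Processed = 1 " +
                "FROM SourceTable s JOIN OtherTable o ON o.SourceId = s.Id " +
                "WHERE s.Processed = 0", conn);
            cmd.CommandTimeout = 300; // a single large update can run long
            cmd.ExecuteNonQuery();
        }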
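
    If you keep the reader loop, the NOLOCK change is just the SELECT side - the reader then takes no shared locks, so it neither blocks nor is blocked by the updates on connection B. Be aware that NOLOCK means dirty reads, so only do this if reading uncommitted data is acceptable:

        // Same query as the original loop, but reading without locks.
        SqlCommand readCmd = new SqlCommand(
            "SELECT Id FROM SourceTable WITH (NOLOCK) WHERE Processed = 0",
            connA);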
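
    And a sketch of the batching approach, again with placeholder names: a SqlDataAdapter pulls each chunk into a disconnected DataTable, so no reader is held open while the records are processed and updated:

        using System.Data;
        using System.Data.SqlClient;

        while (true)
        {
            DataTable batch = new DataTable();

            // Fill() opens and closes its connection by itself, so no
            // reader (and no read locks) survive into the update phase.
            using (SqlConnection readConn = new SqlConnection(connectionString))
            using (SqlDataAdapter adapter = new SqlDataAdapter(
                "SELECT TOP 1000 Id FROM SourceTable WHERE Processed = 0",
                readConn))
            {
                adapter.Fill(batch);
            }

            if (batch.Rows.Count == 0)
                break; // nothing left to process

            using (SqlConnection writeConn = new SqlConnection(connectionString))
            {
                writeConn.Open();
                foreach (DataRow row in batch.Rows)
                {
                    int id = (int)row["Id"];

                    // ...do the work against the other table here...

                    SqlCommand update = new SqlCommand(
                        "UPDATE SourceTable SET Processed = 1 WHERE Id = @id",
                        writeConn);
                    update.Parameters.AddWithValue("@id", id);
                    update.ExecuteNonQuery();
                }
            }
        }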