Tags: c#, sql, linq-to-sql, concurrency, rowversion

Disconnected LINQ Updates: rowversion vs. datetime with trigger?


We're using LINQ to SQL and WCF for a new middle tier, and we're using Data Transfer Objects for passing over the wire rather than the actual LINQ classes. I'm going to use one or the other of the methods outlined here - Linq Table Attach() based on timestamp or row version - to make sure that updates apply correctly and that concurrency is handled properly.

To save you folks some reading time: essentially you can either use a timestamp/rowversion column in your table, or have a datetime column with a default and an update trigger. Either way you get a column that receives a newly generated value on every insert or update, and that column is the one LINQ uses to check for concurrency.
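
To make the two options concrete, here's a rough T-SQL sketch of each (the dbo.Customer table, its CustomerId key, and the constraint/trigger names are hypothetical placeholders, not our actual schema):

    -- Option 1: rowversion column (older TIMESTAMP syntax, still valid on SQL Server 2005).
    -- SQL Server regenerates the value automatically on every insert and update.
    ALTER TABLE dbo.Customer ADD RowVer TIMESTAMP NOT NULL;
    GO

    -- Option 2: datetime column with a default for inserts plus a trigger for updates.
    ALTER TABLE dbo.Customer ADD CONSTRAINT DF_Customer_UpdatedWhen
        DEFAULT GETDATE() FOR UpdatedWhen;
    GO

    CREATE TRIGGER trg_Customer_UpdatedWhen ON dbo.Customer
    AFTER UPDATE
    AS
    BEGIN
        SET NOCOUNT ON;
        -- Stamp the rows that were just modified with the current time.
        UPDATE c
        SET UpdatedWhen = GETDATE()
        FROM dbo.Customer AS c
        INNER JOIN inserted AS i ON c.CustomerId = i.CustomerId;
    END
    GO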

My question is - which one is better? We already have datetime "UpdatedWhen" columns in many of our tables (but not all - don't ask), so we would either be adding the defaults and triggers to those, or adding a rowversion column to each table (we'd have to use the timestamp syntax for now, since we're still supporting SQL 2005 for a while). Either way we're modifying the DB to make it work, so I'd like to know whether there's a performance difference or any other important difference between these two alternatives. I've tried searching the web and here on SO, but no luck so far. Thanks.


Solution

  • I would lean towards using a timestamp (rowversion) column for concurrency checks. One - triggers would have some impact on performance, and two - with a datetime column you'll be limiting yourself to the precision of the datetime type in SQL Server and of DateTime in C#.

    MSDN:

    datetime values are rounded to increments of .000, .003, or .007 seconds...

    You may want to look at SO: Is there any difference between DateTime in c# and DateTime in SQL server? and MSDN: datetime (Transact-SQL) for more info.
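
    A quick way to see that rounding in action (a hypothetical snippet you can run against any SQL Server 2005+ instance):

        -- datetime has roughly 1/300-second accuracy, so literal values are
        -- rounded to the nearest .000, .003, or .007 millisecond increment.
        SELECT CAST('2009-06-01 12:00:00.001' AS datetime) AS Rounded1,  -- stored as 12:00:00.000
               CAST('2009-06-01 12:00:00.005' AS datetime) AS Rounded2;  -- stored as 12:00:00.007

    Two timestamps that land inside the same increment compare as equal after the round trip, which is part of why a rowversion makes a more reliable concurrency token.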