I want to make a multiple insert, so I will pass a set of data, and I don't know which method is better from a performance point of view:
- public static int Insert(List<EndServReward> reward) // list of objects
- public static int Insert(DataTable reward) // DataTable
Since neither of those actually shows any insert code, it is meaningless to comment. Generally speaking, the performance of any C# here is going to be vastly dwarfed by the latency of talking to an out-of-process database server that is presumably on a different machine on the LAN.

However! Yes, it is possible for the library implementation to make a difference. As examples, Entity Framework and NHibernate are both big, complicated products with lots of abstractions and complications, and as a result there can sometimes be measurable overheads. Other libraries are intentionally simple and lightweight, handling the 90% scenarios really efficiently and not bothering to support the hard 10%. But you don't indicate what library (if any) you are using, so we can't comment. Likewise, there are inbuilt overheads in things like DataTable, but you'd only notice them when dealing in non-trivial batch sizes.
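For illustration only, here is a minimal sketch of what a plain ADO.NET version of the list-based overload might look like; the table name, column names, EndServReward members, and connection string are all guesses, since the question doesn't show them. The point is that every row costs a network round-trip, and that matters far more than List<T> versus DataTable:

using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

// the question doesn't show EndServReward, so assume something like this:
public class EndServReward
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Description { get; set; }
}

public static class EndServRewardData
{
    private const string ConnectionString = "<your connection string>"; // assumption

    public static int Insert(List<EndServReward> reward)
    {
        using (var connection = new SqlConnection(ConnectionString))
        using (var cmd = new SqlCommand(
            "INSERT INTO SomeTable (Id, Name, Description) VALUES (@Id, @Name, @Description)",
            connection))
        {
            connection.Open();
            cmd.Parameters.Add("@Id", SqlDbType.Int);
            cmd.Parameters.Add("@Name", SqlDbType.NVarChar, 100);
            cmd.Parameters.Add("@Description", SqlDbType.NVarChar, 500);

            int count = 0;
            foreach (var item in reward)
            {
                // one network round-trip per row: that latency, not List<T> vs DataTable,
                // is what dominates the cost
                cmd.Parameters["@Id"].Value = item.Id;
                cmd.Parameters["@Name"].Value = item.Name;
                cmd.Parameters["@Description"].Value = (object)item.Description ?? DBNull.Value;
                count += cmd.ExecuteNonQuery();
            }
            return count;
        }
    }
}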
If you want the fastest possible performance, then dropping to SqlBulkCopy via an IDataReader will be the winner - but that is only worth doing if you are inserting lots of data: you wouldn't bother with that to insert 20 rows. But if you are inserting 2000 rows, sure! FastMember provides a property-mapped IDataReader implementation for you - for example:
using (var bcp = new SqlBulkCopy(connection))
// the names passed to ObjectReader.Create are the members to include; omit them to include all members
using (var reader = ObjectReader.Create(list, "Id", "Name", "Description"))
{
    bcp.DestinationTableName = "SomeTable";
    bcp.WriteToServer(reader);
}
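To tie that back to the question, a sketch of the list-based Insert overload built on the same approach might look like the following; it reuses the assumed EndServReward type and connection string from the earlier sketch, and the table and member names are still guesses:

// slots into the same class as the earlier sketch; requires "using FastMember;" and "using System.Data.SqlClient;"
public static int Insert(List<EndServReward> reward)
{
    using (var connection = new SqlConnection(ConnectionString))
    {
        connection.Open();
        using (var bcp = new SqlBulkCopy(connection))
        using (var reader = ObjectReader.Create(reward, "Id", "Name", "Description"))
        {
            bcp.DestinationTableName = "SomeTable";
            bcp.WriteToServer(reader);
        }
    }
    // SqlBulkCopy does not report a row count; the list length is the number of rows sent
    return reward.Count;
}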
Note that bulk-insert is not the way to go for every day-to-day insert into a database: again, it is aimed at a specific scenario (efficient insertion of large quantities of data).