We have a huge list (say 100,000 items) that needs to be converted to a DataTable for SqlBulkCopy.
What is the fastest way to do this without using a for loop? Currently we do it as shown below, where listDos is the list of objects:
var dataTable = new DataTable(dataTableName);
dataTable.Locale = CultureInfo.CurrentCulture;
var columns = new[]
{
    new DataColumn("Id", typeof(int)),
    new DataColumn("FkId", typeof(int)),
    new DataColumn("Status", typeof(string)),
    new DataColumn("RecordFrom", typeof(DateTime))
};
dataTable.Columns.AddRange(columns);
foreach (ObjectDo listDo in listDos)
{
    var row = dataTable.NewRow();
    if (listDo.Id != null) row["Id"] = listDo.Id;
    if (listDo.FkId != null) row["FkId"] = listDo.FkId;
    row["Status"] = listDo.Status;
    row["RecordFrom"] = listDo.RecordFrom;
    dataTable.Rows.Add(row);
}
return dataTable;
The fastest way would be: not to. SqlBulkCopy can take an IDataReader, and "FastMember" (on NuGet) can expose a List<T> as an IDataReader:
List<ObjectDo> listDos = ...
using (var bcp = new SqlBulkCopy(connection))
// ObjectReader streams the list as an IDataReader, one row at a time
using (var reader = ObjectReader.Create(listDos,
    "Id", "FkId", "Status", "RecordFrom"))
{
    bcp.DestinationTableName = "SomeTable";
    bcp.WriteToServer(reader);
}
This is now a direct reader over the list: the rows stream straight from the objects to the server, so the contents are never duplicated into an intermediate DataTable.
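If the member order passed to ObjectReader.Create does not match the destination table's column order, you can add explicit column mappings. Here is a minimal self-contained sketch of that approach; the table name "SomeTable", the connection string parameter, and the batch size are assumptions to adapt to your schema:

// Sketch only: assumes a destination table "SomeTable" whose column
// names match the ObjectDo member names below.
using System.Collections.Generic;
using System.Data.SqlClient;
using FastMember;

public static class BulkInsert
{
    public static void Write(List<ObjectDo> listDos, string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            using (var bcp = new SqlBulkCopy(connection))
            using (var reader = ObjectReader.Create(listDos,
                "Id", "FkId", "Status", "RecordFrom"))
            {
                bcp.DestinationTableName = "SomeTable";
                // Map source members to destination columns by name,
                // so ordinal position in the table no longer matters.
                bcp.ColumnMappings.Add("Id", "Id");
                bcp.ColumnMappings.Add("FkId", "FkId");
                bcp.ColumnMappings.Add("Status", "Status");
                bcp.ColumnMappings.Add("RecordFrom", "RecordFrom");
                // Optional: send rows in batches to reduce transaction
                // log pressure on very large lists (value is illustrative).
                bcp.BatchSize = 5000;
                bcp.WriteToServer(reader);
            }
        }
    }
}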