I'm importing an Excel file (only about 1000 records) into a dedicated SQL Server database. Since I need to work on the incoming data from Excel (adding a GUID per row, doing some data conversions), I want to process it row by row rather than bulk import it (I have nothing against transactions, though).
I'm confused about how to do this correctly. I can either use a SqlCommand with parameters, like so:
SqlCommand sqlCommand = new SqlCommand("insert into TestTable(GUID,Name,Pricing) values(@GUID,@Name,@Pricing)", sqlConn);
foreach (DataRow dr in ds.Tables[0].Rows) //<-- this is my Excel file to iterate through
{
    sqlCommand.Parameters.Clear();
    String refGUID = Guid.NewGuid().ToString();
    sqlCommand.Parameters.AddWithValue("@GUID", refGUID);
    sqlCommand.Parameters.AddWithValue("@Name", dr.ItemArray[0]);
    sqlCommand.Parameters.AddWithValue("@Pricing", dr.ItemArray[1]);
    sqlCommand.ExecuteNonQuery();
}
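For what it's worth, a transaction-wrapped variant of this first method would look roughly like so (a sketch only, assuming the same sqlConn and ds as above; nothing is committed until Commit() runs, and disposing without committing rolls everything back):

using (SqlTransaction tx = sqlConn.BeginTransaction())
{
    // Same parameterised insert, but enlisted in the explicit transaction.
    SqlCommand cmd = new SqlCommand(
        "insert into TestTable(GUID, Name, Pricing) values (@GUID, @Name, @Pricing)",
        sqlConn, tx);
    foreach (DataRow dr in ds.Tables[0].Rows)
    {
        cmd.Parameters.Clear();
        cmd.Parameters.AddWithValue("@GUID", Guid.NewGuid().ToString());
        cmd.Parameters.AddWithValue("@Name", dr.ItemArray[0]);
        cmd.Parameters.AddWithValue("@Pricing", dr.ItemArray[1]);
        cmd.ExecuteNonQuery();
    }
    tx.Commit(); // all rows become visible atomically
}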
Or I can use the "connected" mode like so:
SqlDataAdapter dataAdapter = new SqlDataAdapter("SELECT GUID, Name, Pricing FROM TestTable", sqlConn);
SqlCommandBuilder commandBuilder = new SqlCommandBuilder(dataAdapter);
DataSet myDataSet = new DataSet();
dataAdapter.Fill(myDataSet, "TestTable");
foreach (DataRow dr in ds.Tables[0].Rows) //<-- this is my Excel file to iterate through
{
    DataRow row = myDataSet.Tables[0].NewRow();
    row["GUID"] = Guid.NewGuid().ToString();
    row["Name"] = dr.ItemArray[0];
    row["Pricing"] = dr.ItemArray[1];
    myDataSet.Tables[0].Rows.Add(row);
    dataAdapter.Update(myDataSet);
}
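For contrast, the variant my second question below asks about would call Update only once, after the loop (same variables as above):

foreach (DataRow dr in ds.Tables[0].Rows)
{
    DataRow row = myDataSet.Tables[0].NewRow();
    row["GUID"] = Guid.NewGuid().ToString();
    row["Name"] = dr.ItemArray[0];
    row["Pricing"] = dr.ItemArray[1];
    myDataSet.Tables[0].Rows.Add(row);
}
dataAdapter.Update(myDataSet); // one Update call pushes all pending rows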
Now my questions are the following:
1. Is it better to fire a single INSERT command for each row (that would be the SqlCommand method), or is it better to fill a special DataSet (the 2nd method)? I guess it is rather "stupid" to fire 1000s of individual inserts at an SQL server?
2. dataAdapter.Update(myDataSet) <-- should I call that AFTER iterating through all Excel rows, or once per row (as shown in the example code above)? Would this magically create a transaction?
3. What happens to the DataSet when an error occurs while reading out the Excel file - is the update still pushed to the SQL server, or is everything lost?

You mention not wanting to do a bulk import - but have you considered SqlBulkCopy? Then there is no need for anything except .NET, yet you can still use the fastest/most direct import.
This will accept a DataTable, so you can prepare your data in a DataTable and then pull the trigger. Transactions are optionally supported IIRC. For larger data you can also implement IDataReader to provide fully streaming upload (while still processing each row in transit).
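A minimal sketch of the SqlBulkCopy route, assuming the same ds (the Excel data) and open sqlConn from the question, a destination table TestTable(GUID, Name, Pricing), and System.Data.SqlClient; UseInternalTransaction is one way to get the optional transaction behaviour:

// Stage the processed rows in a DataTable first.
DataTable staging = new DataTable();
staging.Columns.Add("GUID", typeof(string));
staging.Columns.Add("Name", typeof(string));
staging.Columns.Add("Pricing", typeof(decimal));

foreach (DataRow dr in ds.Tables[0].Rows)
{
    // Per-row work (GUID, data conversions) still happens here.
    staging.Rows.Add(
        Guid.NewGuid().ToString(),
        dr.ItemArray[0],
        Convert.ToDecimal(dr.ItemArray[1]));
}

// With the default batch size, all rows go in one batch, so
// UseInternalTransaction makes the copy all-or-nothing.
using (SqlBulkCopy bulk = new SqlBulkCopy(
    sqlConn, SqlBulkCopyOptions.UseInternalTransaction, null))
{
    bulk.DestinationTableName = "TestTable";
    bulk.ColumnMappings.Add("GUID", "GUID");
    bulk.ColumnMappings.Add("Name", "Name");
    bulk.ColumnMappings.Add("Pricing", "Pricing");
    bulk.WriteToServer(staging);
}

For the streaming IDataReader route, a library such as FastMember (its ObjectReader) can wrap a sequence of objects so each row is processed in transit instead of materialising the whole DataTable first.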