I am trying to import data from an XML file into a SQL Server CE database. I use ErikEJ's SQL Server Compact Bulk Insert Library (from NuGet; also on CodePlex). I create the database and table, then read the XML into a DataTable and import that DataTable into the DB table.
DataSet ds = new DataSet();
ds.ReadXml("myxml.xml");
DataTable table = ds.Tables[0];
String connString = @"Data Source = test.sdf";
SqlCeBulkCopy bulkInsert = new SqlCeBulkCopy(connString);
bulkInsert.DestinationTableName = "testtable";
bulkInsert.WriteToServer(table);
It works on a small XML file, but when I use a large one (more than 1 GB) I get this error on ReadXml:
"System.OutOfMemoryException" in mscorlib.dll
How can I fix this?
Update: I know this error occurs because the XML is so large. The question is how to optimize the algorithm, maybe by using a buffer or reading the XML part by part. Any ideas?
There is no simple library that will solve this for you.
You need to read the XML file in a streaming fashion (see Reading Xml with XmlReader in C#) so you never load the entire document into memory. For each record element you read, add a row to a DataTable; once you have accumulated a batch of, say, 100,000 rows, bulk insert that batch, clear the table so the rows can be collected, and continue until the whole file has been processed. A sketch of this approach follows below.
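Here is a minimal sketch of that approach. It assumes the repeating record element in myxml.xml is named "row" and that testtable has two string columns, col1 and col2 (placeholder names; substitute your real element and column names). SqlCeBulkCopy comes from the ErikEJ library above; in recent versions its namespace is ErikEJ.SqlCe, so check the version you have installed.

using System.Data;
using System.Xml;
using System.Xml.Linq;
using ErikEJ.SqlCe;

class Program
{
    static void Main()
    {
        const int batchSize = 100000; // rows per bulk insert; tune to available memory

        // Build an empty DataTable whose columns match testtable.
        // "col1" and "col2" are placeholder names - use your real schema.
        DataTable batch = new DataTable();
        batch.Columns.Add("col1", typeof(string));
        batch.Columns.Add("col2", typeof(string));

        using (XmlReader reader = XmlReader.Create("myxml.xml"))
        using (SqlCeBulkCopy bulkInsert = new SqlCeBulkCopy(@"Data Source = test.sdf"))
        {
            bulkInsert.DestinationTableName = "testtable";

            reader.MoveToContent();
            while (!reader.EOF)
            {
                if (reader.NodeType == XmlNodeType.Element && reader.Name == "row")
                {
                    // XNode.ReadFrom materializes only the current <row> element
                    // and advances the reader past it, so memory use stays flat.
                    XElement record = (XElement)XNode.ReadFrom(reader);
                    batch.Rows.Add((string)record.Element("col1"),
                                   (string)record.Element("col2"));

                    if (batch.Rows.Count >= batchSize)
                    {
                        bulkInsert.WriteToServer(batch);
                        batch.Clear(); // drop the flushed rows before reading more
                    }
                }
                else
                {
                    reader.Read();
                }
            }

            // Flush whatever is left over from the last partial batch.
            if (batch.Rows.Count > 0)
            {
                bulkInsert.WriteToServer(batch);
            }
        }
    }
}

Clearing the DataTable between batches is what keeps memory bounded: at any moment only one batch of rows plus the single record being parsed are alive, instead of the whole 1 GB document.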
In addition, the SqlCeBulkCopy instance should be wrapped in a using block so its unmanaged resources are disposed (as the sketch above already does):
using (SqlCeBulkCopy bulkInsert = new SqlCeBulkCopy(connString))
{
bulkInsert.DestinationTableName = "testtable";
bulkInsert.WriteToServer(table);
}