I have to restore some databases to a remote SQL Server. The problem is that they can be up to a few GB, which causes an "OutOfMemoryException" when reading all the bytes at once. Smaller databases work without problems.
Now I am trying to stream the database to the SQL Server in chunks:
using (System.Data.SqlClient.SqlCommand command = conn.CreateCommand())
{
    // The first chunk is INSERTed; every following chunk is appended via UPDATE ... .WRITE.
    command.CommandText = "INSERT INTO " + dbName + ".dbo.tempTable (blob) VALUES (@data)";
    command.CommandTimeout = 900;

    SqlParameter dataParam = new SqlParameter("@data", SqlDbType.VarBinary);
    command.Parameters.Add(dataParam);
    SqlParameter offsetParam = new SqlParameter("@offset", SqlDbType.BigInt);
    command.Parameters.Add(offsetParam);
    SqlParameter lengthParam = new SqlParameter("@len", SqlDbType.BigInt);
    command.Parameters.Add(lengthParam);

    using (FileStream fs = new FileStream(fileSource, FileMode.Open, FileAccess.Read, FileShare.Read))
    {
        byte[] buffer = new byte[8192];
        int read;
        long offset = 0; // long, so files larger than 2 GB do not overflow the counter
        while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
        {
            dataParam.Value = buffer;
            dataParam.Size = read; // send only the bytes actually read, not the whole buffer
            offsetParam.Value = offset;
            lengthParam.Value = read;
            command.ExecuteNonQuery();
            if (offset == 0)
            {
                // After the first chunk, switch to appending with the .WRITE clause.
                command.CommandText = "UPDATE " + dbName + ".dbo.tempTable SET blob.WRITE(@data, @offset, @len)";
            }
            offset += read;
        }
    }
}
Now my problem:
A 500 MB database takes more than 15 minutes and then stops working with this exception:
System.Data.Common.DbException: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
Thanks in advance,
Relax
I have solved it myself now. The problem was the small chunk size: at roughly 8 KB per request, a 520 MB database takes more than 60,000 round trips, one ExecuteNonQuery call each. I have changed the buffer size to 8 MB now.
byte[] buffer = new byte[8388608];
This works well now ;)
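To see why the larger buffer helps, here is a quick back-of-the-envelope calculation (my own illustration, not from the original post) comparing the number of ExecuteNonQuery round trips for a 520 MB file at the two chunk sizes:

```csharp
using System;

class ChunkMath
{
    static void Main()
    {
        // 520 MB file, as in the answer above.
        long fileSize = 520L * 1024 * 1024;          // 545,259,520 bytes

        // Round up: a partial final chunk still costs one request.
        long smallChunks = (fileSize + 8192 - 1) / 8192;          // 8 KB buffer
        long largeChunks = (fileSize + 8388608 - 1) / 8388608;    // 8 MB buffer

        Console.WriteLine(smallChunks);  // 66560 requests
        Console.WriteLine(largeChunks);  // 65 requests
    }
}
```

So the 8 MB buffer cuts the per-request overhead (network round trip, statement execution, transaction log write) by a factor of about 1,000, which is why the same upload now finishes well inside the 900-second CommandTimeout.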