var h = (HttpWebRequest)WebRequest.Create(url);
using (var hr = (HttpWebResponse)(await h.GetResponseAsync()))
using (var s = hr.GetResponseStream())
using (var f = new FileStream(saveTo, FileMode.Create, FileAccess.Write, FileShare.None))
{
    int bytesCount = 0;
    byte[] buf = new byte[2048]; // <-- what size should this buffer be?
    while ((bytesCount = await s.ReadAsync(buf, 0, buf.Length)) > 0)
    {
        await f.WriteAsync(buf, 0, bytesCount); // was "bytesSize", which doesn't compile
        // Update UI: downloaded size, percent, ...
    }
}
I'm writing a downloader that updates the UI (an ObservableCollection of thousands of items, batch download) as download progress changes, and supports resuming downloads, but not multi-segment downloads (each item's size is usually < 10 MB).
I run about 5-20 downloads concurrently. What buffer size is suitable for this case (good for both UI updates and download throughput)?
You want to use a buffer size that is a multiple of the OS page size, because that is the granularity for writes to disk and pages in memory. Using anything smaller than an OS page size will be suboptimal.
OS pages are generally 4096 bytes. The default buffer size for a FileStream, used if no buffer size is provided during its construction, is also 4096 bytes.
For disk I/O it is generally preferable to have a buffer that is somewhat larger (32-128 KB).
In your scenario, with a maximum of 20 concurrent downloads, a buffer size of 32 or 64 KB would only require 640 KB or 1.25 MB of memory in total, so those are clearly viable options.
Let's assume you are in the USA, where average download speeds are 23 Mbps for broadband and 12 Mbps for mobile. Even with 64 KB buffers (1.25 MB across 20 concurrent downloads), each download would fill its buffer roughly a couple of times per second, so you could still update the UI for each of the 20 downloads at that rate.
So, use at least 32 - 64 KB buffers.
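Concretely, applied to the loop from the question, this just means sizing the read buffer and the FileStream's internal buffer accordingly. A sketch, assuming `saveTo` and the response stream `s` from the question's code:

```csharp
const int BufferSize = 64 * 1024; // 64 KB, a multiple of the 4 KB page size

// Passing BufferSize to the constructor overrides FileStream's 4 KB default
// internal buffer, so reads and writes move in 64 KB chunks end to end.
using (var f = new FileStream(saveTo, FileMode.Create, FileAccess.Write,
                              FileShare.None, BufferSize))
{
    byte[] buf = new byte[BufferSize];
    int bytesCount;
    while ((bytesCount = await s.ReadAsync(buf, 0, buf.Length)) > 0)
    {
        await f.WriteAsync(buf, 0, bytesCount);
        // Update UI: downloaded size, percent, ...
    }
}
```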
One thing to take care of is not to continuously allocate new byte buffers, but to recycle these fixed-size buffers by using a buffer pool.
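For example, `System.Buffers.ArrayPool<byte>` provides a shared pool (on older frameworks it is available via the System.Buffers NuGet package). A sketch, with a hypothetical helper wrapping the copy loop:

```csharp
using System;
using System.Buffers;
using System.IO;
using System.Threading.Tasks;

// Rent a buffer from the shared pool instead of allocating a fresh array
// per download; return it in a finally block so it is reusable even on error.
static async Task CopyWithPooledBufferAsync(Stream source, Stream destination)
{
    byte[] buf = ArrayPool<byte>.Shared.Rent(64 * 1024); // may return a larger array
    try
    {
        int bytesCount;
        while ((bytesCount = await source.ReadAsync(buf, 0, buf.Length)) > 0)
        {
            await destination.WriteAsync(buf, 0, bytesCount);
        }
    }
    finally
    {
        ArrayPool<byte>.Shared.Return(buf);
    }
}
```

With 5-20 concurrent downloads, the pool caps total buffer memory at roughly the number of simultaneously active copies rather than the number of queued items.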