I'm trying to read a whole file's bytes into memory; for now I'm doing nothing with the data. Reading the file with FileInfo.OpenRead wrapped in a StreamReader is very slow: about 30 seconds, compared to only ~4 seconds to copy the same file from one hard drive to another in Explorer. Why is reading it into memory so much slower? The example code:
using System;
using System.IO;

namespace ConsoleApp1 {
    internal class Program {
        static void Main(string[] args) {
            string multiGigFile = @"path\to\my\4gb\file";
            FileInfo fi = new(multiGigFile);
            using (StreamReader sr = new(fi.OpenRead())) {
                int bufferSize = 104857600; // 100MB read buffer
                int bytesRead;
                char[] buffer = new char[bufferSize];
                // Takes ~30 secs to finish the while loop, vs ~4s to copy the file between drives
                while ((bytesRead = sr.Read(buffer, 0, bufferSize)) > 0) {
                    Console.WriteLine($"File read; read next {bytesRead} bytes...");
                }
                Console.WriteLine($"File read; finished.");
            }
        }
    }
}
The StreamReader was adding a lot of overhead, converting bytes to chars presumably. Reading directly from the FileStream sped things up quite a lot.
using System;
using System.IO;

namespace ConsoleApp1 {
    internal class Program {
        static void Main(string[] args) {
            int bufferSize = 104857600; // 100MB read buffer
            string multiGigFile = @"path\to\my\4gb\file";
            FileInfo fi = new(multiGigFile);
            using (FileStream fs = fi.OpenRead()) {
                int bytesRead;
                byte[] buffer = new byte[bufferSize];
                while ((bytesRead = fs.Read(buffer, 0, bufferSize)) > 0) {
                    Console.WriteLine($"File read; read next {bytesRead} bytes...");
                }
                Console.WriteLine($"File read; finished.");
            }
        }
    }
}
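One further tweak that may be worth benchmarking is hinting the OS that you will read the file front to back once, via FileOptions.SequentialScan, which enables read-ahead. This is a sketch, not from the original post: it assumes .NET 6+ (for FileStreamOptions), and it writes a small temporary file so it can run standalone; with the real multi-gigabyte file you would just use its path.

using System;
using System.IO;

internal class Program {
    static void Main() {
        // Create a small demo file so the example is self-contained;
        // replace this with the path to your actual large file.
        string path = Path.Combine(Path.GetTempPath(), "read-demo.bin");
        File.WriteAllBytes(path, new byte[5_000_000]);

        // SequentialScan hints the OS to use read-ahead caching,
        // which can help for a single front-to-back pass over a big file.
        var options = new FileStreamOptions {
            Mode = FileMode.Open,
            Access = FileAccess.Read,
            Options = FileOptions.SequentialScan,
            BufferSize = 1 << 20 // 1 MB internal buffer
        };
        long total = 0;
        using (var fs = new FileStream(path, options)) {
            byte[] buffer = new byte[1 << 20];
            int n;
            while ((n = fs.Read(buffer, 0, buffer.Length)) > 0) {
                total += n;
            }
        }
        Console.WriteLine($"Read {total} bytes.");
        File.Delete(path);
    }
}

Note that if the goal really is the entire file in memory at once, File.ReadAllBytes(path) does this loop for you, but it returns a single array, so it cannot handle a 4 GB file (a byte[] is capped at roughly 2 GB).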