I'm reading huge CSV files (about 350K lines per file) like this:
using (StreamReader readFile = new StreamReader(fi))
{
    string line;
    string[] row;
    readFile.ReadLine(); // skip the header row
    while ((line = readFile.ReadLine()) != null)
    {
        row = line.Split(';');
        x = row[1];
        y = row[2];
        // More code and assignments here...
    }
}
The problem is that reading such a huge file line by line, for every day of the month, may be slow, and I suspect there must be a faster way to do it.
Method 1
Using LINQ:
var lines = File.ReadLines("FilePath");
var csv = from line in lines
          select line.Split(';');
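Note that File.ReadLines enumerates the file lazily (unlike File.ReadAllLines, which loads the whole file into memory at once), so no work happens until the query is iterated. A minimal sketch of consuming it, assuming the same columns as in the original loop:

foreach (var row in csv)
{
    x = row[1];
    y = row[2];
    // More code and assignments here...
}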
Method 2
As Jay Riggs stated here:

Here's an excellent class that will copy CSV data into a DataTable, using the structure of the data to create the DataTable:
A portable and efficient generic parser for flat files
It's easy to configure and easy to use. I urge you to take a look.
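For reference, usage looks roughly like this (a hedged sketch based on the GenericParsing library from the article above; the exact property names may differ between versions, so check its documentation):

using System.Data;
using GenericParsing;

using (GenericParserAdapter parser = new GenericParserAdapter("FilePath"))
{
    // Assumptions: ColumnDelimiter and FirstRowHasHeader exist as shown.
    parser.ColumnDelimiter = ';';
    parser.FirstRowHasHeader = true;
    DataTable table = parser.GetDataTable();
}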
Method 3
Rolling your own CSV reader is a waste of time unless the files that you're reading are guaranteed to be very simple. Use a pre-existing, tried-and-tested implementation instead.
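For example, TextFieldParser ships with .NET in the Microsoft.VisualBasic assembly and already handles delimiters and quoted fields. A minimal sketch for the semicolon-delimited files above (the file path is a placeholder):

using Microsoft.VisualBasic.FileIO;

using (TextFieldParser parser = new TextFieldParser("FilePath"))
{
    parser.TextFieldType = FieldType.Delimited;
    parser.SetDelimiters(";");
    parser.ReadFields(); // skip the header row
    while (!parser.EndOfData)
    {
        string[] row = parser.ReadFields();
        // row[1], row[2], ... as in the original loop
    }
}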