I am struggling with a problem with a large array. I need to read lots of CSV files and work with them (build tables and save them). I am trying it this way:
String[,,] pole = new string[5000, 10251, 100];
...
String[] proz = File.ReadAllText(@"../../History/201" + r + "-" + m1 + m2 + "-" + d1 + d2 + "_00/variables_ens.csv").Split(';');
for (int k = 0; k < 10251; k++)
{
    int l = k / 99;
    int lk = l * 99;
    int b = k - lk;
    pole[n, l, b] = proz[k];
}
But on the first line
String[,,] pole = new string[5000, 10251, 100];
It throws an out-of-range exception, but I do need that much space. When I write
String[,,] pole = new string[100, 10251, 100];
It works, but only for a hundred of those files. Any suggestions, please?
Well, I think you need to rethink your whole strategy and implement streaming via IEnumerable<T> and File.ReadLines.
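For example, something along these lines (a minimal sketch only; the ';' separator comes from your snippet, while ReadValues and everything else is illustrative):

using System.Collections.Generic;
using System.IO;

// Streams the values of a single CSV lazily, one line at a time,
// so the whole file never has to sit in memory at once.
static IEnumerable<string> ReadValues(string path)
{
    foreach (string line in File.ReadLines(path))   // lazy: reads line by line
    {
        foreach (string value in line.Split(';'))
        {
            yield return value;
        }
    }
}

You can then foreach over ReadValues(path) and process each value as it arrives, instead of materialising a 5000 x 10251 x 100 array up front.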
I don't have enough information to understand completely what you're trying to achieve with your current implementation, but I'm pretty sure the data you're trying to manipulate is far too large to be processed entirely in memory.
What do you need to do? What data are you manipulating? 5000 files of 1025100 lines each? Does every file have the same size? Do you REALLY, REALLY need to load everything at once?
The answers to this question should point you in the right direction. I would say you need to use streaming to efficiently load the data into some more manageable form (like a database).
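Something like this is what I have in mind (again just a sketch; historyFolder, the search pattern and SaveTable are placeholders for your actual folder layout and output step):

using System.IO;
using System.Linq;

// Process the 5000 files one at a time: at most one file's values are in memory,
// and each result is saved before the next file is read.
foreach (string path in Directory.EnumerateFiles(historyFolder, "variables_ens.csv", SearchOption.AllDirectories))
{
    string[] values = File.ReadLines(path)
                          .SelectMany(line => line.Split(';'))
                          .ToArray();

    SaveTable(path, values);   // placeholder: write the table to disk or a database, then forget it
}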