I am testing how big a collection can be in .NET. Technically, any collection object should be able to grow to the size of the physical memory.
I then tested the following code on a server with 16 GB of memory, running Windows Server 2003 and Visual Studio 2008. I tested both F# and C# code, and watched Task Manager while it ran. After the program had grown to roughly 2 GB of memory, it crashed with an out-of-memory exception. I did set the target platform to x64 in the property page.
open System.Collections.Generic
let d = new Dictionary<int, int>()
for i = 1 to 1000000000 do
    d.Add(i, i)
I ran the same test against the C5 collection library. There, the dictionary could use up the whole memory. The C5 version of the code:
let d = C5.HashDictionary<int, int>()
for i = 1 to 1000000000 do
    d.Add(i, i)
Does anyone know why?
The Microsoft CLR has a 2 GB maximum object size limit, even in the 64-bit version. (I'm not sure whether this limit is also present in other implementations such as Mono.)
The limitation applies to each single object -- not the total size of all objects -- so it's relatively easy to work around using a composite collection of some sort. In your example it is the Dictionary's single internal entries array that crosses the limit, not the total memory used by the process.
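For example, here is a minimal sketch of that idea (the ShardedDictionary type and the shard count of 64 are just illustrative, not from the linked code):

open System.Collections.Generic

// Spreads entries over several ordinary Dictionary instances so that no
// single backing array has to grow past the 2 GB per-object limit.
type ShardedDictionary<'V>(shardCount : int) =
    let shards = Array.init shardCount (fun _ -> Dictionary<int, 'V>())
    let shardFor (key : int) = shards.[abs (key % shardCount)]
    member this.Add(key : int, value : 'V) = (shardFor key).Add(key, value)
    member this.Item with get (key : int) = (shardFor key).[key]
    member this.Count = shards |> Array.sumBy (fun s -> s.Count)

let d = ShardedDictionary<int>(64)
for i = 1 to 100000000 do      // 100 million entries, split across 64 shards
    d.Add(i, i)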
There's a discussion and some example code here...
There seems to be very little official documentation that refers to this limit. It is, after all, just an implementation detail of the current CLR. The only mention that I'm aware of is on this page:
When you run a 64-bit managed application on a 64-bit Windows operating system, you can create an object of no more than 2 gigabytes (GB).
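As a quick way to see that it really is a per-object limit rather than a per-process one, a single oversized array allocation fails on its own. (This is a sketch assuming the same 64-bit CLR described in the question; the array size is just an arbitrary value over 2 GB.)

// 300 million int64 elements is about 2.4 GB in one contiguous object.
// This throws System.OutOfMemoryException even in a 64-bit process
// with 16 GB of RAM, because the 2 GB limit applies per object.
let big : int64[] = Array.zeroCreate 300000000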