
.NET stack memory limit


I am using C#, .NET 4.0, 64-bit. I need to store 500 million "data points" in memory for use in computations, and I need to decide whether to create them as struct or class objects. Structs seem so much faster.

Is there a memory limit for the stack? If so, how can it be adjusted?

Will storing so much data on a stack affect the overall performance of the system?

(By the way, I am aware of the single-object size limitation in .NET, so that's being addressed -- the data will be stored in multiple collections).


Solution

  • You're asking the wrong question. If stack size matters, you're doing something wrong.

    If you use many datapoints, you'll put them in a collection, such as an array. Arrays are always allocated on the heap, so the stack size doesn't even come into play. An array of structs embeds the individual structs and forms one contiguous memory block. (If you have more than 2 GB of data, you need several arrays, because of the single-object size limit you mentioned.)

    With reference types, by contrast, the array contains only the references, and each object is allocated individually on the heap. On 64-bit, each heap allocation carries about 16 bytes of object-header overhead, and the reference in the array accounts for another 8 bytes.
    You also get worse cache locality because of the extra indirection, and the GC has to do more work to crawl all those references.

    My conclusion: if you have many small datapoints, make them a struct and put them in an array, as in the sketch below.
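
    Here is a minimal sketch of that layout, assuming a hypothetical 16-byte DataPoint struct (the names and sizes are illustrative, not from your code): the points are stored as structs in several arrays ("chunks") so that no single array exceeds the 2 GB per-object limit.

        using System;
        using System.Collections.Generic;

        struct DataPoint            // value type: embedded directly in the array,
        {                           // no per-object header, no reference indirection
            public double X;
            public double Y;        // 16 bytes per point (assumed layout)
        }

        static class DataPointStore
        {
            // 64M points * 16 bytes = 1 GB per chunk, safely under the 2 GB limit
            const int ChunkSize = 64 * 1024 * 1024;

            public static List<DataPoint[]> Allocate(long totalPoints)
            {
                var chunks = new List<DataPoint[]>();
                for (long remaining = totalPoints; remaining > 0; remaining -= ChunkSize)
                {
                    int size = (int)Math.Min((long)ChunkSize, remaining);
                    chunks.Add(new DataPoint[size]);  // one contiguous block of structs
                }
                return chunks;
            }
        }

    Point i is then addressed as chunks[(int)(i / ChunkSize)][(int)(i % ChunkSize)]. At 16 bytes each, 500 million points occupy roughly 8 GB across 8 contiguous blocks, with no per-point allocation or GC overhead.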