Tags: c#, .net, windows, memory, 64-bit

Memory limited to about 2.5 GB for a single .NET process


I am writing a .NET application running on Windows Server 2016 that does an HTTP GET on a bunch of pieces of a large file. This dramatically speeds up the download, since the pieces can be fetched in parallel. Unfortunately, once they are downloaded, it takes a fairly long time to piece them all back together.

There are between 2,000 and 4,000 files that need to be combined. The server this will run on has plenty of memory, close to 800 GB. I thought it would make sense to use MemoryStreams to store the downloaded pieces until they can be written sequentially to disk, but I am only able to consume about 2.5 GB of memory before I get a System.OutOfMemoryException. The server has hundreds of GB available, and I can't figure out how to use them.
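For reference, something along these lines is the kind of pattern that hits the wall. This is only an illustrative sketch, not the asker's actual code; the URLs and piece count are placeholders:

```csharp
// Illustrative sketch: download pieces in parallel, then concatenate them
// into a single MemoryStream. Because a MemoryStream is backed by one
// byte[], the combined stream runs into the ~2 GB array limit discussed
// in the answer below.
using System;
using System.IO;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;

class Downloader
{
    static async Task Main()
    {
        using var client = new HttpClient();
        string[] pieceUrls = Enumerable.Range(0, 3000)
            .Select(i => $"https://example.com/bigfile/part{i}")   // placeholder URLs
            .ToArray();

        // Each piece downloads fine on its own...
        byte[][] pieces = await Task.WhenAll(
            pieceUrls.Select(url => client.GetByteArrayAsync(url)));

        // ...but concatenating them all into one MemoryStream requires a
        // single backing array big enough for the whole file, which fails
        // once the total size approaches 2 GB.
        using var combined = new MemoryStream();
        foreach (var piece in pieces)
            combined.Write(piece, 0, piece.Length);

        File.WriteAllBytes("bigfile.bin", combined.ToArray());
    }
}
```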


Solution

  • MemoryStreams are built around byte arrays. Arrays cannot be larger than 2GB currently.

    The current implementation of System.Array uses Int32 for all its internal counters etc, so the theoretical maximum number of elements is Int32.MaxValue.

    There's also a 2GB max-size-per-object limit imposed by the Microsoft CLR.

    As you try to put all the content into a single MemoryStream, the underlying array gets too large, hence the exception.

    Store the pieces separately instead, and write them directly to the FileStream (or whatever you use) when ready, without first concatenating them all into one object; a sketch of this is shown below.
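A minimal sketch of that approach, assuming the pieces are fetched with HttpClient and that the order of the piece URLs matches their order in the file (the names and URLs are placeholders, not the asker's code):

```csharp
using System;
using System.IO;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;

class PieceCombiner
{
    static async Task Main()
    {
        using var client = new HttpClient();
        string[] pieceUrls = Enumerable.Range(0, 3000)
            .Select(i => $"https://example.com/bigfile/part{i}")   // placeholder URLs
            .ToArray();

        // Download in parallel, but keep every piece as its own byte[]:
        // no individual allocation comes anywhere near the 2 GB limit.
        byte[][] pieces = await Task.WhenAll(
            pieceUrls.Select(url => client.GetByteArrayAsync(url)));

        // Write the pieces to the output file sequentially, in order,
        // without ever building one giant in-memory object.
        using var output = new FileStream(
            "bigfile.bin", FileMode.Create, FileAccess.Write,
            FileShare.None, bufferSize: 1 << 20);
        foreach (var piece in pieces)
            await output.WriteAsync(piece, 0, piece.Length);
    }
}
```

This keeps the file in memory only as a collection of separate arrays, which a machine with hundreds of GB of RAM can hold comfortably, while the on-disk file is assembled in one sequential pass.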