In a multithreaded .NET setting such as a web server, I generally try to avoid creating contiguous blocks of memory larger than about 85 KB, because those end up on the large object heap (LOH), which can lead to memory problems.
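The LOH behaviour I'm referring to can be demonstrated with a snippet like the following (a minimal sketch; GC.GetGeneration reports generation 2 for objects that go straight to the large object heap, and the exact 85,000-byte threshold is a CLR implementation detail):

using System;

class LohDemo
{
    static void Main()
    {
        var small = new byte[80000]; // below the threshold, allocated on the small object heap
        var large = new byte[90000]; // above the threshold, allocated on the LOH

        Console.WriteLine(GC.GetGeneration(small)); // typically 0 right after allocation
        Console.WriteLine(GC.GetGeneration(large)); // 2, because the LOH is collected with gen 2
    }
}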
One of my fellow developers is using a loop to write to Response.OutputStream, without calling Flush except once at the end of the loop.
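Roughly, the pattern looks like this (a simplified, hypothetical handler, not his actual code):

using System.Collections.Generic;
using System.Web;

static class ReportWriter
{
    // Hypothetical simplification: many small writes to Response.OutputStream,
    // with a single Flush only after the loop has finished.
    public static void WriteChunks(HttpResponse response, IEnumerable<byte[]> chunks)
    {
        foreach (byte[] chunk in chunks)
        {
            response.OutputStream.Write(chunk, 0, chunk.Length);
        }
        response.OutputStream.Flush(); // only flushed once, at the end
    }
}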
Am I correct in thinking that not calling Flush inside the loop could lead to memory issues? How could I prove this?
That depends on the implementation details of the stream being used. From the MSDN documentation on the Stream base class:
A stream is an abstraction of a sequence of bytes, such as a file, an input/output device, an inter-process communication pipe, or a TCP/IP socket. The Stream class and its derived classes provide a generic view of these different types of input and output, and isolate the programmer from the specific details of the operating system and the underlying devices.
If the implementer of that stream didn't leave any extra guidance on how to deal with multiple subsequent Write calls, we should assume it handles those flushing details for you.
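For example, two everyday Stream implementations treat Flush very differently, which is why the concrete implementation matters here (a small illustrative sketch, unrelated to the ASP.NET internals below):

using System.IO;

class FlushBehaviourDemo
{
    static void Main()
    {
        // MemoryStream.Flush is documented to perform no action at all.
        using (var ms = new MemoryStream())
        {
            ms.Write(new byte[] { 1, 2, 3 }, 0, 3);
            ms.Flush(); // no-op by design
        }

        // FileStream.Flush actually writes its internal buffer to the underlying file.
        using (var fs = new FileStream("example.tmp", FileMode.Create))
        {
            fs.Write(new byte[] { 1, 2, 3 }, 0, 3);
            fs.Flush(); // pushes buffered bytes to the OS
        }
    }
}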
I assume you find this answer a bit disappointing, so let's dig a bit deeper. As you say, your colleague is using Response.OutputStream, so let's see what the underlying implementation of that property is.
get
{
    if (!this.UsingHttpWriter)
    {
        throw new HttpException(SR.GetString("OutputStream_NotAvail"));
    }
    return this._httpWriter.OutputStream;
}
So it is using the Stream from something in _httpWriter. That field turns out to hold a reference to an instance of an HttpWriter. Its OutputStream property gets initialized in the constructor:
this._stream = new HttpResponseStream(this);
The class HttpResponseStream is internal, but we can use ILSpy to pry it open. Its Write method defers the implementation back to this HttpWriter method:
internal void WriteFromStream(byte[] data, int offset, int size)
{
    if (this._charBufferLength != this._charBufferFree)
    {
        this.FlushCharBuffer(true);
    }
    this.BufferData(data, offset, size, true);
    if (!this._responseBufferingOn)
    {
        this._response.Flush();
    }
}
As you can see, the byte[] data is handed off to BufferData, which further copies and stores the data with the help of HttpResponseUnmanagedBufferElement. That class copies the bytes with Marshal.Copy into a buffer of unmanaged memory. That memory appears to be allocated in blocks of around 16 KB for the integrated pipeline and around 31 KB otherwise.
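To make that concrete, the general technique looks roughly like this (a hedged sketch of copying managed bytes into fixed-size unmanaged blocks; the class name, field names, and block size are made up for illustration and are not the decompiled internals):

using System;
using System.Runtime.InteropServices;

// Sketch: managed bytes are copied into a fixed-size unmanaged block,
// so no large managed array ever needs to be allocated.
class UnmanagedBufferBlock : IDisposable
{
    private readonly IntPtr _block;
    private readonly int _size;
    private int _free;

    public UnmanagedBufferBlock(int size) // e.g. roughly 16 KB or 31 KB per block
    {
        _size = size;
        _free = size;
        _block = Marshal.AllocHGlobal(size);
    }

    // Copies as many bytes as fit into this block and returns how many were taken.
    public int Append(byte[] data, int offset, int count)
    {
        int toCopy = Math.Min(count, _free);
        Marshal.Copy(data, offset, _block + (_size - _free), toCopy);
        _free -= toCopy;
        return toCopy;
    }

    public void Dispose()
    {
        Marshal.FreeHGlobal(_block);
    }
}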
Given that, I don't expect the Stream to allocate so much managed memory that its internal structures end up on the LOH, because by the looks of it, it only performs managed-to-unmanaged memory copies.
That leaves us with your idea to call Flush regularly inside the loop. The HttpResponseStream has this implementation:
public override void Flush()
{
    this._writer.Flush();
}
where _writer is the HttpWriter we discovered earlier. Its implementation is:
public override void Flush()
{
}
That's right: calling Flush here only wastes CPU cycles. It will not help the stream clear its buffers any sooner, despite the promising documentation on MSDN.
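If the actual goal is to get data to the client incrementally, the decompiled WriteFromStream above points at the knobs that do matter: HttpResponse.Flush (or turning buffering off via HttpResponse.BufferOutput) is what moves buffered data along, not Stream.Flush. A hedged sketch of that alternative (the chunking shape is illustrative):

using System.Collections.Generic;
using System.Web;

static class StreamingSketch
{
    public static void WriteChunks(HttpResponse response, IEnumerable<byte[]> chunks)
    {
        foreach (byte[] chunk in chunks)
        {
            response.OutputStream.Write(chunk, 0, chunk.Length);
            response.Flush(); // HttpResponse.Flush, not Stream.Flush, actually pushes data
        }
    }
}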