I have a TcpClient connected to a server that sends a message back to the client.
When reading this data using the NetworkStream.Read method I can specify the number of bytes I want to read with the count parameter, which decreases TcpClient.Available by count after the read is finished. From the docs:
count (Int32)
The maximum number of bytes to be read from the current stream.
For example:
public static void ReadResponse()
{
    if (client.Available > 0) // Assume client.Available is 500 here
    {
        byte[] buffer = new byte[12]; // I only want to read the first 12 bytes, this could be a header or something
        var read = 0;
        NetworkStream stream = client.GetStream();
        while (read < buffer.Length)
        {
            read = stream.Read(buffer, 0, buffer.Length);
        }
        // breakpoint
    }
}
This reads the first 12 bytes of the 500 available on the TcpClient into buffer, and inspecting client.Available at the breakpoint yields the expected result of 488 (500 - 12).
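For reference, this plain-NetworkStream behaviour can be checked with a self-contained loopback sketch (assuming loopback sockets are usable in your environment; port 0 just picks a free port):

```csharp
using System;
using System.Net;
using System.Net.Sockets;
using System.Threading;

// Loopback pair: a TcpListener plays the server, a TcpClient plays the client.
var listener = new TcpListener(IPAddress.Loopback, 0);
listener.Start();
int port = ((IPEndPoint)listener.LocalEndpoint).Port;

using var client = new TcpClient();
client.Connect(IPAddress.Loopback, port);
using TcpClient server = listener.AcceptTcpClient();

// The server sends 500 bytes.
server.GetStream().Write(new byte[500], 0, 500);

// Wait until all 500 bytes have arrived on the client side.
while (client.Available < 500) Thread.Sleep(10);

// Read 12 bytes; Available should drop by exactly 12.
var buffer = new byte[12];
int read = client.GetStream().Read(buffer, 0, buffer.Length);

Console.WriteLine($"read={read}, Available={client.Available}"); // read=12, Available=488
```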
Now when I try to do the exact same thing using an SslStream, the results are rather unexpected to me.
public static void ReadResponse()
{
    if (client.Available > 0) // Assume client.Available is 500 here
    {
        byte[] buffer = new byte[12]; // I only want to read the first 12 bytes, this could be a header or something
        var read = 0;
        SslStream stream = new SslStream(client.GetStream(), false, new RemoteCertificateValidationCallback(ValidateServerCertificate), null);
        while (read < buffer.Length)
        {
            read = stream.Read(buffer, 0, buffer.Length);
        }
        // breakpoint
    }
}
This code reads the first 12 bytes into buffer, as expected. However, inspecting client.Available at the breakpoint now yields a result of 0.
Like NetworkStream.Read, the documentation for SslStream.Read states that count indicates the maximum number of bytes to read:
count (Int32)
An Int32 that contains the maximum number of bytes to read from this stream.
While it does read only those 12 bytes and nothing more, I am wondering where the remaining 488 bytes go.
In the docs for either SslStream or TcpClient I couldn't find anything indicating that SslStream.Read flushes the stream or otherwise empties client.Available. What is the reason for this behaviour (and where is it documented)?
There is this question asking for an equivalent of TcpClient.Available, which is not what I'm asking for. I want to know why this happens, which isn't covered there.
Remember that the SslStream might be reading large chunks from the underlying NetworkStream at once and buffering them internally, either for efficiency or because the decryption process doesn't work byte-by-byte and needs a block of data to be available. So the fact that your TcpClient reports 0 available bytes means nothing: those bytes are probably sitting in a buffer inside the SslStream.
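The effect can be reproduced without TLS at all. Here is a hypothetical buffering reader (the names and the 4096-byte buffer size are made up for illustration, not SslStream's actual internals) that greedily drains a large block from its inner stream and then serves callers from that buffer:

```csharp
using System;
using System.IO;

// The inner stream stands in for the raw socket data: 500 recognisable bytes.
var data = new byte[500];
for (int i = 0; i < data.Length; i++) data[i] = (byte)i;
var inner = new MemoryStream(data);

// Hypothetical stand-in for SslStream's internal behaviour: on the first read
// it drains a large block from the inner stream into a private buffer, then
// hands out bytes from that buffer on subsequent calls.
byte[] internalBuffer = new byte[4096];
int buffered = 0, pos = 0;

int BufferedRead(byte[] dest, int offset, int count)
{
    if (pos == buffered)
    {
        // Pull as much as possible from the inner stream in one go.
        buffered = inner.Read(internalBuffer, 0, internalBuffer.Length);
        pos = 0;
    }
    int n = Math.Min(count, buffered - pos);
    Array.Copy(internalBuffer, pos, dest, offset, n);
    pos += n;
    return n;
}

var header = new byte[12];
BufferedRead(header, 0, header.Length);

// The caller got 12 bytes, but the inner stream has been fully drained,
// just like TcpClient.Available dropping to 0 under SslStream.
Console.WriteLine(inner.Length - inner.Position); // 0
```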
In addition, your code to read 12 bytes is incorrect, which might be affecting what you're seeing.
Remember that Stream.Read can return fewer bytes than you requested. Each call returns the number of bytes read during that call, not the running total.
So you need something like this:
int read = 0;
while (read < buffer.Length)
{
    int readThisTime = stream.Read(buffer, read, buffer.Length - read);
    if (readThisTime == 0)
    {
        // The end of the stream has been reached: throw an error?
    }
    read += readThisTime;
}
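To see why the offset and remaining-count bookkeeping matters, the loop above can be exercised against a stream that deliberately returns short reads (the ShortRead helper and its 5-byte limit are made up to simulate a socket delivering data in small pieces):

```csharp
using System;
using System.IO;

// Data source standing in for the socket: 500 bytes with recognisable values.
var data = new byte[500];
for (int i = 0; i < data.Length; i++) data[i] = (byte)i;
var source = new MemoryStream(data);

// Simulates a socket that delivers at most 5 bytes per Read call.
int ShortRead(byte[] buf, int off, int cnt) => source.Read(buf, off, Math.Min(cnt, 5));

// The loop from the answer, wrapped as a helper: keep reading until the
// buffer is full, advancing the offset by the running total each time.
void ReadExactly(byte[] buffer)
{
    int read = 0;
    while (read < buffer.Length)
    {
        int readThisTime = ShortRead(buffer, read, buffer.Length - read);
        if (readThisTime == 0)
            throw new EndOfStreamException("Stream ended before the buffer was filled.");
        read += readThisTime;
    }
}

var header = new byte[12];
ReadExactly(header);

Console.WriteLine(header[11]); // 11: all 12 bytes arrived despite 5-byte reads
```

Passing read as the offset and buffer.Length - read as the count is what prevents each short read from overwriting the bytes already received.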