I am using an SslStream to encrypt a TCP connection between a client and server. The problem is that when the client reads the data, it may be given a bunch of zero bytes instead of the real data. Here is an example that shows the issue:
// Server
using (NetworkStream tcpStream = client.GetStream())
{
    Stream stream = tcpStream;
    if (ssl)
    {
        SslStream sslStream = new SslStream(tcpStream, true);
        sslStream.AuthenticateAsServer(cert, false, SslProtocols.Default, false);
        stream = sslStream;
    }

    byte[] buf = new byte[] { 0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x02 };
    stream.Write(buf, 0, buf.Length);

    buf = new byte[] { 0x03, 0x03, 0x03, 0x03, 0x03, 0x03, 0x03, 0x03 };
    stream.Write(buf, 0, buf.Length);
}
// Client
using (NetworkStream tcpStream = client.GetStream())
{
    Stream stream = tcpStream;
    if (ssl)
    {
        SslStream sslStream = new SslStream(
            tcpStream,
            true,
            delegate { return true; }
        );
        sslStream.AuthenticateAsClient(
            "localhost",
            null,
            SslProtocols.Default,
            false
        );
        stream = sslStream;
    }

    byte[] buf = new byte[7];
    stream.Read(buf, 0, buf.Length);
    // buf is 01010101010101 as expected

    buf = new byte[9];
    stream.Read(buf, 0, buf.Length);
    // buf is 020000000000000000 instead of the expected 020303030303030303
    // a subsequent read of 8 bytes will get me 0303030303030303
    // if the ssl bool is set to false, then the expected data is received without the need for a third read
}
It appears as though the client needs to read from the stream in exactly the same-sized chunks the server wrote, but only when an SslStream is being used. This can't be right. What am I missing here?
This code

buf = new byte[9];
stream.Read(buf, 0, buf.Length);

asks stream to read between 1 and 9 bytes into buf. It does not always read exactly 9 bytes: the Read method returns the number of bytes actually read, and you are ignoring that return value. In your example the second Read returned only 1 byte (the 0x02), so the remaining 8 bytes of buf were simply left at their zero-initialized values. An SslStream makes this more visible because it decrypts one TLS record at a time (each server-side Write produced one record), but any Stream is allowed to return fewer bytes than requested, so you must loop until you have as many bytes as you need.
Try this:
byte[] buffer = new byte[9];
int offset = 0;
int count = buffer.Length;
do
{
    // Read may return anywhere from 1 to count bytes
    int bytesRead = stream.Read(buffer, offset, count);
    if (bytesRead == 0)
        break; // end of stream
    offset += bytesRead;
    count -= bytesRead;
}
while (count > 0);
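If you read from the stream in more than one place, the loop is worth factoring into a small helper. This is just a sketch (the name ReadFully is mine, not a framework API); note that .NET 7 and later ship a built-in Stream.ReadExactly that does the same job:

// Reads exactly count bytes into buffer, looping over short reads.
// Throws if the stream ends before count bytes have arrived.
static void ReadFully(Stream stream, byte[] buffer, int offset, int count)
{
    while (count > 0)
    {
        int bytesRead = stream.Read(buffer, offset, count);
        if (bytesRead == 0)
            throw new EndOfStreamException("Stream ended before the requested bytes were read.");
        offset += bytesRead;
        count -= bytesRead;
    }
}

With this in place the client side becomes e.g. ReadFully(stream, buf, 0, buf.Length); and the zeros disappear regardless of how the TLS records happen to be framed.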