
Why does TCP endpoint disconnection throw a bunch of 0 bytes on the other endpoint?


I was playing around with Sockets and TCP in C# and this happened. When one endpoint of the connection disconnects, the other endpoint, when calling the Receive method, gets a bunch of zero bytes. Why does that happen?

This is the code (run with dotnet script) that reproduces what I mentioned:

using System;
using System.Net;
using System.Net.Sockets;
                    
Socket s = new Socket(SocketType.Stream, ProtocolType.Tcp);
s.Bind(new IPEndPoint(IPAddress.Any, 1000));
s.Listen(1);
Console.WriteLine("Opened.");

Socket c = new Socket(SocketType.Stream, ProtocolType.Tcp);
c.Connect(new IPEndPoint(IPAddress.Loopback, 1000));

Socket i = s.Accept();
Console.WriteLine("Accepted.");

byte[] buffer = Array.Empty<byte>();

i.Shutdown(SocketShutdown.Both);

while (true)
{
    buffer = new byte[4];
    c.Receive(buffer, 4, SocketFlags.None);
    Console.WriteLine(BitConverter.ToString(buffer).Replace("-"," "));
}

When the problem happens the console will print a lot of 00 00 00 00 sequences. It happens with i and c swapped too.


Solution

  • ... the other when using Receive method gets a bunch of zero bytes

    That's a misinterpretation caused by a bug in the code. What is printed out is the content of the original zero-initialized buffer. The buffer was left unchanged by c.Receive because the call did not receive anything.

    What should have been done is to check the return value of Receive, which is the number of bytes actually read. It would have shown that zero bytes were returned. Instead it was simply, but wrongly, assumed that 4 bytes were read because 4 bytes were asked for.

    In general, Receive can always return fewer bytes than asked for, not only at connection close. Thus only as many bytes of the buffer as Receive explicitly reported received should be treated as data read from the peer. A return value of 0 on a stream socket specifically signals that the peer has performed an orderly shutdown.
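    A corrected version of the question's script might look like the sketch below (same setup as in the question, assuming port 1000 is free): the loop checks the return value of Receive, prints only the bytes actually received, and exits when the peer's shutdown is signaled by a return value of 0.

    ```csharp
    using System;
    using System.Net;
    using System.Net.Sockets;

    // Same setup as in the question (run with dotnet script).
    Socket s = new Socket(SocketType.Stream, ProtocolType.Tcp);
    s.Bind(new IPEndPoint(IPAddress.Any, 1000));
    s.Listen(1);

    Socket c = new Socket(SocketType.Stream, ProtocolType.Tcp);
    c.Connect(new IPEndPoint(IPAddress.Loopback, 1000));

    Socket i = s.Accept();
    i.Shutdown(SocketShutdown.Both);

    while (true)
    {
        byte[] buffer = new byte[4];
        // Receive returns the number of bytes actually read into the buffer.
        int received = c.Receive(buffer, 4, SocketFlags.None);
        if (received == 0)
        {
            // Orderly shutdown by the peer: stop reading instead of looping forever.
            Console.WriteLine("Peer closed the connection.");
            break;
        }
        // Only the first `received` bytes of the buffer hold data from the peer.
        Console.WriteLine(BitConverter.ToString(buffer, 0, received).Replace("-", " "));
    }
    ```

    With the shutdown happening before any data is sent, this loop exits immediately on the first Receive call instead of printing endless 00 00 00 00 lines.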