
C# BinaryWriter start and end of string


I'm sending a large string, 0.443+0.064+-0.120+-0.886+0.15167+-0.26754+0.95153, over a TCP socket connection.

The message I receive is not the same as the string I send. It is cut off at seemingly random points, e.g. 43+0.064+-0.120+-0.886+0.15167+-0.26754+0

How can I make sure the full string is read?

This is the client code:

public static void SendMessage(string message)
{
    if (socketConnection == null)
    {
        return;
    }

    using (BinaryWriter writer = new BinaryWriter(socketConnection.GetStream(), Encoding.ASCII, true))
    {
        writer.Flush();
        writer.Write(message);
        writer.Flush();
    }
}

This is my server code:

private void ListenForIncommingRequests()
{
    tcpListener = new TcpListener(IPAddress.Parse("127.0.0.1"), 8080);
    tcpListener.Start();
    connectedTcpClient = tcpListener.AcceptTcpClient();

    using (BinaryReader reader = new BinaryReader(connectedTcpClient.GetStream()))
    {
        while (true)
        {
            string clientMessage = reader.ReadString();
        }
    }
}

Solution

  • As @NineBerry pointed out in the comments, you're writing ASCII-encoded bytes but reading with the BinaryReader's default encoding, which is UTF-8. Make sure to use the same encoding on both ends: either remove Encoding.ASCII when instantiating your BinaryWriter, so both sides fall back to the UTF-8 default, or, as I'd recommend, pass Encoding.Unicode when instantiating your BinaryWriter AND your BinaryReader.
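
For illustration, here is a minimal sketch of both ends with a matching encoding. It reuses the socketConnection and connectedTcpClient fields from the code above; everything is unchanged except the encoding arguments:

// Client side: pass Encoding.Unicode explicitly (leaveOpen: true keeps the stream usable).
using (BinaryWriter writer = new BinaryWriter(socketConnection.GetStream(), Encoding.Unicode, true))
{
    // Write(string) prefixes the payload with its encoded byte count.
    writer.Write(message);
    writer.Flush();
}

// Server side: the reader must use the SAME encoding as the writer.
using (BinaryReader reader = new BinaryReader(connectedTcpClient.GetStream(), Encoding.Unicode, true))
{
    // ReadString() reads the length prefix, then exactly that many bytes,
    // and decodes them with the encoding given here.
    string clientMessage = reader.ReadString();
}

Because Write(string) length-prefixes the message and ReadString() consumes exactly that many bytes before decoding, matching encodings on both ends ensure the full string comes back intact.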