I am attempting to connect my laptop to my standalone PC using the C# TcpClient class.
The laptop runs a simple console application and plays the role of the server.
The PC runs a Unity application (2018.1.6f1 with .NET 4.x Mono).
The code for sending is:
public void SendData() {
    Debug.Log("Sending data");
    NetworkStream ns = client.GetStream();
    BinaryFormatter bf = new BinaryFormatter();
    TCPData data = new TCPData(true);
    using (MemoryStream ms = new MemoryStream()) {
        bf.Serialize(ms, data);
        byte[] bytes = ms.ToArray();
        ns.Write(bytes, 0, bytes.Length);
    }
}
The same code is used in the laptop's project, except that Debug.Log() is replaced by Console.WriteLine().
For data reception I use:
public TCPData ReceiveData() {
    Debug.Log("Waiting for Data");
    using (MemoryStream ms = new MemoryStream()) {
        byte[] buffer = new byte[2048];
        int i = stream.Read(buffer, 0, buffer.Length);
        stream.Flush();
        ms.Write(buffer, 0, buffer.Length);
        ms.Seek(0, SeekOrigin.Begin);
        BinaryFormatter bf = new BinaryFormatter();
        bf.Binder = new CustomBinder();
        TCPData receivedData = (TCPData)bf.Deserialize(ms);
        Debug.Log("Got the data");
        foreach (string s in receivedData.stuff) {
            Debug.Log(s);
        }
        return receivedData;
    }
}
Again, the same code is used on both sides.
The data I am trying to transfer looks like this:
[Serializable, StructLayout(LayoutKind.Sequential)]
public struct TCPData {
    public TCPData(bool predefined) {
        stuff = new string[2] { "Hello", "World" };
        ints = new List<int>() { 0, 1, 2, 3, 4, 5, 6, 7, 8, 9 };
    }

    public string[] stuff;
    public List<int> ints;
}
The custom binder is from here; without it I get an assembly error. With it, I get:
Binary stream '0' does not contain a valid BinaryHeader. Possible causes are invalid stream or object version change between serialization and deserialization.
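For context, my CustomBinder is essentially the usual cross-assembly workaround: it ignores the sender's assembly name and resolves the type in the local assembly instead. A sketch of the idea (not the exact code from the link):

using System;
using System.Reflection;
using System.Runtime.Serialization;

public sealed class CustomBinder : SerializationBinder {
    public override Type BindToType(string assemblyName, string typeName) {
        // Ignore the assembly the sender serialized with and look the
        // type up in the currently executing assembly instead.
        string localAssembly = Assembly.GetExecutingAssembly().FullName;
        return Type.GetType(string.Format("{0}, {1}", typeName, localAssembly));
    }
}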
Now the problem:
Sending this from the PC to the laptop: 100% success rate.
Sending this from the laptop to the PC: 20% success rate
(the other 80% ends in the exception above).
How is it even possible that it "sometimes" works?
Shouldn't it be 100% or 0%?
How do I get it to work?
Thanks
E: Ok, thanks to all the suggestions I managed to increase the chances of success, but it still occasionally fails.
I now send a data-size "packet" first, which is received correctly about 80% of the time, but in some cases the number I read from the byte[] is 3096224743817216 (insanely big) compared to the ~500 that was sent.
I am using the Int64 data type.
E2: In E1 I was sending the data-length packet separately; now I have the length and the data merged, which does interpret the length properly, but now I am unable to deserialize the data... every time I get The input stream is not a valid binary format. The starting contents (in bytes) are: 00-00-00-00-00-00-04-07-54-43-50-44-61-74-61-02-00 ...
I read the first 8 bytes from the stream and treat the remaining x bytes as the data; deserializing it on the server works, but deserializing the same data on the client throws.
E3: Fixed it by rewriting the stream-handling code; I had made a mistake somewhere in there ;)
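For anyone hitting the same thing, the reworked receive path now looks roughly like this: read the 8-byte length prefix, then keep calling Read() until the whole payload has arrived (a sketch, not my exact code; stream is the same NetworkStream field as above):

// Read exactly count bytes, looping because NetworkStream.Read()
// may return fewer bytes than requested.
private byte[] ReadExactly(int count) {
    byte[] buffer = new byte[count];
    int offset = 0;
    while (offset < count) {
        int read = stream.Read(buffer, offset, count - offset);
        if (read == 0)
            throw new EndOfStreamException("Connection closed mid-message");
        offset += read;
    }
    return buffer;
}

public TCPData ReceiveData() {
    // 8-byte length prefix, then the serialized payload.
    long length = BitConverter.ToInt64(ReadExactly(8), 0);
    byte[] payload = ReadExactly((int)length);
    using (MemoryStream ms = new MemoryStream(payload)) {
        BinaryFormatter bf = new BinaryFormatter();
        bf.Binder = new CustomBinder();
        return (TCPData)bf.Deserialize(ms);
    }
}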
NetworkStream.Read() doesn't block until it reads the requested number of bytes:
"This method reads data into the buffer parameter and returns the number of bytes successfully read. If no data is available for reading, the Read method returns 0. The Read operation reads as much data as is available, up to the number of bytes specified by the size parameter. If the remote host shuts down the connection, and all available data has been received, the Read method completes immediately and return zero bytes."
You must:
1) know how many bytes you are expecting, and
2) loop on Read() until you have received the expected bytes.
If you use a higher-level protocol like HTTP or WebSockets, it will handle this "message framing" for you. If you code against TCP/IP directly, then that's your responsibility.
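For example, the sender can prefix each message with its length, so the receiver knows how many bytes to loop for. A minimal sketch under the same assumptions as the question's code (BinaryFormatter, a TCPData payload):

public void SendData(NetworkStream ns, TCPData data) {
    BinaryFormatter bf = new BinaryFormatter();
    using (MemoryStream ms = new MemoryStream()) {
        bf.Serialize(ms, data);
        byte[] payload = ms.ToArray();
        // Length prefix: the receiver reads these 8 bytes first, then
        // loops on Read() until the whole payload has arrived.
        // Note: BitConverter uses the machine's endianness, so both
        // sides must agree (little-endian on typical PCs).
        byte[] prefix = BitConverter.GetBytes((long)payload.Length);
        ns.Write(prefix, 0, prefix.Length);
        ns.Write(payload, 0, payload.Length);
    }
}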