What's going on? I do this on the server:
var msg = Server.Api.CreateMessage();
msg.Write(2);
msg.Write(FreshChunks.Count());
Server.Api.SendMessage(msg, peer.Connection, NetDeliveryMethod.ReliableUnordered);
then on the client it successfully reads the byte = 2, and the switch then routes to a function which reads an Int32 (FreshChunks.Count), which was equal to 4, but when received it equals 67108864. I've tried Int16 through Int64 and UInt16 through UInt64; none of them read the correct value.
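The client read path looks roughly like this (a minimal sketch; incoming stands for the received NetIncomingMessage and the handler name is made up):

byte messageType = incoming.ReadByte();   // successfully reads 2
switch (messageType)
{
    case 2:
        // Expected FreshChunks.Count() == 4, but this returns 67108864.
        int count = incoming.ReadInt32();
        HandleFreshChunks(count);         // hypothetical handler
        break;
}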
Given msg.Write(2), the compiler reads the literal 2 as an int (Int32), so the Write(Int32) overload is chosen and four bytes are written, not one.
What actually happened is a size mismatch between the writes and the reads. The server wrote eight bytes: 02 00 00 00 (the Int32 2) followed by 04 00 00 00 (the Int32 count). The client's ReadByte consumed only the first of those bytes, the 0x02 your switch matched on, leaving the stream three bytes out of alignment. The subsequent ReadInt32 then picked up the three leftover zero bytes of the 2 plus the first byte of the count: 00 00 00 04, which as a little-endian Int32 is 0x04000000 = 67108864. Note that the first byte of that hex value is exactly the 4 you passed. Switching the read to Int16/Int64 or the unsigned variants cannot help, because the stream is misaligned, not mistyped.
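You can reproduce the arithmetic without any networking. This sketch (assuming a little-endian machine, which is what BitConverter reflects on x86/x64) lays out the eight bytes the server actually sent and reads them at both offsets:

using System;

class Demo
{
    static void Main()
    {
        // The eight bytes on the wire: Int32 2 followed by Int32 4, little-endian.
        byte[] stream = { 0x02, 0x00, 0x00, 0x00, 0x04, 0x00, 0x00, 0x00 };

        byte id = stream[0];                           // 2, what the switch saw
        int misread = BitConverter.ToInt32(stream, 1); // 67108864 (0x04000000), off by three bytes
        int correct = BitConverter.ToInt32(stream, 4); // 4, the value that was written

        Console.WriteLine($"{id} {misread} {correct}"); // prints: 2 67108864 4
    }
}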
To solve this, make sure every Write on the server is matched on the client by a Read of the same type and in the same order, and read the documentation of the Write and Read overloads to see exactly how many bytes each one uses.
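Concretely, either write the message id as a byte on the server or read a full Int32 on the client. A minimal sketch of the first option, using the same Lidgren-style calls as your snippet (incoming stands for the client's NetIncomingMessage):

// Server: send the id as a single byte so sizes match the client's reads.
var msg = Server.Api.CreateMessage();
msg.Write((byte)2);               // 1 byte: message id
msg.Write(FreshChunks.Count());   // 4 bytes: Int32 chunk count
Server.Api.SendMessage(msg, peer.Connection, NetDeliveryMethod.ReliableUnordered);

// Client: read back in the same order and with the same sizes.
byte id = incoming.ReadByte();    // 1 byte
int count = incoming.ReadInt32(); // 4 bytes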