I'm not sure that a "Buffer" is what I'm looking for, so I'll show you my problem and you can decide whether that's the right term and suggest a solution. I'm currently porting the Java DataOutputStream class to C#. The last thing I have to do is fix this issue with sending segmented data.
Here's my WriteInt method in C# (ClientOutput is an instance of BinaryWriter):
public void WriteInt(int v)
{
    ClientOutput.Write(((uint)v >> 24) & 0xFF);
    ClientOutput.Write(((uint)v >> 16) & 0xFF);
    ClientOutput.Write(((uint)v >> 8) & 0xFF);
    ClientOutput.Write(((uint)v >> 0) & 0xFF);
    IncCount(4);
}
For anyone who wants to compare, here's the original in Java:
public final void writeInt(int v) throws IOException {
    out.write((v >>> 24) & 0xFF);
    out.write((v >>> 16) & 0xFF);
    out.write((v >>> 8) & 0xFF);
    out.write((v >>> 0) & 0xFF);
    incCount(4);
}
Normally in Java you have to call flush() before the data is actually sent to the server or client; however, BinaryWriter appears to flush automatically on every Write() call.
Here's the readInt() code in Java:
public final int readInt() throws IOException {
    int ch1 = in.read();
    int ch2 = in.read();
    int ch3 = in.read();
    int ch4 = in.read();
    if ((ch1 | ch2 | ch3 | ch4) < 0)
        throw new EOFException();
    return ((ch1 << 24) + (ch2 << 16) + (ch3 << 8) + (ch4 << 0));
}
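As a sanity check of the byte layout, here's a minimal, self-contained Java round trip (class and variable names are mine) showing that writeInt(1000) emits the big-endian bytes 00 00 03 E8, which readInt reassembles into 1000:

```java
import java.io.*;

public class IntRoundTrip {
    public static void main(String[] args) throws IOException {
        // writeInt sends the four bytes high-order first (big-endian).
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(buf);
        out.writeInt(1000); // 1000 == 0x000003E8

        byte[] bytes = buf.toByteArray();
        System.out.println(bytes.length);     // exactly 4 bytes on the wire
        System.out.println(bytes[2] & 0xFF);  // 0x03
        System.out.println(bytes[3] & 0xFF);  // 0xE8 == 232

        // readInt reverses the process and recovers the original value.
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(bytes));
        System.out.println(in.readInt());
    }
}
```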
When writeInt is executed from the Java side and readInt is called on the server, the value is displayed properly. With the C# client, however, the server processes the integer four separate times and displays a value based on each segment.
Example Input (C#):
Client.WriteInt(1000);
Example output (Java):
0
0
50331648
-402653184
When the output should be:
1000
Please bear with me as I just picked up C# a few days ago, and I may be asking a stupid question.
BinaryWriter has a bunch of Write() overloads, and since your masked expressions have type uint, you're calling the Write(UInt32) overload, which sends each masked value as a full 4-byte integer (16 bytes per WriteInt call instead of 4).
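To see where those exact numbers come from: each 4-byte Write puts the masked byte on the wire little-endian ("xx 00 00 00"), and readInt then interprets each 4-byte chunk big-endian. A quick sketch (the readIntBE helper is mine, mirroring the arithmetic inside readInt):

```java
public class WhyGarbage {
    // Combine four bytes high-order first, as DataInputStream.readInt does.
    static int readIntBE(int b1, int b2, int b3, int b4) {
        return (b1 << 24) + (b2 << 16) + (b3 << 8) + b4;
    }

    public static void main(String[] args) {
        // For v = 1000, the C# side wrote the values 0, 0, 3, 232,
        // each padded to four little-endian bytes: "xx 00 00 00".
        System.out.println(readIntBE(0x00, 0, 0, 0)); // first two chunks read as 0
        System.out.println(readIntBE(0x03, 0, 0, 0)); // 0x03000000 == 50331648
        System.out.println(readIntBE(0xE8, 0, 0, 0)); // 0xE8000000 overflows to -402653184
    }
}
```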
You should be able to fix the problem simply by casting each value to byte in the call to Write(), which selects the Write(Byte) overload:
public void WriteInt(int v)
{
    ClientOutput.Write((byte)(((uint)v >> 24) & 0xFF));
    ClientOutput.Write((byte)(((uint)v >> 16) & 0xFF));
    ClientOutput.Write((byte)(((uint)v >> 8) & 0xFF));
    ClientOutput.Write((byte)(((uint)v >> 0) & 0xFF));
    IncCount(4);
}
Note that, technically speaking, you don't need to cast v to uint before the shift: even though C#'s >> on int sign-extends (C# has no >>> operator), the & 0xFF mask discards the extended bits anyway. Keeping the cast doesn't hurt, though.
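The same arithmetic holds in Java, where >> sign-extends and >>> doesn't; the & 0xFF mask makes the two shifts indistinguishable. A tiny sketch (class name is mine):

```java
public class MaskDemo {
    public static void main(String[] args) {
        int v = -1; // all bits set, the worst case for sign extension
        // The arithmetic shift drags in sign bits, but the mask discards
        // them, so both shifts produce the same low byte.
        System.out.println((v >> 24) & 0xFF);  // 255
        System.out.println((v >>> 24) & 0xFF); // 255
    }
}
```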
If you later need to send multi-byte values directly, check out Jon Skeet's endian-aware version of BinaryWriter (BinaryWriter is little-endian, while DataOutputStream is big-endian). There's discussion and some links here: BinaryWriter Endian issue