I am using a Node server to send a simple 16 bit Int to my client in Unity. The Unity client has a C# client script to connect to the node server and read the value.
The code I am using passes a String, so I am converting the Int to a String and then back again on the client. I would rather just pass the value as a UInt16 ... firstly, is this more efficient? And how do I pass it and convert it on the client?
Here is the node code:
server.on('connection', function(socket) {
    console.log('A new connection has been established.');

    // Now that a TCP connection has been established, the server can send data to
    // the client by writing to its socket.
    if (sensor != null) {
        sensor.on("change", value => {
            socket.write(value.toString());
        });
    }

    // The server can also receive data from the client by reading from its socket.
    socket.on('data', function(chunk) {
        console.log(`Data received from client: ${chunk.toString()}`);
    });

    // When the client requests to end the TCP connection with the server, the server
    // ends the connection.
    socket.on('end', function() {
        console.log('Closing connection with the client');
    });

    // Don't forget to catch error, for your own sake.
    socket.on('error', function(err) {
        console.log(`Error: ${err}`);
    });
});
and this is the C# code in Unity:
socketConnection = new TcpClient("localhost", 8080);
Byte[] bytes = new Byte[1024];
while (true)
{
    using (NetworkStream stream = socketConnection.GetStream())
    {
        int length;
        // Read incoming stream into byte array
        while ((length = stream.Read(bytes, 0, bytes.Length)) != 0)
        {
            var incomingData = new byte[length];
            //Debug.Log(incomingData.GetValue(0));
            //int value = Convert.ToUInt16(incomingData);
            //Debug.Log(value);
            Array.Copy(bytes, 0, incomingData, 0, length);
            // Convert byte array to string message.
            string serverMessage = Encoding.ASCII.GetString(incomingData);
            int value = Int16.Parse(serverMessage);
            Debug.Log(value);
        }
    }
}
I need to do some refactoring but, generally, this works to pass a String. Passing the `UInt16` has not worked. Any help greatly appreciated.
> I would rather just pass the value as a UInt16 ... firstly, is this more efficient?
It is - but there are many advantages to using text-based protocols instead of binary protocols: above all, text-based protocols are MUCH easier to debug and inspect than binary ones. HTTP is text-based, for example (HTTP/2 is a binary protocol, but only because Google wanted to hyper-optimize it, which makes sense at their scale; for the vast majority of applications a text-based protocol makes a lot more sense).
> and how do I pass it and convert it on the client?
With difficulty:
It's harder to work with specific numeric types in JavaScript because JavaScript has only the `number` type, which isn't even an integer type. JavaScript [does have typed buffers][1], but that's an advanced topic.
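That said, Node's built-in Buffer type (a subclass of Uint8Array) already has helpers for writing fixed-width integers in a chosen byte order, so the server-side packing can be very short. A minimal sketch, where sensorValue is just an illustrative name for the value you want to send:

const buf = Buffer.alloc(2);        // 2 bytes for one UInt16
buf.writeUInt16BE(sensorValue, 0);  // big-endian ("network order"); throws if the value doesn't fit in 16 bits
socket.write(buf);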
When dealing with multi-byte binary integers you need to deal with the "network ordering" of multi-byte values, also known as endianness. Most C#/.NET environments are little-endian, but the convention is to always use big-endian format on the wire.
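On the C# side you can swap the bytes yourself (as the code further down does), or, assuming System.Net is available in your Unity project, use the old BitConverter + IPAddress.NetworkToHostOrder pattern. A rough sketch, where buffer and offset stand for the byte array and position you are reading from:

// needs: using System; using System.Net;
// BitConverter reads in the machine's byte order; NetworkToHostOrder swaps the bytes
// on little-endian machines and is a no-op on big-endian ones, so the result is the
// big-endian value that was put on the wire.
short raw = BitConverter.ToInt16(buffer, offset);
UInt16 value = (UInt16)IPAddress.NetworkToHostOrder(raw);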
TCP only sends packets when the system feels it's optimal (because there's a lot of overhead for each TCP packet), so by default individual 2-byte writes may not actually be sent until the system has buffered up a lot of them (Nagle's algorithm) - otherwise you have to force-flush the network stack. Note that a text-based protocol has the same problem, but at least with a text-based protocol there's less conceptual waste.
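If latency matters more than throughput for your sensor readings, you can turn that buffering off per connection: Node exposes socket.setNoDelay(), and on the C# side TcpClient has a matching NoDelay property. For example, on the server:

server.on('connection', function(socket) {
    socket.setNoDelay(true); // push small writes out immediately instead of coalescing them
    // ...
});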
For sending single 16-bit integer values - you really are better-off sticking with a text protocol.
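One caveat with the current string version: TCP is a byte stream, not a message stream, so two quick writes like "17" and "42" can arrive at the client as a single "1742" chunk. A cheap, still text-based fix is to terminate each value with a newline and split on it in the client; a sketch of the server side:

sensor.on("change", value => {
    socket.write(value.toString() + "\n"); // the newline marks the end of one reading
});

On the Unity side you would then split serverMessage on '\n' and Int16.Parse each piece.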
...but if you insist:
server.on('connection', function(socket) {
    console.log('A new connection has been established.');

    if (sensor != null) {
        sensor.on("change", value => {
            if( typeof value !== 'number' ) {
                throw new Error( "value must be a number." );
            }
            if( value < 0 || value > 65535 || Math.floor( value ) !== value ) {
                throw new Error( "value must be a 16-bit unsigned integer." );
            }

            const buffer = new Uint8Array( 2 );
            const hi = ( value >> 8 ) & 0xFF;
            const lo = ( value      ) & 0xFF;
            buffer.set( [ hi, lo ] ); // Big-Endian Order.

            const didFlush = socket.write( buffer );
            if( !didFlush ) console.log('Data did not send immediately.');
        });
    }

    // [...]
});
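When socket.write returns false the data is not lost - it's queued - and Node emits a 'drain' event once the internal buffer has emptied, which is the usual signal that it's safe to keep writing:

const didFlush = socket.write( buffer );
if( !didFlush ) {
    socket.once('drain', () => console.log('Backlog flushed, resuming writes.'));
}

And the matching read loop on the Unity/C# side: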
TcpClient tc = new TcpClient("localhost", 8080); // A TcpClient is not a Socket. It encapsulates a Socket.
Byte[] buffer = new Byte[4096]; // Use 4K as a network buffer size.

while (true)
{
    using( NetworkStream stream = tc.GetStream() )
    {
        Int32 read;
        while( ( read = stream.Read( buffer, 0, buffer.Length ) ) != 0 )
        {
            if( read % 2 != 0 )
            {
                // TODO: Handle the case where a 16-bit value is half-sent.
                continue;
            }

            // It is highly unlikely that each packet will contain only 2 bytes; it's more
            // likely a sequence of 16-bit integers will be sent all-at-once, so we read
            // each of them in 2-byte steps from the buffer:
            for( Int32 idx = 0; idx < read; idx += 2 )
            {
                Byte hi = buffer[ idx + 0 ];
                Byte lo = buffer[ idx + 1 ];

                UInt16 value = (UInt16)( ( hi << 8 ) | lo ); // The cast is required: the | expression is an Int32.
                Debug.Log( value );
            }

        } // while
    } // using
}
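For completeness, here's one way to handle the TODO above (a 16-bit value split across two reads): keep the odd trailing byte and combine it with the first byte of the next read. This is only a sketch - ReadUInt16Values and the onValue callback are illustrative names, not part of the code above:

using System;
using System.IO;

static class UInt16StreamReader
{
    // Reads big-endian UInt16 values from the stream and passes each one to onValue.
    public static void ReadUInt16Values(Stream stream, Action<UInt16> onValue)
    {
        Byte[] buffer = new Byte[4096];
        Boolean hasLeftover = false;
        Byte leftover = 0;

        Int32 read;
        while( ( read = stream.Read( buffer, 0, buffer.Length ) ) != 0 )
        {
            Int32 idx = 0;

            if( hasLeftover ) // Finish the value that was half-sent in the previous read.
            {
                onValue( (UInt16)( ( leftover << 8 ) | buffer[ 0 ] ) );
                hasLeftover = false;
                idx = 1;
            }

            for( ; idx + 1 < read; idx += 2 )
            {
                onValue( (UInt16)( ( buffer[ idx ] << 8 ) | buffer[ idx + 1 ] ) );
            }

            if( idx < read ) // An odd byte is left over: remember it for the next read.
            {
                leftover = buffer[ idx ];
                hasLeftover = true;
            }
        }
    }
}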
[1]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/TypedArray