I'm new to WebSocket and I implemented it on a web application whose server side is written in Java and client side in JavaScript. The server sends notifications to the client via WebSocket. I wonder what would happen if the client isn't fast enough to handle incoming messages as fast as the server is sending them. For example, the server might send about 200 text messages per second while a slow client handles only 100 messages per second. I believe the browser queues incoming messages before they are processed, but I'm not sure. I'd also like to know how to check this buffer's size and its limit, and what happens if the limit is reached. Any idea how I can simulate this kind of situation? I tried:
webSocket.onmessage = function (message) {
    // busy-wait for one second to simulate a slow handler
    var bool = true;
    var datenexexec = Date.now() + 1000;
    while (bool) {
        if (Date.now() > datenexexec) {
            bool = false;
        }
    }
};
but this only causes the browser to hang and later crash. Thanks for any help.
If the server sends data more rapidly than the client can read it, here's what will eventually happen.
TCP is a reliable protocol, so it will buffer data and transmit it later when the receiver is slow. It shouldn't lose packets by itself (unless the connection drops), but once the send buffers fill up, the send call will report an error because no more data can be queued.
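As far as I know, browsers don't expose the size of the *incoming* buffer at all, but the standard WebSocket API does let you watch the *outgoing* one: `bufferedAmount` reports the bytes queued by `send()` but not yet handed to the network. A minimal sketch of a guarded send (the 64 KB threshold in the usage line is an arbitrary example value, not a standard limit):

```javascript
// Only send if the socket's outgoing buffer is below `limit` bytes.
// `bufferedAmount` is part of the standard WebSocket API.
function trySend(socket, data, limit) {
    if (socket.bufferedAmount > limit) {
        return false; // back off: the peer isn't keeping up
    }
    socket.send(data);
    return true;
}

// Usage (hypothetical socket and payload):
// trySend(webSocket, JSON.stringify(payload), 64 * 1024);
```

This only helps on the sending side; a slow *reader* has no equivalent property to poll.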
As for the client-side code you tried: you can't busy-wait in JavaScript for very long. That blocks the event loop and eventually brings down the script engine, which is exactly the hang and crash you saw.
The only way for you to simulate this is to actually send more packets than the client can process. Code a "slow" client that takes, say, 250 ms to process each packet, and a "fast" server that sends a flood of packets, and you should be able to reproduce the situation.
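One way to sketch that "slow" client without freezing the page is to queue incoming messages and drain them with `setTimeout`, yielding to the event loop between messages instead of busy-waiting. Here `handleMessage` and the 250 ms delay are placeholders for your real per-message work:

```javascript
// Queue incoming messages and process one every 250 ms, yielding to the
// event loop between messages instead of busy-waiting.
const queue = [];
const processed = [];   // stand-in for real per-message work
let draining = false;

function handleMessage(data) {
    processed.push(data); // replace with the real handler
}

function processNext() {
    if (queue.length === 0) {
        draining = false;
        return;
    }
    draining = true;
    handleMessage(queue.shift());
    setTimeout(processNext, 250); // simulated slow client
}

// Hook it up (webSocket is assumed to exist already):
// webSocket.onmessage = function (event) {
//     queue.push(event.data);
//     if (!draining) processNext();
// };
```

With this in place, the browser stays responsive while `queue.length` grows, which lets you observe the backlog directly instead of crashing the tab.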