I am trying to set up a load-balanced environment with 2 application server instances. I am unable to make Spring WebSocket relay messages to all instances. Let's take an example to describe my problem better:
Both instances are connected to the same database schema, so a job request can come in on either server instance but will get executed on Server 1.
Now, I have used the Spring WebSocket plugin for my Grails application, and I push messages to the browser using:
brokerMessagingTemplate.convertAndSend(user.notificationChannel, ((notification.toMap(user) as JSON)).toString())
It was working fine on the single-server setup. But on the multi-server setup, notifications are only received on Server 1, as that is the instance calling the code block; if I reverse the scenario, the opposite result is observed.
How can I push the same notification to all server instances, so that the user always gets the notification no matter which server instance he is on?
I initially thought of utilising a common queue like RabbitMQ, but that would add to the system requirements and would get disapproved by the client.
NOTE: Third-party service solutions won't work in my case, as the applications are on an intranet and don't have internet access.
WebSockets by default point to a single hostname/IP address. Whilst you could set up a DNS record/hostname that points to multiple different IPs/servers, that by itself would break the communication flow of the WebSockets if the handshake went to one server and the messages to another.
The simplest approach would be a DB table that is shared across both instances: as each instance comes up, it records its local IP and socket port in that table, and each instance can then read the table to work out, at any point, which hosts to transmit a socket message to. This table would need to be managed somehow: on a brand-new boot-up it would be empty and would populate as instances came up, and entries would likewise need removing when a host is shut down.
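Something along these lines could work as that shared table, sketched here as a Grails domain class plus registration in BootStrap.groovy. The names (InstanceRegistry, wsPort) and the hard-coded port are assumptions for illustration only, not part of any existing plugin:

    // InstanceRegistry.groovy - hypothetical domain class acting as the shared registry
    class InstanceRegistry {
        String host                  // local IP of the instance
        Integer wsPort               // port its websocket endpoint listens on
        Date lastSeen = new Date()   // could be refreshed on a timer to purge dead rows

        static constraints = {
            host unique: 'wsPort'
        }
    }

    // BootStrap.groovy - record this instance on startup, remove it on clean shutdown
    class BootStrap {
        def init = { servletContext ->
            String host = InetAddress.localHost.hostAddress
            InstanceRegistry.withTransaction {
                InstanceRegistry.findOrCreateWhere(host: host, wsPort: 8080).save(flush: true)
            }
        }
        def destroy = {
            String host = InetAddress.localHost.hostAddress
            InstanceRegistry.withTransaction {
                InstanceRegistry.findWhere(host: host, wsPort: 8080)?.delete(flush: true)
            }
        }
    }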
Each instance would then also run an internal WebSocket client. When a message is sent, that client is triggered: it looks up all alive WebSocket servers from the DB, connects to each one in turn, and sends the message on. Each server then gets the message and either broadcasts it to all connected users or, if it is from user X and meant for user X, relays it only to user X if that user is found on that server (much like the chat plugin does), and so on.
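A rough sketch of that fan-out, assuming Spring's StandardWebSocketClient is available via the websocket plugin and that every instance exposes an internal endpoint at /internalRelay; the service name, endpoint path, and "destination|payload" framing are all made up for illustration:

    // NotificationRelayService.groovy - hypothetical fan-out service
    import org.springframework.web.socket.TextMessage
    import org.springframework.web.socket.WebSocketSession
    import org.springframework.web.socket.client.standard.StandardWebSocketClient
    import org.springframework.web.socket.handler.TextWebSocketHandler

    class NotificationRelayService {

        def brokerMessagingTemplate

        void relayToAllInstances(String destination, String payload) {
            String self = InetAddress.localHost.hostAddress
            InstanceRegistry.list().each { instance ->
                if (instance.host == self) {
                    // local instance: publish straight to the local STOMP broker
                    brokerMessagingTemplate.convertAndSend(destination, payload)
                } else {
                    // remote instance: connect to its internal endpoint and hand the message over
                    def client = new StandardWebSocketClient()
                    String url = "ws://${instance.host}:${instance.wsPort}/internalRelay"
                    WebSocketSession session = client.doHandshake(new TextWebSocketHandler(), url).get()
                    // crude "destination|payload" framing, purely for illustration
                    session.sendMessage(new TextMessage("${destination}|${payload}"))
                    session.close()
                }
            }
        }
    }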
This keeps it all in line, with one technology controlling the entire process: a WebSocket server that has its own client, which relays messages on to the WebSocket servers of the other instances.
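On the receiving side, each instance would register a plain handler at that internal endpoint which simply re-publishes whatever arrives onto its own local broker, so its connected users get the notification. Again, the class name and framing are illustrative assumptions:

    // InternalRelayHandler.groovy - hypothetical handler behind ws://<host>:<port>/internalRelay
    import org.springframework.web.socket.TextMessage
    import org.springframework.web.socket.WebSocketSession
    import org.springframework.web.socket.handler.TextWebSocketHandler

    class InternalRelayHandler extends TextWebSocketHandler {

        def brokerMessagingTemplate

        @Override
        protected void handleTextMessage(WebSocketSession session, TextMessage message) {
            // split the "destination|payload" frame sent by the relay client
            def parts = message.payload.split(/\|/, 2)
            // re-publish on this instance's local broker so users connected here receive it
            brokerMessagingTemplate.convertAndSend(parts[0], parts[1])
        }
    }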