I'm working on a small group conversation server in Java and am currently writing the network code, but I can't seem to get a timeout to apply to blocking I/O operations: chances are I've been bitten by some Java quirk (or I'm simply misreading the javadoc).
Here is the pertinent code from the ConversationServer class (with all security checks and logging stripped for simplicity):
    class ConversationServer {
        // ...
        public int setup() throws IOException {
            ServerSocketChannel server = ServerSocketChannel.open();
            server.bind(new InetSocketAddress(port), Settings.MAX_NUMBER_OF_PLAYERS + 1);
            // should make the blocking accept() give up after AWAIT_PLAYERS_MS milliseconds
            server.socket().setSoTimeout((int) Settings.AWAIT_PLAYERS_MS);
            int numberOfPlayers;
            for (numberOfPlayers = 0; numberOfPlayers < Settings.MAX_NUMBER_OF_PLAYERS; ++numberOfPlayers) {
                SocketChannel clientSocket;
                try {
                    clientSocket = server.accept();
                } catch (SocketTimeoutException timeout) {
                    break;
                }
                clients.add(messageStreamFactory.create(clientSocket));
            }
            return numberOfPlayers;
        }
        // ...
    }
The expected behaviour is to let at most Settings.MAX_NUMBER_OF_PLAYERS clients connect, or otherwise terminate setup after Settings.AWAIT_PLAYERS_MS milliseconds (currently 30000L).
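For reference, the relevant Settings constants are just static fields along these lines (the player count below is only a placeholder for illustration; the timeout is the 30000L mentioned above):

    final class Settings {
        static final int MAX_NUMBER_OF_PLAYERS = 4;  // placeholder value, not the real one
        static final long AWAIT_PLAYERS_MS = 30000L; // 30 seconds
    }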
What actually happens is that if I connect Settings.MAX_NUMBER_OF_PLAYERS clients, everything is fine (the loop exits via the for condition), but if I don't, the SocketTimeoutException I'd expect is never thrown and the server hangs forever.
If I understand correctly, server.socket().setSoTimeout((int) Settings.AWAIT_PLAYERS_MS) should be sufficient, but it doesn't produce the expected behaviour.
So, can anyone spot the error here?
It looks like the timeout works if you change

    server.socket().setSoTimeout((int) Settings.AWAIT_PLAYERS_MS);
    server.accept();

to

    server.socket().setSoTimeout((int) Settings.AWAIT_PLAYERS_MS);
    server.socket().accept();

i.e. if you call accept() on the same object on which you set the SO timeout. I don't know enough about NIO sockets to say exactly why this makes the difference; maybe somebody else can shed some light.
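To make that concrete, here is a sketch of how the setup() method from the question could look with the socket adaptor's accept(). It keeps the question's port, clients, and messageStreamFactory fields and the Settings constants; the extra assumption is that getChannel() on the accepted Socket returns the SocketChannel behind the connection, so the rest of the original code can stay unchanged. Treat it as an untested illustration rather than a verified fix:

    class ConversationServer {
        // ... same fields as in the question: port, clients, messageStreamFactory ...

        public int setup() throws IOException {
            ServerSocketChannel server = ServerSocketChannel.open();
            server.bind(new InetSocketAddress(port), Settings.MAX_NUMBER_OF_PLAYERS + 1);
            server.socket().setSoTimeout((int) Settings.AWAIT_PLAYERS_MS);

            int numberOfPlayers;
            for (numberOfPlayers = 0; numberOfPlayers < Settings.MAX_NUMBER_OF_PLAYERS; ++numberOfPlayers) {
                java.net.Socket client;
                try {
                    // accept() on the ServerSocket adaptor honours the SO timeout set above
                    client = server.socket().accept();
                } catch (SocketTimeoutException timeout) {
                    // no new connection within AWAIT_PLAYERS_MS: stop waiting
                    break;
                }
                // the accepted Socket is backed by a SocketChannel, so the factory call is unchanged
                SocketChannel clientSocket = client.getChannel();
                clients.add(messageStreamFactory.create(clientSocket));
            }
            return numberOfPlayers;
        }
    }

An alternative, and the more idiomatic NIO route, would be to switch the channel to non-blocking mode and drive the accepts through a Selector, using select(Settings.AWAIT_PLAYERS_MS) for the timed wait; the adaptor call above is simply the smaller change to the existing code.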