I am currently working on a client-server application (for learning purposes) and I am stuck on how to properly set the DatagramSocket timeout and handle the resulting exception.
Server-side looks like this:
while (true) {
    try {
        serverSocket = new DatagramSocket(25000);
        running = acceptConnection(serverSocket, ready);
        serverSocket.setSoTimeout(5000);
        while (running) {
            receivePacket = new DatagramPacket(receiveData, receiveData.length);
            try {
                serverSocket.receive(receivePacket);
            } catch (SocketTimeoutException e) {
                System.out.println("Timed out...");
            }
            receiveMessage = new String(receivePacket.getData(), 0, receivePacket.getLength());
            ...
        }
        ...
    } catch (IOException e) {
        System.err.println("");
    } finally {
        serverSocket.close();
    }
    ...
}
What I want to do in the event of a timeout is return to the main while-loop so I can handle another client. The server is designed to handle only one client at a time.
I have tried adding running = false; in the catch block, but it did not work.
OP did not know how to break out of the while(running) loop back to the while(true) loop when the socket receive timed out. In the comments I told him to use the break keyword after catching the SocketTimeoutException, and he mentioned this solved his problem.
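A minimal self-contained sketch of that fix follows. It is not the OP's exact server (the class name, the ephemeral port via new DatagramSocket(0), and the short 100 ms timeout are assumptions chosen so the example runs quickly on its own): the key point is the break after catching SocketTimeoutException, which exits the inner receive loop so an outer loop could accept the next client.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.SocketTimeoutException;

public class TimeoutDemo {

    // Runs the inner receive loop for one client; returns true once the
    // loop has been exited (here, via break on timeout since nothing sends).
    static boolean serveOneClient() throws Exception {
        // Port 0 = ephemeral port, an assumption for this demo; the OP used 25000.
        try (DatagramSocket socket = new DatagramSocket(0)) {
            socket.setSoTimeout(100); // short timeout so the demo finishes fast
            byte[] buf = new byte[1024];
            while (true) {
                DatagramPacket packet = new DatagramPacket(buf, buf.length);
                try {
                    socket.receive(packet);
                } catch (SocketTimeoutException e) {
                    System.out.println("Timed out...");
                    break; // leave the inner loop; the outer loop can accept a new client
                }
                // Only reached when a packet actually arrived, so the data is valid.
                String msg = new String(packet.getData(), 0, packet.getLength());
                System.out.println("Received: " + msg);
            }
            return true;
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(serveOneClient());
    }
}
```

Note that without the break, execution would fall through to the new String(...) line and process a packet that was never received, which is why setting running = false alone felt like it "did not work": the rest of the loop body still ran once more.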