Tags: java, sockets, datagram, socket-timeout-exception

setSoTimeout on a DatagramSocket


The server acts like an echo server. The client sends 10 packets to the server (with a 1-second gap between them).

The client then receives the packets back from the server, but sometimes a packet is lost.

So the client should wait up to one second for each packet to arrive. If a packet does not arrive within 1 second, the client should carry on sending the remaining packets.

How would I use .setSoTimeout to achieve this?

Code:

import java.io.*;
import java.net.*;
import java.util.*;

/*
 * Client to process ping requests over UDP.
 */
public class PingClient
{
    private static final int AVERAGE_DELAY = 100; // milliseconds

    public static void main(String[] args) throws Exception
    {
        // Get the command line arguments: host name and port.
        InetAddress address = InetAddress.getByName(args[0]);
        int port = Integer.parseInt(args[1]);
        System.out.println("Name " + args[0]);
        System.out.println("Port " + port);

        // Create a datagram socket for sending and receiving UDP packets.
        // (Bound to local port 1234; packets are sent to the port given on the command line.)
        DatagramSocket socket = new DatagramSocket(1234);

        for (int i = 0; i < 10; i++)
        {
            // Build the ping message: sequence number plus current timestamp.
            Calendar cal = Calendar.getInstance();
            String ping = "Ping " + i + " " + cal.getTimeInMillis() + "\r\n";
            byte[] buf = ping.getBytes("UTF-8");

            DatagramPacket packet = new DatagramPacket(buf, buf.length, address, port);
            socket.send(packet);
            Thread.sleep(10 * AVERAGE_DELAY); // 1 sec

            // Block until the host receives a UDP packet.
            DatagramPacket server_response = new DatagramPacket(new byte[1024], 1024);
            socket.setSoTimeout(1000); // I don't know how to use this
            socket.receive(server_response);

            // Print the received data.
            printData(server_response);
        }
    }

    private static void printData(DatagramPacket request) throws Exception
    {
        // Obtain a reference to the packet's array of bytes.
        byte[] buf = request.getData();
        // Wrap the bytes in a byte array input stream,
        // so that you can read the data as a stream of bytes.
        ByteArrayInputStream bais = new ByteArrayInputStream(buf);
        // Wrap the byte array input stream in an input stream reader,
        // so you can read the data as a stream of characters.
        InputStreamReader isr = new InputStreamReader(bais);
        // Wrap the input stream reader in a buffered reader,
        // so you can read the character data a line at a time.
        // (A line is a sequence of chars terminated by any combination of \r and \n.)
        BufferedReader br = new BufferedReader(isr);
        // The message data is contained in a single line, so read this line.
        String line = br.readLine();
        // Print the host address and the data received from it.
        System.out.println(
            "Received from " +
            request.getAddress().getHostAddress() +
            ": " +
            line);
    }
}


Solution

  • The javadoc for setSoTimeout says:

    With this option set to a non-zero timeout, a call to receive() for this DatagramSocket will block for only this amount of time. If the timeout expires, a java.net.SocketTimeoutException is raised, though the DatagramSocket is still valid.

    So, if you want to keep sending packets as long as no response has been received within 1 second, you just have to use:

    socket.setSoTimeout(1000);
    boolean continueSending = true;
    int counter = 0;
    while (continueSending && counter < 10) {
        // send to server omitted
        counter++;
        try {
            socket.receive(packet);
            continueSending = false; // a packet has been received : stop sending
        }
        catch (SocketTimeoutException e) {
            // no response received after 1 second. continue sending
        }
    }
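
    For the exact scenario in the question (send all 10 pings, waiting at most 1 second for each echo before moving on to the next one), the same try/catch pattern can be folded into the sending loop. Below is a minimal, self-contained sketch under those assumptions; the "Ping <seq> <timestamp>" payload and the host/port arguments come from the question, while the class name PingTimeoutClient and the 1024-byte receive buffer are just illustrative choices:

    import java.io.IOException;
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;
    import java.net.SocketTimeoutException;
    import java.nio.charset.StandardCharsets;

    /*
     * Sketch: sends 10 pings and waits at most 1 second for each echo.
     * (Class name and buffer size are illustrative, not from the question.)
     */
    public class PingTimeoutClient
    {
        public static void main(String[] args) throws IOException
        {
            InetAddress address = InetAddress.getByName(args[0]);
            int port = Integer.parseInt(args[1]);

            DatagramSocket socket = new DatagramSocket();
            socket.setSoTimeout(1000); // receive() now blocks for at most 1000 ms

            for (int i = 0; i < 10; i++)
            {
                // Build and send the ping ("Ping <seq> <timestamp>", as in the question).
                String ping = "Ping " + i + " " + System.currentTimeMillis() + "\r\n";
                byte[] data = ping.getBytes(StandardCharsets.UTF_8);
                socket.send(new DatagramPacket(data, data.length, address, port));

                // Wait for the echo; on timeout, just move on to the next ping.
                DatagramPacket reply = new DatagramPacket(new byte[1024], 1024);
                try
                {
                    socket.receive(reply);
                    String text = new String(reply.getData(), 0, reply.getLength(),
                                             StandardCharsets.UTF_8).trim();
                    System.out.println("Received from "
                            + reply.getAddress().getHostAddress() + ": " + text);
                }
                catch (SocketTimeoutException e)
                {
                    System.out.println("Packet " + i + " lost (no reply within 1 second)");
                }
            }
            socket.close();
        }
    }

    Note that the explicit Thread.sleep between sends is dropped in this sketch: the blocking receive() already pauses for up to the 1-second timeout before the next ping goes out.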