Tags: java, multithreading, rest, caching, jersey-1.0

Can a Jersey web service sometimes process requests synchronously and sometimes asynchronously?


Here is the basic method I'm trying to write using Jersey 1.8 and Java 6.

@GET
@Path("/{teamId}")
@Produces(MediaType.TEXT_PLAIN)
public String getTeamInfo(@PathParam("teamId") final int teamId,
                          @Context HttpServletRequest httpServletRequest) {
    if (!updatingCache) {
        // cache is current: do some work and return the result
        return info;
    } else {
        // cache is being updated: add this request to a queue and respond
        // once the update finishes
        return info;
    }
}

Most of the time the method should behave like a basic synchronous REST service, but while the cache is being updated, incoming requests should be queued and processed once the update finishes.

I thought about adding the entire HttpServletRequest to a queue, but the method would still block before returning the info, and every request that arrives while the cache is updating needs to be queued.

When the service starts, it spins up a background thread that uses JeroMQ (a pure-Java implementation of ZeroMQ) to listen for messages telling it to update the cache.
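For illustration, here is a minimal sketch of what that listener thread looks like with JeroMQ; the endpoint, subscription topic, and refreshCache() call are placeholders, not my actual setup:

import org.zeromq.ZMQ;

// Background thread that subscribes for "update the cache" messages.
Thread listener = new Thread(new Runnable() {
    @Override
    public void run() {
        ZMQ.Context context = ZMQ.context(1);
        ZMQ.Socket subscriber = context.socket(ZMQ.SUB);
        subscriber.connect("tcp://localhost:5556");   // placeholder endpoint
        subscriber.subscribe("cache".getBytes());     // placeholder topic
        while (!Thread.currentThread().isInterrupted()) {
            String message = subscriber.recvStr();    // blocks until a message arrives
            refreshCache(message);                    // hypothetical cache-update hook
        }
        subscriber.close();
        context.term();
    }
});
listener.setDaemon(true);
listener.start();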

What's the best way to handle this situation?

PS, I'm an intern with 4 weeks of experience :)

Update:

I think the @Suspended annotation along with Jersey's AsyncResponse class would accomplish what I'm looking for. Unfortunately that class was introduced in Jersey 2.0 and I have to use 1.8. Can anyone suggest something similar to this?
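For reference, this is roughly what the Jersey 2.x pattern I'm describing would look like (a sketch only, since I can't use it on 1.8; pendingResponses is a hypothetical queue that a callback drains after the cache update):

import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.container.AsyncResponse;
import javax.ws.rs.container.Suspended;

@GET
@Path("/{teamId}")
public void getTeamInfo(@PathParam("teamId") final int teamId,
                        @Suspended final AsyncResponse asyncResponse) {
    if (!updatingCache) {
        // respond immediately while the cache is usable
        asyncResponse.resume(info);
    } else {
        // park the response; resume it once the cache update finishes
        pendingResponses.add(asyncResponse);
    }
}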


Solution

  • I was trying to solve the wrong problem. It turns out I didn't need to implement my own queue for HTTP requests, because the servlet container already queues them internally. All I needed was a thread-safe way to track how many requests were in flight, so that the cache is only updated while no requests are being processed. Because the cache update blocks the server from handling new requests, they queue up automatically, and once the lock is released they are processed in order; a sketch of one way to build that gate is below.
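One way to implement this kind of gate (a sketch, not my exact code; lookup() and refresh() are placeholders) is a fair ReentrantReadWriteLock: request threads hold the read lock while they work, and the updater takes the write lock, which is only granted once all in-flight requests have finished and keeps new requests waiting until the update is done.

import java.util.concurrent.locks.ReentrantReadWriteLock;

public class TeamCache {
    // Fair mode so waiting threads are granted the lock in roughly arrival order.
    private final ReentrantReadWriteLock lock = new ReentrantReadWriteLock(true);

    public String getTeamInfo(int teamId) {
        lock.readLock().lock();          // counts this request as "in flight"
        try {
            return lookup(teamId);       // read from the cache
        } finally {
            lock.readLock().unlock();
        }
    }

    public void updateCache() {
        lock.writeLock().lock();         // waits for in-flight requests to drain
        try {
            refresh();                   // rebuild the cache
        } finally {
            lock.writeLock().unlock();   // queued requests resume
        }
    }

    private String lookup(int teamId) { return "info"; }  // placeholder
    private void refresh() { }                            // placeholder
}

The read/write lock does the request counting for me: the write lock isn't granted until every reader has released, and any request arriving mid-update blocks on the read lock until the update completes.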