I have a simple Jetty server and I'm learning different approaches. I have this adapted from here:
public class BlockingServlet extends HttpServlet {

    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        response.setContentType("application/json");
        try {
            TimeUnit.SECONDS.sleep(10);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        response.setStatus(HttpServletResponse.SC_OK);
        response.getWriter().println("{ \"status\": \"ok\"}");
    }
}
This is supposed to be a blocking servlet. My understanding is that this is single-threaded, so if I sent 2 requests at the same time, the second would only be processed after the first completes.
However, when I actually test it out, both requests complete at the same time (almost as if it were already asynchronous). If so, why would I ever need `AsyncContext async = request.startAsync();`? What difference does it make?
I think you are mixing up two things: asynchronous and blocking.
Servlets are not single-threaded. In fact, Jetty initializes a thread pool during startup, and in the traditional approach a thread from that pool is assigned to each incoming request. That is why your two requests completed at the same time: each one blocked its own thread.
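You can see this effect with plain `java.util.concurrent`, independent of Jetty. This is a minimal sketch (the class and method names are made up for illustration): two blocking one-second tasks are submitted to a two-thread pool, standing in for two simultaneous requests handled by the container's request pool, and the total elapsed time is roughly one second, not two:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class PoolDemo {

    // Submits two blocking 1-second "requests" to a 2-thread pool
    // (standing in for the container's request pool) and returns the
    // total elapsed milliseconds.
    static long timeTwoBlockingTasks() throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(2);
        long start = System.nanoTime();
        Future<?> r1 = pool.submit(PoolDemo::blockOneSecond);
        Future<?> r2 = pool.submit(PoolDemo::blockOneSecond);
        r1.get();
        r2.get();
        pool.shutdown();
        return (System.nanoTime() - start) / 1_000_000;
    }

    // Simulates a request handler that blocks, like your doGet with sleep(10)
    static void blockOneSecond() {
        try {
            Thread.sleep(1000);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }

    public static void main(String[] args) throws Exception {
        // The two tasks overlap on separate threads, so this prints ~1000, not ~2000
        System.out.println(timeTwoBlockingTasks() + " ms");
    }
}
```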
However, this might not scale well if you need to serve thousands of concurrent requests, because each blocked request holds a thread for its entire duration.
Since Servlet 3.0, you can decouple request handling from the "request threads" (and attempt to make it non-blocking) by registering the request in a queue or list (I don't know exactly how AsyncContext is implemented) and processing the registered requests on your own "worker threads", which can be managed by a separate thread pool.
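The hand-off idea can be sketched with plain `java.util.concurrent`, without the Servlet API (class and method names here are hypothetical, and this is not Jetty's actual implementation): the caller, playing the role of the request thread, registers the slow work with a worker pool and returns immediately, while the response is produced later when the work finishes:

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class AsyncHandoffDemo {

    // Worker pool standing in for the threads that perform the slow work
    static final ExecutorService workers = Executors.newFixedThreadPool(4);

    // Simulates what startAsync() enables: the "request thread" calling this
    // method only registers the work and returns immediately; it does not
    // block for the duration of the slow operation.
    static CompletableFuture<String> handle(String requestId) {
        return CompletableFuture.supplyAsync(() -> {
            try {
                Thread.sleep(100); // the slow, blocking part runs on a worker thread
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
            return "{ \"status\": \"ok\" }"; // "response" completed later
        }, workers);
    }

    public static void main(String[] args) throws Exception {
        CompletableFuture<String> response = handle("req-1");
        // The caller (request thread) is free here to accept more requests;
        // we only wait at the end to observe the result.
        System.out.println(response.get());
        workers.shutdown();
    }
}
```

The point is that the thread that *accepted* the request and the thread that *does the work* are no longer the same, so a small request pool can keep accepting connections while slow work queues up on the workers.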
But that does not mean you cannot still write blocking code: if you do, it simply ends up blocking a worker thread instead of a request thread.