I've implemented a controller method which makes a couple of requests to a third-party API, which is quite slow. Furthermore, I've used one of Thin's asynchronous features:
# This informs Thin that the request will be handled asynchronously
self.response_body = ''
self.status = -1
Thread.new do
  # This Rack response triple will be sent to the client
  env['async.callback'].call([200, {}, ["Response body"]])
end
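For completeness, this is roughly how the slow third-party requests sit inside that pattern in my controller (ReportsController, the URL and the JSON handling are simplified placeholders):

require 'net/http'
require 'json'

class ReportsController < ApplicationController
  def show
    # Tell Thin that the response will be produced later
    self.response_body = ''
    self.status = -1

    rack_env = request.env
    Thread.new do
      # The slow, blocking third-party calls happen off the request cycle
      raw = Net::HTTP.get(URI('https://api.example.com/slow/endpoint'))  # placeholder URL
      body = { result: JSON.parse(raw) }.to_json

      # Hand the finished Rack response triple back to Thin
      rack_env['async.callback'].call([200, { 'Content-Type' => 'application/json' }, [body]])
    end
  end
end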
However, I'm curious whether this could be implemented without using Thin, or more precisely, whether it could be accomplished with Apache/Phusion Passenger.
Any suggestions, pointers, links, comments or answers are appreciated. Thanks
I'm not sure whether this is possible now with Passenger 4. In this article they announced a complete redesign to support the evented model. Since they also have plans to support Node.js, I would expect the above method to work.
However, if you look at this post from them, they clearly say:
... There is another way to support high I/O concurrency though: multi-threading ...
And so this leaves multithreaded servers as the only serious options for handling streaming support in Rails apps....
Rails is just not designed for the evented process model, but it supports the multi-threaded model quite well. A multi-threaded setup can be achieved with Phusion Passenger Enterprise.
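A minimal sketch of the app-side configuration for that, for a Rails 3.x app (YourApp is a placeholder; on the Apache side, Passenger Enterprise is, as far as I know, switched to threads with PassengerConcurrencyModel thread and PassengerThreadCount):

# config/environments/production.rb (Rails 3.x)
YourApp::Application.configure do
  # Tell Rails it may serve requests from multiple threads concurrently;
  # with a threaded Passenger each request then gets its own thread, so a
  # blocking third-party call no longer stalls the whole process.
  config.threadsafe!
end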
Another option might be to extract this problem into another application (see the Railscast). So, for example, instead of directly calling the third-party API in your controller, which will spend most of its time blocking on the I/O call, you process the request in a background job. The user gets an immediate response and then subscribes to a Faye message channel. In your background job, once the third-party call has finished, you publish the response to that channel on Faye. Profit.
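A rough sketch of how that could look, assuming Sidekiq as the job runner (Resque or Delayed Job would work the same way) and a Faye server reachable at localhost:9292; the class names, channel layout and API URL are made-up placeholders:

require 'net/http'
require 'json'
require 'sidekiq'

# Background job that does the slow third-party call and then pushes the
# result to the browser through Faye.
class ThirdPartyApiJob
  include Sidekiq::Worker

  def perform(user_id)
    # The slow, blocking part now happens outside the web request
    result = Net::HTTP.get(URI('https://api.example.com/slow/endpoint'))

    # Publish to the channel the client subscribed to (one channel per user here)
    message = { channel: "/responses/#{user_id}", data: JSON.parse(result) }
    Net::HTTP.post_form(URI('http://localhost:9292/faye'), message: message.to_json)
  end
end

# In the controller: respond immediately and let the job do the work
class ReportsController < ApplicationController
  def create
    ThirdPartyApiJob.perform_async(current_user.id)
    head :accepted
  end
end

On the client side, the page would subscribe to /responses/<user_id> with Faye's JavaScript client and render the data once the message arrives.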