ruby-on-rails · amazon-s3 · newrelic

Request to External Service (S3) takes too much time


I have struggled with this for 2 days now and I just cannot figure out what I should do. I am using Rails 4.2.3 and Ruby 2.2.1, if that helps in any way.

After installing the New Relic gem, I found out that the reason two of my controllers take an enormous time to respond (~17 seconds) is requests to Amazon S3.

Those pages just display 10 records using the ransack gem.

Here is a screenshot for you to see.

Any help would be greatly appreciated...

New Relic external services screenshot


Solution

  • It's hard to say exactly, but with 10 records, ~1.7 seconds per record seems fairly large. You should measure that outside Rails (e.g. request the same URLs directly in a browser).

    But generally, making 10 sequential requests in a controller is a bad thing. Even if each request were fairly fast, making 10 of them is likely to be by far the slowest part of that response. There are a few ways to avoid it:

    1. Spawn 10 threads, each fetching one record, then join them and render the view.
    2. Fetch each record asynchronously (non-blocking, so all 10 requests happen at once).
    3. If you can render the data with AJAX/XHR responses in the page, make the controller fetch only 1 item and have the app request all 10 separately (but you will hit the browser's max-connections-per-domain limit, so you won't get a full 10x speedup).
    4. If the browser does not need the response immediately, use a background task runner (there are many gems) and just tell the browser it's pending. This will avoid holding up your Rails server, but is likely to be even slower from a single user's perspective.

    Unfortunately, async support in Ruby/Rails is not great (I guess it needs more people demanding and contributing to it). I don't think option 2 is possible at all with the AWS SDK; you would need to look around for an HTTP client library that is truly async to do it at a low level (and not just a thread/fibre wrapper).
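The measurement suggested at the top (requesting the same URLs outside Rails) can also be scripted in plain Ruby, which removes the browser from the picture entirely. This is just a sketch; the URL in the usage comment is a placeholder, not one of the asker's actual S3 URLs:

```ruby
require "uri"
require "net/http"
require "benchmark"

# Time a single GET outside of Rails, to see how long one S3 object
# really takes without any Rails or view-rendering overhead.
def time_get(url_string)
  url = URI(url_string)
  Benchmark.realtime do
    Net::HTTP.start(url.host, url.port, use_ssl: url.scheme == "https") do |http|
      http.get(url.request_uri)
    end
  end
end

# Usage (placeholder URL -- substitute one of the S3 URLs New Relic flagged):
#   puts format("GET took %.2fs", time_get("https://your-bucket.s3.amazonaws.com/path/to/object"))
```

If a single GET here already takes ~1.7 seconds, the problem is the S3 round trip itself (region, object size, network), not the Rails code.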
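The thread-per-record approach (option 1 above) can be sketched roughly like this. `fetch_record` is a stand-in for whatever call actually hits S3 for one record (simulated here with a sleep so the sketch is runnable on its own):

```ruby
# Option 1: one thread per record, then join and collect the results.
# `fetch_record` is a placeholder for the real per-record S3 call.
def fetch_record(id)
  sleep 0.1 # stands in for the slow S3 round trip
  { id: id, body: "record #{id}" }
end

def fetch_all(ids)
  threads = ids.map { |id| Thread.new { fetch_record(id) } }
  threads.map(&:value) # join each thread and return its result, in order
end

records = fetch_all((1..10).to_a)
# Because the threads block on I/O concurrently, total wall time is
# roughly the cost of one fetch rather than ten.
```

This works even under MRI's GVL, because the threads spend their time blocked on network I/O rather than computing.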