Tags: ruby-on-rails, performance, postgresql, ruby-on-rails-4, newrelic

Rails API requests extremely slow -> 300+ sec


My application suddenly became really slow, with response times of 300+ seconds. I have tried on two different local environments to make sure it is not being slowed down by the CPU/RAM or any other hardware component, but the problem remains.

Some context

I am using Ruby 2.0.0 and Rails 4.0.0, with Pow as the Rack server and a PostgreSQL database. The request used for this test is a basic API call to fetch all entries in an entries table. The table itself doesn't have many records (~12k).

Postman API request

And here is what the log shows for that request:

Started GET "/en/*****/api/v1/entries.json" for 127.0.0.1 at 2014-09-11 11:42:13 +0100
  ActiveRecord::SchemaMigration Load (0.7ms)  SELECT "schema_migrations".* FROM "schema_migrations"
Processing by Api::V1::EntriesController#index as JSON
  Parameters: {"subdomain"=>"cms", "not"=>[:new, :create, :edit, :update, :destroy], "locale"=>"en", "tenant"=>"*****"}
...
...
...
...
Completed 200 OK in 309341ms (Views: 307471.4ms | ActiveRecord: 1620.3ms)

Yes, you read that right... 307 seconds for the view rendering alone.

Entries action

Api::V1::EntriesController#index is fairly simple

  def index
    # Set Status
    @status = params[:status].present? ? status_id("#{params[:status]}") : [0..3]

    # Set Promoted
    @promoted = params[:promoted].present? ? params[:promoted] : [true, false]

    # Fetch Entries
    @entries = Entry.where(status_id: @status, promoted: @promoted)
                    .filter_by_tag(params[:tag])
                    .order("published_at DESC")
                    .paginate(page: params[:page], per_page: params[:limit])
    respond_with @entries
  end
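One subtle detail in this action: the `[0..3]` default is an Array containing a single Range, which ActiveRecord expands into a `BETWEEN 0 AND 3` clause (visible later in the SQL logs). A minimal plain-Ruby sketch of that defaulting logic — `status_filter` and the `Integer(...)` conversion are illustrative stand-ins for the asker's `status_id` helper, and plain truthiness replaces Rails' `present?`:

```ruby
# Plain-Ruby sketch of the status defaulting logic from the action above.
# `[0..3]` is an Array holding one Range; ActiveRecord turns a Range value
# into a BETWEEN clause, e.g.: "status_id" BETWEEN 0 AND 3
def status_filter(params)
  # `status_id(...)` in the original code is a custom helper; Integer(...)
  # is just an illustrative placeholder for it here.
  params[:status] ? [Integer(params[:status])] : [0..3]
end

status_filter({})            # no param given: fall back to the whole range
status_filter(status: "2")   # explicit param: a one-element array
```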

New Relic Preview

New Relic Details

And finally the SQL reports:

New Relic SQL reports

As you can see above, the SQL query durations and timestamps do not match. Even though the queries themselves don't take long, the timestamps increase very quickly (up to 300 seconds by the end). I will spare you the details, but the long list of queries (courtesy of acts-as-taggable-on) does not show any weird or unexpectedly long queries. The longest query takes 300ms.

development.rb

  config.cache_classes = false

  # Do not eager load code on boot.
  config.eager_load = false

  # Show full error reports and disable caching.
  config.consider_all_requests_local       = true
  config.action_controller.perform_caching = true

  # Don't care if the mailer can't send.
  config.action_mailer.raise_delivery_errors = false

  # Print deprecation notices to the Rails logger.
  config.active_support.deprecation = :log

  # Raise an error on page load if there are pending migrations
  config.active_record.migration_error = :page_load

  # Debug mode disables concatenation and preprocessing of assets.
  # This option may cause significant delays in view rendering with a large
  # number of complex assets.
  config.assets.debug = true

  # Do not compress assets
  config.assets.compress = false

  # Disable color for log file
  config.colorize_logging = false

Pow and Rails logs don't show anything wrong. I bypassed .dev in the proxy settings of my ethernet adapter.

Any idea what could cause my app to slow down that much?

Edit #1

Following apneadiving's reply, I removed all associations that could possibly be slowing down the request.

It is now simplified to render a plain JSON array of IDs (~300 records).

/app/controllers/api/v1/entries_controller.rb

def index

  # Set Status
  @status = params[:status].present? ? status_id("#{params[:status]}") : [0..3]

  # Set Promoted
  @promoted = params[:promoted].present? ? params[:promoted] : [true,false]

  # Fetch entries
  @entries = Entry.where(status_id: @status, promoted: @promoted)
  respond_with @entries
end

/app/serializers/entry_serializer.rb

class EntrySerializer < ActiveModel::Serializer
  attributes :id
end
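For reference, a plain-Ruby sketch (no Rails required) of the payload this serializer should produce — since it declares only `attributes :id`, each record collapses to a one-key hash before JSON encoding. The `Entry` Struct here is just a stand-in for the ActiveRecord model:

```ruby
require "json"

# Stand-in for the ActiveRecord Entry model; only :id matters to the serializer.
Entry = Struct.new(:id, :title)

entries = [Entry.new(1, "First"), Entry.new(2, "Second")]

# EntrySerializer declares `attributes :id`, so each record is reduced
# to { id: ... } before JSON encoding.
payload = entries.map { |e| { id: e.id } }
JSON.generate(payload)  # => '[{"id":1},{"id":2}]'
```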

The result is still more than 3 seconds for such a simple request...

Started GET "/en/*****/api/v1/entries.json?limit=2" for 127.0.0.1 at 2014-09-12 10:07:10 +0100
Processing by Api::V1::EntriesController#index as JSON
  Parameters: {"limit"=>"2", "subdomain"=>"cms", "not"=>[:new, :create, :edit, :update, :destroy], "locale"=>"en", "tenant"=>"*****"}
  Account Load (1.3ms)  SELECT "accounts".* FROM "accounts" WHERE "accounts"."domain" = '******' ORDER BY "accounts"."id" ASC LIMIT 1
   (0.6ms)  BEGIN
  SQL (2.2ms)  INSERT INTO "sessions" ("cookie", "created_at", "domain", "locale", "updated_at") VALUES ($1, $2, $3, $4, $5) RETURNING "id"  [["cookie", "Dn/aXXbbWG8t8A5ZYGVgsQ=="], ["created_at", Fri, 12 Sep 2014 09:07:11 UTC +00:00], ["domain", "*******"], ["locale", :en], ["updated_at", Fri, 12 Sep 2014 09:07:11 UTC +00:00]]
   (0.6ms)  COMMIT
  ApiKey Load (0.6ms)  SELECT "api_keys".* FROM "api_keys" WHERE "api_keys"."account_id" = 2 AND "api_keys"."access_token" IS NULL LIMIT 1
  ApiKey Load (0.6ms)  SELECT "api_keys".* FROM "api_keys" WHERE "api_keys"."account_id" = 2 AND "api_keys"."access_token" = '****************' LIMIT 1
  Account Load (0.8ms)  SELECT "accounts".* FROM "accounts" WHERE "accounts"."domain" = '*****' ORDER BY "accounts"."id" ASC LIMIT 1
  SQL (2.6ms)  UPDATE "api_keys" SET "count" = COALESCE("count", 0) + 1 WHERE "api_keys"."account_id" = 2 AND "api_keys"."id" = 2
  Entry Load (4.9ms)  SELECT "entries".* FROM "entries" WHERE "entries"."account_id" = 2 AND "entries"."promoted" IN ('t', 'f') AND (("entries"."status_id" BETWEEN 0 AND 3 OR 1=0))
Completed 200 OK in 3558ms (Views: 3246.6ms | ActiveRecord: 27.7ms)
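A quick back-of-the-envelope check on that log (assuming the ~300 records mentioned above): over 10ms of view time per record just to emit an id is far too much, which points at per-record serialization overhead rather than the database:

```ruby
# Figures taken from the log line above: 3246.6ms of view time for ~300 rows.
view_ms    = 3246.6
records    = 300
per_record = view_ms / records
# roughly 10.8 ms spent per record just to serialize { "id": ... }
```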

Indexes of the entries table:

# \d entries
                                               Table "public.entries"
      Column       |            Type             |                             Modifiers                             
-------------------+-----------------------------+-------------------------------------------------------------------
 id                | integer                     | not null default nextval('entries_id_seq'::regclass)
 title             | character varying(255)      | 
 slug              | character varying(255)      | 
 status_id         | integer                     | 
 promoted          | boolean                     | 
 published_at      | timestamp without time zone | default '2014-07-31 15:06:20.462154'::timestamp without time zone
 created_at        | timestamp without time zone | 
 updated_at        | timestamp without time zone | 
 account_id        | integer                     | 
 excerpt           | text                        | 
 user_id           | uuid                        | 
 extra             | hstore                      | 
 entry_views_count | integer                     | 
Indexes:
    "entries_pkey" PRIMARY KEY, btree (id)
    "index_entries_on_slug" UNIQUE, btree (slug)
    "entries_title" gin (to_tsvector('english'::regconfig, title::text))
    "index_entries_on_account_id" btree (account_id)
    "index_entries_on_promoted" btree (promoted)
    "index_entries_on_status_id" btree (status_id)
    "index_entries_on_user_id" btree (user_id)

Solution

  • You have to eager load data as much as possible to prevent N+1 queries:

    entries = Entry.where(status_id: @status, promoted: @promoted).filter_by_tag(params[:tag])
    entries = entries.includes(:entry_fields).order("published_at DESC")
    entries = entries.paginate( page: params[:page], per_page: params[:limit] )
    

    See the ActiveRecord::QueryMethods#includes documentation for details.
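To catch regressions like this early, one option (not part of the original answer) is the bullet gem, which logs N+1 queries in development as they happen. A hedged sketch, assuming `gem "bullet"` has been added to the Gemfile:

```ruby
# config/environments/development.rb — sketch of a Bullet setup that flags
# N+1 queries (like the acts-as-taggable-on lookups seen here) at request time.
config.after_initialize do
  Bullet.enable        = true   # turn detection on
  Bullet.bullet_logger = true   # write findings to log/bullet.log
  Bullet.rails_logger  = true   # also surface warnings in the Rails log
end
```

With this in place, hitting the entries endpoint would log a warning suggesting the same `includes(:entry_fields)` fix shown above.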