ruby-on-rails · postgresql · sidekiq

How can I prevent database connections from timing out in Rails?


I have a Rails system in which every half hour, the following is done:

  • There are 15 clients somewhere else on the network
  • The server creates a record called Measurement for each of these clients
  • The measurement records are configured and then run asynchronously via Sidekiq, using MeasurementWorker.perform_async(m.id) (a rough sketch of the worker is shown after this list)
  • The connection to the client is done with Celluloid actors and a WebSocket client
  • Each measurement, when run, creates a number of event records that are stored in the database
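
For context, the worker that runs a measurement looks roughly like this (a minimal sketch; run_measurement! is a placeholder for the actual Celluloid/WebSocket logic):

app/workers/measurement_worker.rb

class MeasurementWorker
  include Sidekiq::Worker
  sidekiq_options retry: false, backtrace: true

  def perform(measurement_id)
    # Each running job checks out one connection from ActiveRecord's pool
    measurement = Measurement.find(measurement_id)
    measurement.run_measurement! # placeholder: talks to the client and stores the event records
  end
end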

The system has been running well with 5 clients, but now that I am at 15, many of the measurements no longer run when I start them all at the same time; they fail with the following error (the 5 second wait is ActiveRecord's default checkout_timeout, i.e. the job gave up waiting for a free connection from the pool):

2015-02-04T07:30:10.410Z 35519 TID-owd4683iw MeasurementWorker JID-15f6b396ae9e3e3cb2ee3f66 INFO: fail: 5.001 sec
2015-02-04T07:30:10.412Z 35519 TID-owd4683iw WARN: {"retry"=>false, "queue"=>"default", "backtrace"=>true, "class"=>"MeasurementWorker", "args"=>[6504], "jid"=>"15f6b396ae9e3e3cb2ee3f66", "enqueued_at"=>1423035005.4078047}
2015-02-04T07:30:10.412Z 35519 TID-owd4683iw WARN: could not obtain a database connection within 5.000 seconds (waited 5.000 seconds)
2015-02-04T07:30:10.412Z 35519 TID-owd4683iw WARN: /home/webtv/.rbenv/versions/2.1.2/lib/ruby/gems/2.1.0/gems/activerecord-4.1.4/lib/active_record/connection_adapters/abstract/connection_pool.rb:190:in `block in wait_poll'
....

Now, my production environment looks like this:

config/sidekiq.yml

production:
  :verbose: false
  :logfile: ./log/sidekiq.log
  :poll_interval: 5
  :concurrency: 50

config/unicorn.rb

...
worker_processes Integer(ENV["WEB_CONCURRENCY"] || 3)
timeout 60
...

config/database.yml

production:
  adapter: postgresql
  database: ***
  username: ***
  password: ***
  host: 127.0.0.1
  pool: 50

postgresql.conf

max_connections = 100 # default

As you see, I've already increased the concurrency of Sidekiq to 50, to cater for a high number of possible concurrent measurements. I've set the database pool to 50, which already looks like overkill to me.
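
To rule out the database.yml pool value simply not reaching the Sidekiq process, a quick diagnostic like the following could go in an initializer (a sketch; it is not part of my current setup):

config/initializers/sidekiq.rb

Sidekiq.configure_server do |config|
  # Log the ActiveRecord pool size this Sidekiq process actually ended up with;
  # it should be at least as large as the :concurrency value in sidekiq.yml.
  Sidekiq.logger.info "ActiveRecord pool size: #{ActiveRecord::Base.connection_pool.size}"
end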

I should add that the server itself is quite powerful, with 8 GB RAM and a quad-core Xeon E5-2403 1.8 GHz.

What should these values ideally be set to? What formula can I use to calculate them? (E.g. number of maximum DB connections = Unicorn workers × Sidekiq concurrency × N)


Solution

  • It looks to me like your pool configuration of 100 is not taking effect. Each process will need a max of 50, so change 100 to 50. I don't know if you are using Heroku, but it is notoriously tough to configure the pool size there.

    Inside PostgreSQL, your max connection count should look like this:

    ((Unicorn processes) * 1) + ((sidekiq processes) * 50)
    

    Unicorn is single-threaded, so each worker never needs more than one connection unless you are spinning up your own threads in your Rails app for some reason.
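
    Applied to the setup above (3 Unicorn workers and one Sidekiq process with :concurrency: 50), that gives roughly:

    (3 * 1) + (1 * 50) = 53 connections

    which fits comfortably under the default max_connections = 100, provided the pool value in database.yml actually takes effect in the Sidekiq process.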