Tags: postgresql, sqlalchemy, connection-pooling

SQLAlchemy: connections aren't closed when the pool is overflowed


When I run ab (Apache Bench) against my site (SQLAlchemy and PostgreSQL behind an Apache web server), SQLAlchemy opens many connections to PostgreSQL and I get a "too many connections" error.

I traced the problem and found that the culprit is the pool (specifically QueuePool).

The documentation at http://www.sqlalchemy.org/docs/core/pooling.html#sqlalchemy.pool.Pool says that when the pool is full, returning connections (the extra ones opened because max_overflow allowed their creation) will be discarded and disconnected. But the connections are not actually being closed! They are silently dropped from the pool without being closed.

So SQLAlchemy continuously opens new connections, abandons them (without closing them!), and opens more.

Increasing the pool size is not a real solution; the problem is that the extra connections are never closed.

(The default QueuePool settings are pool_size=5 and max_overflow=10.)
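To make the expected behavior concrete, here is a minimal pure-Python sketch of the pool_size/max_overflow semantics the documentation describes. The class and method names (`MiniQueuePool`, `release`) are illustrative, not SQLAlchemy's actual internals; the point is that a connection returned to an already-full pool is an overflow connection and must really be closed, which is exactly the step the bug skipped.

```python
import queue


class MiniQueuePool:
    """Toy sketch of QueuePool semantics: pool_size idle slots plus
    up to max_overflow extra connections. Not SQLAlchemy's real code."""

    def __init__(self, creator, pool_size=5, max_overflow=10):
        self._creator = creator                  # opens a new raw connection
        self._pool = queue.Queue(maxsize=pool_size)
        self._max = pool_size + max_overflow     # hard cap on open connections
        self._opened = 0

    def connect(self):
        try:
            return self._pool.get_nowait()       # reuse an idle connection
        except queue.Empty:
            if self._opened >= self._max:
                raise RuntimeError("connection pool exhausted")
            self._opened += 1
            return self._creator()               # open an overflow connection

    def release(self, conn):
        try:
            self._pool.put_nowait(conn)          # back into the idle pool
        except queue.Full:
            # Pool already holds pool_size idle connections, so this one
            # is an overflow connection. It must actually be closed here --
            # the reported bug was that this close() never happened, so the
            # connection stayed open on the server side.
            conn.close()
            self._opened -= 1
```

With pool_size=2 and max_overflow=3, checking out five connections and returning all of them should leave two idle in the pool and close the other three; in the buggy behavior described above, those three would remain open on the PostgreSQL side until the server hits its connection limit.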


Solution

  • This looks like a bug in SQLAlchemy, fixed two weeks ago: http://hg.sqlalchemy.org/sqlalchemy/rev/aff95843c12f#l2.17

    There has been no release containing this fix yet, so you have to apply the patch manually.