Django persistent database connection

Django 1.6 added support for persistent connections (see the documentation for the latest stable Django):

Persistent connections avoid the overhead of re-establishing a connection to the database in each request. They're controlled by the CONN_MAX_AGE parameter, which defines the maximum lifetime of a connection. It can be set independently for each database.

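As a sketch, enabling this is a one-line settings change (the database name, credentials, and the 600-second lifetime below are illustrative):

```python
# settings.py (sketch; credentials and the 600-second lifetime are illustrative)
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql_psycopg2",
        "NAME": "mydb",
        "USER": "myuser",
        "PASSWORD": "secret",
        "HOST": "127.0.0.1",
        "PORT": "5432",
        # Reuse each connection for up to 600 seconds. The default of 0
        # closes the connection at the end of every request; None keeps
        # connections open indefinitely.
        "CONN_MAX_AGE": 600,
    }
}
```

Setting it to `None` gives unlimited persistent connections, at the cost of relying on the database to time out dead ones.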

Try PgBouncer - a lightweight connection pooler for PostgreSQL. Features:

  • Several levels of brutality when rotating connections:
    • Session pooling
    • Transaction pooling
    • Statement pooling
  • Low memory requirements (2k per connection by default).


In Django trunk, edit django/db/__init__.py and comment out the line:

signals.request_finished.connect(close_connection)

This signal handler causes Django to disconnect from the database after every request. I don't know what all of the side-effects of doing this will be, but it doesn't make any sense to start a new connection after every request; it destroys performance, as you've noticed.

I'm using this now, but I haven't done a full set of tests to see if anything breaks.

I don't know why everyone thinks this needs a new backend or a special connection pooler or other complex solutions. This seems very simple, though I don't doubt there are some obscure gotchas that made them do this in the first place--which should be dealt with more sensibly; 5ms of overhead for every request is quite a lot for a high-performance service, as you've noticed. (It takes me 150ms--I haven't figured out why yet.)

Edit: another necessary change is in django/middleware/transaction.py; remove the two transaction.is_dirty() tests and always call commit() or rollback(). Otherwise, it won't commit a transaction if the request only read from the database, which will leave locks open that should be closed.
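The change described above amounts to dropping the is_dirty() guards so commit() or rollback() always runs. Here is a runnable sketch of that logic; the `transaction` object is a minimal stand-in for django.db.transaction so the example is self-contained, whereas the real patch edits django/middleware/transaction.py directly:

```python
# Minimal stand-in for django.db.transaction, for illustration only.
class FakeTransaction:
    def __init__(self):
        self.committed = False
        self.rolled_back = False

    def commit(self):
        self.committed = True

    def rollback(self):
        self.rolled_back = True


transaction = FakeTransaction()


def process_response(request, response, exception=False):
    # The original middleware guarded these calls with
    # transaction.is_dirty(); the change is to commit or roll back
    # unconditionally, so read-only requests also release their locks.
    if exception:
        transaction.rollback()
    else:
        transaction.commit()
    return response
```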