Maintain a large dictionary in memory for Django-Python?

For choosing between Memcached and Redis: both are capable of tens of thousands of requests per second on low-end hardware (e.g. 80,000 req/s for Redis on a Core 2 Quad Q8300), with latencies well below 1 ms. You say you'll be doing something on the order of 20 requests a second, so performance-wise it's really a non-issue.
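For scale: a plain in-process Python dict lookup costs well under a microsecond, so 20 requests a second is trivial with any of these options. A rough timing sketch (the dict size and keys are illustrative, not from the question):

```python
import timeit

# Hypothetical large dataset; 100,000 entries is an assumption for illustration.
big = {f"key{i}": i for i in range(100_000)}

# Average cost of one dict lookup, in seconds.
per_lookup = timeit.timeit(lambda: big["key54321"], number=100_000) / 100_000
print(f"~{per_lookup * 1e9:.0f} ns per lookup")
```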

If you choose the dump.py option, you don't need to restart Django to reload it. You can make your own simple reloader:

dump.py:

    # [ dict code...]
    mtime = 0

Django code:

    import os

    import dump  # this does nothing if it's already loaded

    stat = os.stat(dump_filename)
    if stat.st_mtime > dump.mtime:
        reload(dump)
        dump.mtime = stat.st_mtime
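On Python 3 the `reload` builtin moved to `importlib`. A minimal sketch of the same mtime check wrapped in a helper (the function name is mine, not from the answer):

```python
import importlib
import os


def maybe_reload(module, filename):
    """Re-import the module only when its file on disk has changed.

    `module.mtime` is the timestamp the module stores on itself
    (the `mtime = 0` line in dump.py above).
    """
    mtime = os.stat(filename).st_mtime
    if mtime > module.mtime:
        module = importlib.reload(module)
        module.mtime = mtime
    return module
```

Call it at the top of any view that reads the dictionary; the stat call is cheap, and the reload only happens when the file has actually been rewritten.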


Memcached, though a great product, is trumped by Redis in my book. Redis offers lots of things that memcached doesn't, like persistence.

It also offers more complex data structures, like hashes. What does your particular data dump look like? How big is it, and how large / what type are the values?
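To sketch the difference (the redis-py calls and the "user:42" key shown in comments are my assumptions, not from the answer): with memcached you would typically store the whole dict as one pickled blob, while a Redis hash lets you read or update individual fields without deserializing everything:

```python
import pickle

# Hypothetical example record standing in for one entry of the data dump.
profile = {"name": "Alice", "visits": "17"}

# Whole-dict approach: one pickled blob per key (works with memcached too).
blob = pickle.dumps(profile)
restored = pickle.loads(blob)

# Redis hash approach (redis-py, assumed installed and a server running):
#   r = redis.Redis()
#   r.hset("user:42", mapping=profile)   # store each field separately
#   r.hget("user:42", "name")            # fetch a single field
```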


In the past, for a similar problem, I have used the dump.py idea. All of the other options would require a layer to convert the stored data into Python objects. However, this still depends on the data size and the amount of data you are handling. Memcached and Redis have better indexing and lookup when it comes to really large data sets and things like regex-based lookups. So my recommendation would be:

json -- if you are serving the data over HTTP to some other service

python file -- if the data structure is not too large and you don't need any special kind of lookups

memcache and redis -- if the data becomes really large
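For the JSON option, a minimal sketch of dumping the dict once and loading it back (the data and filename are stand-ins, not from the answer):

```python
import json
import os
import tempfile

# Stand-in data and path; the real dump and its location are assumptions.
data = {"alpha": 1, "beta": 2}
path = os.path.join(tempfile.gettempdir(), "dump.json")

# Write once, e.g. from a cron job or management command...
with open(path, "w") as f:
    json.dump(data, f)

# ...then load it here, or serve the file over HTTP to another service.
with open(path) as f:
    loaded = json.load(f)
```

Note that JSON only round-trips strings, numbers, lists, and dicts; anything fancier pushes you back toward a Python file or pickle.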