
Using 'HttpContext.Current.Cache' safely


The cache object is thread-safe, but HttpContext.Current will not be available from background threads. This may or may not apply to you here; it isn't obvious from your code snippet whether you are actually using background threads. But in case you are now, or decide to at some point in the future, you should keep this in mind.

If there's any chance that you'll need to access the cache from a background thread, then use HttpRuntime.Cache instead.
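For example, here is a minimal sketch (the cache key, expiration, and LoadReportData method are placeholders, not taken from your code) of populating the cache from a thread-pool work item via HttpRuntime.Cache:

    using System;
    using System.Threading;
    using System.Web;
    using System.Web.Caching;

    public static class ReportCache
    {
        // HttpRuntime.Cache is the same underlying cache instance, but it does not
        // depend on HttpContext.Current, so it is safe to use from a background thread.
        public static void WarmUpInBackground()
        {
            ThreadPool.QueueUserWorkItem(_ =>
            {
                // HttpContext.Current would be null here; HttpRuntime.Cache still works.
                HttpRuntime.Cache.Insert(
                    "reportData",                   // hypothetical cache key
                    LoadReportData(),               // hypothetical expensive query
                    null,
                    DateTime.UtcNow.AddMinutes(10), // absolute expiration
                    Cache.NoSlidingExpiration);
            });
        }

        private static object LoadReportData()
        {
            // Placeholder for the real database query.
            return new object();
        }
    }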

In addition, although individual operations on the cache are thread-safe, a sequential lookup/store is not atomic. Whether you need it to be atomic depends on your particular application. If it would be a serious problem for the same query to run multiple times, for example because it would produce more load than your database can handle, or if it would be a problem for a request to return data that is immediately overwritten in the cache, then you would likely want to place a lock around the entire block of code.
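As a rough sketch, with GetBlablaFromDb, the "blabla" key, the List<string> element type, and the expiration standing in for your own code, the lock would wrap the whole check/query/insert sequence, with a re-check inside the lock:

    using System;
    using System.Collections.Generic;
    using System.Web;
    using System.Web.Caching;

    public static class QueryCache
    {
        private static readonly object SyncRoot = new object();

        public static List<string> GetBlabla()
        {
            // Fast path: no lock needed if the item is already cached.
            var cached = (List<string>)HttpRuntime.Cache["blabla"];
            if (cached != null)
                return cached;

            lock (SyncRoot)
            {
                // Re-check inside the lock: another thread may have populated
                // the cache while we were waiting.
                cached = (List<string>)HttpRuntime.Cache["blabla"];
                if (cached != null)
                    return cached;

                cached = GetBlablaFromDb();
                HttpRuntime.Cache.Insert(
                    "blabla",
                    cached,
                    null,
                    DateTime.UtcNow.AddMinutes(5),
                    Cache.NoSlidingExpiration);
                return cached;
            }
        }

        private static List<string> GetBlablaFromDb()
        {
            // Placeholder for the real database call.
            return new List<string>();
        }
    }

With this pattern the query runs at most once per expiration window; every other request either hits the fast path or waits briefly on the lock and then finds the freshly cached result.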

However, in most cases you would want to profile first and see whether this is actually a problem. Most web applications/services don't concern themselves with this aspect of caching, because they are stateless and it doesn't matter if the cache entry gets overwritten.


You are correct: the retrieve and add operations are not treated as an atomic transaction. If you need to prevent the query from running multiple times, you'll need to use a lock.

(Normally this wouldn't be much of a problem, but in the case of a long-running query it can be useful to relieve strain on the database.)


I believe the Add should be thread-safe; that is, it won't throw an error if Add gets called twice with the same key, but obviously the query might execute twice.

Another question, however, is whether the data itself is thread-safe. There is no guarantee that each List<blabla> is isolated; it depends on the cache provider. The in-memory cache provider stores the objects directly, so there is a risk of collisions if any of the threads edit the data (add/remove/swap items in the list, or change properties of one of the items). However, with a serializing provider you should be fine. Of course, this then demands that blabla is serializable...
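As a rough illustration, with Blabla standing in for your actual item type (an assumption, not your code), marking the type serializable keeps it compatible with serializing providers, and handing callers a copy of the cached list prevents them from mutating the instance that the in-memory provider stores directly:

    using System;
    using System.Collections.Generic;

    // Sketch only: "Blabla" is a placeholder for the cached item type.
    [Serializable]
    public class Blabla
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public static class BlablaCacheHelpers
    {
        public static List<Blabla> CopyForCaller(List<Blabla> cached)
        {
            // Shallow copy: the list itself is isolated from the cached one,
            // but the individual items are still shared references.
            return new List<Blabla>(cached);
        }
    }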