Locking pattern for proper use of .NET MemoryCache


This is my second iteration of the code. Because MemoryCache is thread-safe, you don't need to lock on the initial read: just read, and if the cache returns null, do the locked check to see if you need to create the string. It greatly simplifies the code.

const string CacheKey = "CacheKey";
static readonly object cacheLock = new object();

private static string GetCachedData()
{
    // Returns null if the string does not exist, which prevents a race condition
    // where the cache entry is invalidated between the Contains check and the retrieval.
    var cachedString = MemoryCache.Default.Get(CacheKey, null) as string;
    if (cachedString != null)
    {
        return cachedString;
    }

    lock (cacheLock)
    {
        // Check to see if anyone wrote to the cache while we were waiting our turn to write the new value.
        cachedString = MemoryCache.Default.Get(CacheKey, null) as string;
        if (cachedString != null)
        {
            return cachedString;
        }

        // The value still did not exist, so we now write it into the cache.
        var expensiveString = SomeHeavyAndExpensiveCalculation();
        CacheItemPolicy cip = new CacheItemPolicy()
        {
            AbsoluteExpiration = new DateTimeOffset(DateTime.Now.AddMinutes(20))
        };
        MemoryCache.Default.Set(CacheKey, expensiveString, cip);
        return expensiveString;
    }
}

EDIT: The code below is unnecessary, but I wanted to leave it to show the original method. It may be useful to future visitors who are using a different collection that has thread-safe reads but non-thread-safe writes (almost all of the classes under the System.Collections namespace are like that).

Here is how I would do it using ReaderWriterLockSlim to protect access. You need to do a kind of "double-checked locking" to see if anyone else created the cached item while we were waiting to take the lock.

const string CacheKey = "CacheKey";
static readonly ReaderWriterLockSlim cacheLock = new ReaderWriterLockSlim();

static string GetCachedData()
{
    // First take a read lock to see if the value already exists; this allows multiple readers at the same time.
    cacheLock.EnterReadLock();
    try
    {
        // Returns null if the string does not exist, which prevents a race condition
        // where the cache entry is invalidated between the Contains check and the retrieval.
        var cachedString = MemoryCache.Default.Get(CacheKey, null) as string;
        if (cachedString != null)
        {
            return cachedString;
        }
    }
    finally
    {
        cacheLock.ExitReadLock();
    }

    // Only one UpgradeableReadLock can exist at a time, but it can coexist with many ReadLocks.
    cacheLock.EnterUpgradeableReadLock();
    try
    {
        // Check again to see if the string was created while we were waiting to enter the upgradeable read lock.
        var cachedString = MemoryCache.Default.Get(CacheKey, null) as string;
        if (cachedString != null)
        {
            return cachedString;
        }

        // The entry still does not exist, so we need to create it and enter the write lock.
        var expensiveString = SomeHeavyAndExpensiveCalculation();
        cacheLock.EnterWriteLock(); // This will block until all the readers flush.
        try
        {
            CacheItemPolicy cip = new CacheItemPolicy()
            {
                AbsoluteExpiration = new DateTimeOffset(DateTime.Now.AddMinutes(20))
            };
            MemoryCache.Default.Set(CacheKey, expensiveString, cip);
            return expensiveString;
        }
        finally
        {
            cacheLock.ExitWriteLock();
        }
    }
    finally
    {
        cacheLock.ExitUpgradeableReadLock();
    }
}


There is an open source library [disclaimer: I wrote it], LazyCache, that IMO covers your requirement in two lines of code:

IAppCache cache = new CachingService();
var cachedResults = cache.GetOrAdd("CacheKey",
    () => SomeHeavyAndExpensiveCalculation());

It has built-in locking by default, so the cacheable delegate will only execute once per cache miss, and it uses a lambda so you can do "get or add" in one go. It defaults to a 20-minute sliding expiration.

There's even a NuGet package ;)


I've solved this issue by combining the AddOrGetExisting method on MemoryCache with Lazy initialization.

Essentially, my code looks something like this:

static string GetCachedData(string key, DateTimeOffset offset)
{
    Lazy<string> lazyObject = new Lazy<string>(() => SomeHeavyAndExpensiveCalculationThatReturnsAString());
    // AddOrGetExisting atomically stores our Lazy, or returns the one that is already cached.
    var returnedLazyObject = MemoryCache.Default.AddOrGetExisting(key, lazyObject, offset);
    if (returnedLazyObject == null)
        return lazyObject.Value;
    return ((Lazy<string>)returnedLazyObject).Value;
}

Worst case scenario here is that you create the same Lazy object twice. But that is pretty trivial. The use of AddOrGetExisting guarantees that you'll only ever get one instance of the Lazy object, and so you're also guaranteed to only call the expensive initialization method once.
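To make that reasoning concrete, here is a minimal console sketch (the call counter and the Main harness are mine, not part of the answer above) that races several threads through the pattern and counts how many times the expensive delegate actually runs. Because Lazy&lt;T&gt; defaults to thread-safe initialization and AddOrGetExisting keeps only the first Lazy stored, the delegate should execute exactly once:

```csharp
using System;
using System.Runtime.Caching;
using System.Threading;
using System.Threading.Tasks;

class LazyAddOrGetExistingDemo
{
    static int calls;

    static string ExpensiveCalculation()
    {
        // Count every real execution of the heavy work.
        Interlocked.Increment(ref calls);
        Thread.Sleep(100); // simulate heavy work
        return "expensive result";
    }

    static string GetCachedData(string key, DateTimeOffset offset)
    {
        // Several racing threads may each construct their own Lazy<string> here...
        var lazy = new Lazy<string>(ExpensiveCalculation);
        // ...but AddOrGetExisting atomically stores only the first one;
        // it returns null if ours won, otherwise the already-cached Lazy.
        var existing = (Lazy<string>)MemoryCache.Default.AddOrGetExisting(key, lazy, offset);
        return (existing ?? lazy).Value;
    }

    static void Main()
    {
        var tasks = new Task<string>[8];
        for (int i = 0; i < tasks.Length; i++)
            tasks[i] = Task.Run(() => GetCachedData("demo", DateTimeOffset.Now.AddMinutes(20)));
        Task.WaitAll(tasks);

        Console.WriteLine(calls);           // 1
        Console.WriteLine(tasks[0].Result); // expensive result
    }
}
```

All eight callers get the same string, and the counter shows the calculation ran once: the losing Lazy objects are created but their Value is never evaluated, which is exactly the "trivial" worst case described above.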