
Is MemoryCache scope session or application wide?


From MSDN:

The main differences between the Cache and MemoryCache classes are that the MemoryCache class has been changed to make it usable by .NET Framework applications that are not ASP.NET applications. For example, the MemoryCache class has no dependencies on the System.Web assembly. Another difference is that you can create multiple instances of the MemoryCache class for use in the same application and in the same AppDomain instance.

Reading that and doing some investigation in the reflected code, it is obvious that MemoryCache is just a simple class. You can use the MemoryCache.Default property to (re)use the same instance, or you can construct as many instances as you want (though the recommendation is to use as few as possible).

So basically the answer lies in your code.
If you use MemoryCache.Default then your cache lives as long as your application pool lives. (Just a reminder that the default application pool idle time-out is 20 minutes, which is less than 1 hour.)

If you create it using new MemoryCache(string, NameValueCollection), then the above-mentioned considerations apply, plus the context you create your instance in: if you create the instance inside a controller (which I hope is not the case), then your cache lives for one request.
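
To make the difference concrete, here is a minimal sketch. The CacheService class, the "reports" cache name and the one-hour expiration are just illustrative; the point is that both instances are held in static fields, so they live for the lifetime of the application rather than a single request:

    using System;
    using System.Runtime.Caching;

    public static class CacheService
    {
        // Application-wide cache: lives as long as the AppDomain / app pool lives.
        private static readonly ObjectCache AppCache = MemoryCache.Default;

        // A separate named instance; hold it in a static field so it is not
        // recreated (and its contents lost) on every request.
        private static readonly MemoryCache ReportCache = new MemoryCache("reports");

        public static void CacheForAnHour(string key, object value)
        {
            AppCache.Set(key, value, DateTimeOffset.Now.AddHours(1));
        }

        public static object GetReport(string key)
        {
            return ReportCache.Get(key);
        }
    }

Holding the instance in a static field is the important part; a MemoryCache constructed inside an action method becomes eligible for collection, along with everything cached in it, as soon as the request ends.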

It's a pity I can't find any references, but ... MemoryCache does not guarantee to hold data according to the cache policy you specify. In particular, if the machine your app is running on comes under memory pressure, your cached items might be discarded.

If you still have no luck figuring out the reason for early cache item invalidation, you can take advantage of the RemovedCallback on CacheItemPolicy and inspect the reason each item was removed.
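
For example, something along these lines logs the removal reason for every entry added through it (the key, expiration and logging target are just illustrative):

    using System;
    using System.Diagnostics;
    using System.Runtime.Caching;

    public static class DiagnosticCache
    {
        public static void AddWithDiagnostics(string key, object value)
        {
            var policy = new CacheItemPolicy
            {
                SlidingExpiration = TimeSpan.FromHours(1),
                // Called whenever the entry is removed, for any reason.
                RemovedCallback = args =>
                    Trace.TraceInformation(
                        "Cache entry '{0}' removed, reason: {1}",
                        args.CacheItem.Key,
                        args.RemovedReason) // Expired, Evicted, Removed, ...
            };

            MemoryCache.Default.Set(key, value, policy);
        }
    }

If the reason comes back as Evicted rather than Expired, that points to memory pressure rather than your policy.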


Reviewing this a year later, I found some more information relating to my original post about the cache 'dropping' randomly. MSDN states the following for the configurable cache properties CacheMemoryLimitMegabytes and PhysicalMemoryLimitPercentage:

The default value is 0, which means that the MemoryCache class's autosize heuristics are used by default.

After some decompiling and investigation, I found that there are predetermined scenarios deep in the CacheMemoryMonitor.cs class that define the memory thresholds. Here is a sampling of the comments in that class on the AutoPrivateBytesLimit property:

    // Auto-generate the private bytes limit:
    // - On 64bit, the auto value is MIN(60% physical_ram, 1 TB)
    // - On x86, for 2GB, the auto value is MIN(60% physical_ram, 800 MB)
    // - On x86, for 3GB, the auto value is MIN(60% physical_ram, 1800 MB)
    //
    // - If it's not a hosted environment (e.g. console app), the 60% in the above
    //   formulas will become 100% because in un-hosted environment we don't launch
    //   other processes such as compiler, etc.

It's not so much that the specific values are important as realizing why cache is often used: to store large objects that we don't want to fetch over and over. If these large objects are being stored in the cache and the hosting environment's memory thresholds based on these internal calculations are exceeded, you may have the item removed from the cache automatically. This could certainly explain my OP, because I was storing a very large collection in memory on a hosted server with probably 2 GB of memory running multiple apps in IIS.

There is an explicit way to override these values. You can set the CacheMemoryLimitMegabytes and PhysicalMemoryLimitPercentage values via configuration (or when setting up the MemoryCache instance, as sketched after the config sample below). Here is a modified sample from the MSDN documentation where I set physicalMemoryLimitPercentage to 95 (%):

    <configuration>
      <system.runtime.caching>
        <memoryCache>
          <namedCaches>
            <add name="default"
                 physicalMemoryLimitPercentage="95" />
          </namedCaches>
        </memoryCache>
      </system.runtime.caching>
    </configuration>
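
If you would rather configure the instance in code than in web.config, something like the following should work; the cache name and the particular limit values here are just illustrative, and the settings keys mirror the attributes of the namedCaches add element:

    using System.Collections.Specialized;
    using System.Runtime.Caching;

    // Configure limits when constructing the instance instead of via config.
    var settings = new NameValueCollection();
    settings.Add("physicalMemoryLimitPercentage", "95");
    settings.Add("cacheMemoryLimitMegabytes", "0");   // 0 = use the autosize heuristics
    settings.Add("pollingInterval", "00:02:00");      // how often the limits are checked

    var cache = new MemoryCache("default", settings);

Note that raising the limits only changes when the monitor starts trimming; it does not exempt your entries from eviction if the box genuinely runs out of memory.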