MemoryCache does not obey memory limits in configuration


Solution 1

Wow, so I just spent entirely too much time digging around in the CLR with Reflector, but I think I finally have a good handle on what's going on here.

The settings are being read in correctly, but there seems to be a deep-seated problem in the CLR itself that looks like it will render the memory limit setting essentially useless.

The following code is reflected out of the System.Runtime.Caching DLL, for the CacheMemoryMonitor class (there is a similar class that monitors physical memory and deals with the other setting, but this is the more important one):

protected override int GetCurrentPressure()
{
  int num = GC.CollectionCount(2);
  SRef ref2 = this._sizedRef;
  if ((num != this._gen2Count) && (ref2 != null))
  {
    this._gen2Count = num;
    this._idx ^= 1;
    this._cacheSizeSampleTimes[this._idx] = DateTime.UtcNow;
    this._cacheSizeSamples[this._idx] = ref2.ApproximateSize;
    IMemoryCacheManager manager = s_memoryCacheManager;
    if (manager != null)
    {
      manager.UpdateCacheSize(this._cacheSizeSamples[this._idx], this._memoryCache);
    }
  }
  if (this._memoryLimit <= 0L)
  {
    return 0;
  }
  long num2 = this._cacheSizeSamples[this._idx];
  if (num2 > this._memoryLimit)
  {
    num2 = this._memoryLimit;
  }
  return (int) ((num2 * 100L) / this._memoryLimit);
}

The first thing you might notice is that it doesn't even try to look at the size of the cache until after a Gen 2 garbage collection; until then it just falls back on the stored size value in cacheSizeSamples. So you won't ever be able to hit the target exactly, but if the rest worked we would at least get a size measurement before we got into real trouble.
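To illustrate that gating (this is my own sketch, not part of the cache API): GetCurrentPressure() only re-samples when GC.CollectionCount(2) has changed, so forcing a full collection yourself makes the monitor's next poll take a fresh size sample instead of reusing the stale one.

```csharp
using System;

internal static class Gen2Demo
{
    internal static void Main()
    {
        int before = GC.CollectionCount(2);

        // Force a full (Gen 2) collection; the next GetCurrentPressure()
        // call will then see a changed collection count and re-sample.
        GC.Collect(2, GCCollectionMode.Forced);
        GC.WaitForPendingFinalizers();

        Console.WriteLine(GC.CollectionCount(2) > before); // True
    }
}
```

This only refreshes the sample, of course; it does nothing about the accuracy problems described next.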

So assuming a Gen 2 GC has occurred, we run into problem 2, which is that ref2.ApproximateSize does a horrible job of actually approximating the size of the cache. Slogging through CLR junk, I found that this is a System.SizedReference, and this is what it does to get the value (the IntPtr is a handle to the MemoryCache object itself):

[SecurityCritical]
[MethodImpl(MethodImplOptions.InternalCall)]
private static extern long GetApproximateSizeOfSizedRef(IntPtr h);

I'm assuming that extern declaration means it dives into unmanaged Windows land at this point, and I have no idea how to start finding out what it does there. From what I've observed, though, it does a horrible job of approximating the size of the overall thing.

The third noticeable thing is the call to manager.UpdateCacheSize, which sounds like it should do something. Unfortunately, in any normal use of this class, s_memoryCacheManager will always be null. The field is set from the public static member ObjectCache.Host, which is exposed for users to override if they so choose. I was actually able to make this thing sort of work as intended by slapping together my own IMemoryCacheManager implementation, setting it on ObjectCache.Host, and running the sample. At that point, though, it seems like you might as well write your own cache implementation and not bother with all this, especially since ObjectCache.Host is static, so it affects every MemoryCache in the process, and I have no idea whether hooking it to measure the cache could break other things.
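To make that concrete, here is a rough sketch of the workaround (CacheManagerHost is my own hypothetical class, not anything shipped with the framework). ObjectCache.Host is an IServiceProvider that the cache queries for IMemoryCacheManager, and it must be assigned before any MemoryCache instance is created:

```csharp
using System;
using System.Runtime.Caching;
using System.Runtime.Caching.Hosting;

// Hypothetical host object. ObjectCache.Host is an IServiceProvider that the
// cache queries for IMemoryCacheManager, so one class can play both roles.
internal class CacheManagerHost : IServiceProvider, IMemoryCacheManager
{
    private readonly long _limitBytes;

    public CacheManagerHost(long limitBytes)
    {
        _limitBytes = limitBytes;
    }

    public object GetService(Type serviceType)
    {
        // Hand the cache this manager when it asks for one.
        return serviceType == typeof(IMemoryCacheManager) ? this : null;
    }

    public void UpdateCacheSize(long size, MemoryCache cache)
    {
        // Called with the sampled size (after a Gen 2 GC, per the reflected
        // code above); trim aggressively if we are over our own limit.
        if (size > _limitBytes)
        {
            cache.Trim(10); // evict roughly 10% of entries
        }
    }

    public void ReleaseCache(MemoryCache cache)
    {
        // Nothing to clean up in this sketch.
    }
}

// Must run before the first MemoryCache is constructed; it is process-wide:
// ObjectCache.Host = new CacheManagerHost(10L * 1024 * 1024);
```

Note that this still depends on the flawed ApproximateSize sampling described above, so it only mitigates the problem rather than fixing it.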

I have to believe that at least part of this (if not a couple parts) is just a straight up bug. It'd be nice to hear from someone at MS what the deal was with this thing.

TLDR version of this giant answer: assume that CacheMemoryLimitMegabytes is completely busted at this point in time. You can set it to 10 MB and then proceed to fill the cache up to ~2 GB, hitting an out-of-memory exception without ever triggering item removal.

Solution 2

I know this answer is crazy late, but better late than never. I wanted to let you know that I wrote a version of MemoryCache that works around the Gen 2 collection issue automatically: it trims whenever the polling interval indicates memory pressure. If you're experiencing this issue, give it a go!

http://www.nuget.org/packages/SharpMemoryCache

You can also find it on GitHub if you're curious about how I solved it. The code is somewhat simple.

https://github.com/haneytron/sharpmemorycache

Solution 3

I've encountered this issue as well. I'm caching objects that are being fired into my process dozens of times per second.

I have found that the following configuration and usage evicts the items every 5 seconds most of the time.

App.config:

Take note of cacheMemoryLimitMegabytes. When this was set to zero, the purging routine would not fire in a reasonable time.

   <system.runtime.caching>
    <memoryCache>
      <namedCaches>
        <add name="Default" cacheMemoryLimitMegabytes="20" physicalMemoryLimitPercentage="0" pollingInterval="00:00:05" />
      </namedCaches>
    </memoryCache>
  </system.runtime.caching>  

Adding to cache:

MemoryCache.Default.Add(someKeyValue, objectToCache, new CacheItemPolicy { AbsoluteExpiration = DateTime.Now.AddSeconds(5), RemovedCallback = cacheItemRemoved });

Confirming the cache removal is working:

void cacheItemRemoved(CacheEntryRemovedArguments arguments)
{
    System.Diagnostics.Debug.WriteLine("Item removed from cache: {0} at {1}", arguments.CacheItem.Key, DateTime.Now.ToString());
}

Solution 4

I have done some testing with the example of @Canacourse and the modification of @woany, and I think a few critical calls are blocking the cleanup of the memory cache.

public void CacheItemRemoved(CacheEntryRemovedArguments Args)
{
    // this WriteLine() will block the thread of
    // the MemoryCache long enough to slow it down,
    // and it will never catch up the amount of memory
    // beyond the limit
    Console.WriteLine("...");

    // ...

    // this ReadKey() will block the thread of 
    // the MemoryCache completely, till you press any key
    Console.ReadKey();
}

But why does the modification of @woany seem to keep the memory at the same level? First, the RemovedCallback is not set, so there is no console output or waiting for input that could block the thread of the memory cache.

Secondly...

public void AddItem(string Name, string Value)
{
    // ...

    // this WriteLine will block the main thread long enough,
    // so that the thread of the MemoryCache can do its work more frequently
    Console.WriteLine("...");
}

A Thread.Sleep(1) every ~1000th AddItem() would have the same effect.

Well, it's not a very deep investigation of the problem, but it looks as if the thread of the MemoryCache does not get enough CPU time for cleanup while many new elements are being added.
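The throttling idea described above can be sketched like this (a hypothetical tight-fill loop against MemoryCache.Default, not the exact code from the question):

```csharp
using System;
using System.Runtime.Caching;
using System.Threading;

internal static class ThrottledFill
{
    internal static void Main()
    {
        MemoryCache cache = MemoryCache.Default;

        for (int i = 0; i < 100_000; i++)
        {
            cache.Add(Guid.NewGuid().ToString(),
                      Guid.NewGuid().ToString(),
                      DateTimeOffset.Now.AddSeconds(5));

            // Brief yield every ~1000 adds so the cache's internal
            // polling/trimming thread gets CPU time to run.
            if (i % 1000 == 0)
            {
                Thread.Sleep(1);
            }
        }
    }
}
```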

Solution 5

I (thankfully) stumbled across this useful post yesterday when first attempting to use the MemoryCache. I thought it would be a simple case of setting values and using the class, but I encountered the issues outlined above. To try to see what was going on, I extracted the source using ILSpy, then set up a test and stepped through the code. My test code was very similar to the code above, so I won't post it.

From my tests I noticed that the measurement of the cache size was never particularly accurate (as mentioned above) and, given the current implementation, would never work reliably. However, the physical measurement was fine, and if the physical memory were measured at every poll, it seemed to me the code would work reliably. So I removed the Gen 2 garbage collection check within MemoryCacheStatistics; under normal conditions no memory measurements are taken unless there has been another Gen 2 garbage collection since the last measurement.

In a test scenario this obviously makes a big difference, as the cache is being hit constantly, so objects never have the chance to reach Gen 2. I think we are going to use the modified build of this DLL on our project and switch to the official MS build when .NET 4.5 comes out (which, according to the Connect article mentioned above, should contain the fix). Logically I can see why the Gen 2 check was put in place, but in practice I'm not sure it makes much sense: if the memory reaches 90% (or whatever limit it has been set to), it should not matter whether a Gen 2 collection has occurred; items should be evicted regardless.

I left my test code running for about 15 minutes with physicalMemoryLimitPercentage set to 65%. I saw the memory usage remain between 65-68% during the test and saw things getting evicted properly. In my test I set pollingInterval to 5 seconds, physicalMemoryLimitPercentage to 65, and cacheMemoryLimitMegabytes to 0 to leave it at its default.

Following the above advice, an implementation of IMemoryCacheManager could be made to evict items from the cache. It would, however, suffer from the Gen 2 check issue mentioned. Depending on the scenario, that may not be a problem in production code, and this approach may work well enough for some.

Author: Canacourse

Updated on December 08, 2020

Comments

  • Canacourse
    Canacourse over 3 years

    I’m working with the .NET 4.0 MemoryCache class in an application and trying to limit the maximum cache size, but in my tests it does not appear that the cache is actually obeying the limits.

    I'm using the settings which, according to MSDN, are supposed to limit the cache size:

    1. CacheMemoryLimitMegabytes: "The maximum memory size, in megabytes, that an instance of an object can grow to."
    2. PhysicalMemoryLimitPercentage: "The percentage of physical memory that the cache can use, expressed as an integer value from 1 to 100. The default is zero, which indicates that MemoryCache instances manage their own memory based on the amount of memory that is installed on the computer." (This is not entirely correct: any value below 4 is ignored and replaced with 4.)

    I understand that these values are approximate and not hard limits, as the thread that purges the cache fires every x seconds and also depends on the polling interval and other undocumented variables. However, even taking these variances into account, I'm seeing wildly inconsistent cache sizes when the first item is evicted from the cache after setting CacheMemoryLimitMegabytes and PhysicalMemoryLimitPercentage together or individually in a test app. To be sure, I ran each test 10 times and calculated the average figure.

    These are the results of testing the example code below on a 32-bit Windows 7 PC with 3 GB of RAM. The size of the cache is taken after the first call to CacheItemRemoved() in each test. (I am aware the actual size of the cache will be larger than this.)

    MemLimitMB    MemLimitPct     AVG Cache MB on first expiry    
       1            NA              84
       2            NA              84
       3            NA              84
       6            NA              84
      NA             1              84
      NA             4              84
      NA            10              84
      10            20              81
      10            30              81
      10            39              82
      10            40              79
      10            49              146
      10            50              152
      10            60              212
      10            70              332
      10            80              429
      10           100              535
     100            39              81
     500            39              79
     900            39              83
    1900            39              84
     900            41              81
     900            46              84
    
     900            49              1.8 GB approx. in task manager, no memory errors
     200            49              156
     100            49              153
    2000            60              214
       5            60              78
       6            60              76
       7           100              82
      10           100              541
    

    Here is the test application:

    using System;
    using System.Collections.Generic;
    using System.Collections.Specialized;
    using System.Linq;
    using System.Runtime.Caching;
    using System.Text;
    namespace FinalCacheTest
    {       
        internal class Cache
        {
            private Object Statlock = new object();
            private int ItemCount;
            private long size;
            private MemoryCache MemCache;
            private CacheItemPolicy CIPOL = new CacheItemPolicy();
    
            public Cache(long CacheSize)
            {
                CIPOL.RemovedCallback = new CacheEntryRemovedCallback(CacheItemRemoved);
                NameValueCollection CacheSettings = new NameValueCollection(3);
                CacheSettings.Add("CacheMemoryLimitMegabytes", Convert.ToString(CacheSize)); 
                CacheSettings.Add("physicalMemoryLimitPercentage", Convert.ToString(49));  //set % here
                CacheSettings.Add("pollingInterval", Convert.ToString("00:00:10"));
                MemCache = new MemoryCache("TestCache", CacheSettings);
            }
    
            public void AddItem(string Name, string Value)
            {
                CacheItem CI = new CacheItem(Name, Value);
                MemCache.Add(CI, CIPOL);
    
                lock (Statlock)
                {
                    ItemCount++;
                    size = size + (Name.Length + Value.Length * 2);
                }
    
            }
    
            public void CacheItemRemoved(CacheEntryRemovedArguments Args)
            {
                Console.WriteLine("Cache contains {0} items. Size is {1} bytes", ItemCount, size);
    
                lock (Statlock)
                {
                    ItemCount--;
                    size = size - 108;
                }
    
                Console.ReadKey();
            }
        }
    }
    
    namespace FinalCacheTest
    {
        internal class Program
        {
            private static void Main(string[] args)
            {
                int MaxAdds = 5000000;
                Cache MyCache = new Cache(1); // set CacheMemoryLimitMegabytes
    
                for (int i = 0; i < MaxAdds; i++)
                {
                    MyCache.AddItem(Guid.NewGuid().ToString(), Guid.NewGuid().ToString());
                }
    
                Console.WriteLine("Finished Adding Items to Cache");
            }
        }
    }
    

    Why is MemoryCache not obeying the configured memory limits?

  • Canacourse
    Canacourse over 12 years
    A fine answer thank you. I gave up trying to figure out what was going on with this and instead now manage the cache size by counting items in/out and calling .Trim() manually as needed. I thought System.Runtime.Caching was an easy choice for my app as it seems to be widely used and I thought therefore would not have any major bugs.
  • Canacourse
    Canacourse almost 12 years
    Are you saying it does or does not get trimmed?
  • Daniel Lidström
    Daniel Lidström almost 11 years
    Yep, it does get trimmed. Strange, considering all the problems people seem to have with MemoryCache. I wonder why this sample works.
  • Bruno Brant
    Bruno Brant over 10 years
    Wow. That's why I love SO. I ran into the exact same behavior, wrote a test app and managed to crash my PC many times even though polling time was as low as 10 seconds and cache memory limit was 1MB. Thanks for all the insights.
  • Bruno Brant
    Bruno Brant over 10 years
    An update: I'm using .NET framework 4.5 and in no way the problem is corrected. The cache can grow large enough to crash the machine.
  • Bruno Brant
    Bruno Brant over 10 years
    A question: do you have the link to the connect article you mentioned?
  • Bruno Brant
    Bruno Brant over 10 years
    I don't follow it. I tried repeating the example, but the cache still grows indefinitely.
  • Bruno Brant
    Bruno Brant over 10 years
    I know I just mentioned it up there in the question but, for completeness sake, I'll mention it here again. I've opened an issue at Connect for this. connect.microsoft.com/VisualStudio/feedback/details/806334/…
  • Svend
    Svend over 9 years
    I'm using the MemoryCache for external service data, and when I test by injecting garbage into the MemoryCache, it does auto-trim content, but only when using the percentage limit value. Absolute size does nothing to limit size, at least when inspecting with a memory profiler. Not tested in a while loop, but by more "realistic" usages (it's a backend system, so I've added a WCF service which lets me inject data into the caches on demand).
  • Karl Cassar
    Karl Cassar over 9 years
    This works as intended, tested out with a generator which fills up the cache with loads of strings of 1000 characters. Although adding what should be about 100 MB to the cache actually adds 200-300 MB, which I found quite strange. Maybe some overheads I'm not counting.
  • Haney
    Haney over 9 years
    @KarlCassar strings in .NET are roughly 2n + 20 in size with respect to bytes, where n is the length of the string. This is mostly due to Unicode support.
  • Bernhard
    Bernhard over 4 years
    A confusing example class: "Statlock", "ItemCount", and "size" are useless... The NameValueCollection(3) only holds 2 items?... In fact, you created a cache with size-limit and pollingInterval properties, nothing more! The problem of items not being evicted is not touched...
  • Павле
    Павле over 3 years
    Is this still an issue in .NET Core?