LRU implementation in production code


Solution 1

Recently I implemented an LRU cache using a linked list combined with a hash map that indexes into it.

    /// Typedef for URL/Entry pair
    typedef std::pair< std::string, Entry > EntryPair;

    /// Typedef for Cache list
    typedef std::list< EntryPair > CacheList;

    /// Typedef for URL-indexed map into the CacheList
    typedef boost::unordered_map< std::string, CacheList::iterator > CacheMap;

    /// Cache LRU list
    CacheList mCacheList;

    /// Cache map into the list
    CacheMap mCacheMap;

It has the advantage of being amortized O(1) for all important operations.

The insertion algorithm:

    // create a new entry
    Entry iEntry( ... );

    // push it to the front
    mCacheList.push_front( std::make_pair( aURL, iEntry ) );

    // add it to the cache map
    mCacheMap[ aURL ] = mCacheList.begin();

    // increase the count of entries
    mEntries++;

    // check if it's time to remove the last element
    if ( mEntries > mMaxEntries )
    {
        // erase the last cache list element from the map
        mCacheMap.erase( mCacheList.back().first );

        // erase it from the list
        mCacheList.pop_back();

        // decrease the count
        mEntries--;
    }
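The lookup side isn't shown above; here is a sketch of how a cache hit can be moved to the front in O(1) with `std::list::splice`. It uses `std::unordered_map` instead of `boost::unordered_map`, and the minimal `Entry` type and `insert`/`lookup` helper names are illustrative, not from the original post:

```cpp
#include <list>
#include <string>
#include <unordered_map>
#include <utility>

struct Entry { std::string data; };   // placeholder payload type

using EntryPair = std::pair<std::string, Entry>;
using CacheList = std::list<EntryPair>;
using CacheMap  = std::unordered_map<std::string, CacheList::iterator>;

CacheList mCacheList;
CacheMap  mCacheMap;

void insert(const std::string& aURL, const Entry& iEntry)
{
    mCacheList.push_front(std::make_pair(aURL, iEntry));
    mCacheMap[aURL] = mCacheList.begin();
}

// Look up aURL; on a hit, move the entry to the front of the LRU list.
Entry* lookup(const std::string& aURL)
{
    CacheMap::iterator it = mCacheMap.find(aURL);
    if (it == mCacheMap.end())
        return nullptr;   // miss

    // splice relinks the existing node without invalidating iterators,
    // so the iterator stored in the map stays valid
    mCacheList.splice(mCacheList.begin(), mCacheList, it->second);
    return &mCacheList.front().second;
}
```

The key property is that `splice` moves the node rather than copying it, which is why the map's stored iterators survive the reordering.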

Solution 2

Here is a very simple implementation of an LRU cache:

https://github.com/lamerman/cpp-lru-cache .

It's easy to use, and it's easy to understand how it works. The whole implementation is about 50 lines of code.

Solution 3

For simplicity, maybe you should consider using Boost's MultiIndex container. If we separate the key from the data, we can support multiple sets of keys on the same data.

From [ http://old.nabble.com/realization-of-Last-Recently-Used-cache-with-boost%3A%3Amulti_index-td22326432.html ]:

"...use two indexes: 1) hashed, for searching a value by key; 2) sequential, for tracking the least recently used items (the get function puts the item last in the sequence; if we need to remove items from the cache, we may delete them from the beginning of the sequence)."

Note that the "project" operator "allows the programmer to move between different indices of the same multi_index_container" efficiently.

Solution 4

This article describes implementation using a pair of STL containers (a key-value map plus a list for the key access history), or a single boost::bimap.
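A sketch of the pair-of-STL-containers variant (all names here are illustrative): the map stores each value together with an iterator into the access-history list, so a hit can be refreshed in O(1) list work (plus the O(log n) map lookup), and eviction takes the key at the front of the history:

```cpp
#include <iterator>
#include <list>
#include <map>
#include <string>
#include <utility>

using Key   = std::string;
using Value = int;

// front of history = least recently used key
std::list<Key> history;
std::map<Key, std::pair<Value, std::list<Key>::iterator>> store;

// Move a key's history node to the back (most recently used).
void touch(std::map<Key, std::pair<Value, std::list<Key>::iterator>>::iterator it)
{
    history.splice(history.end(), history, it->second.second);
}

void put(const Key& k, const Value& v, std::size_t maxEntries)
{
    auto it = store.find(k);
    if (it != store.end()) {             // existing key: update and refresh
        it->second.first = v;
        touch(it);
        return;
    }
    if (store.size() >= maxEntries) {    // evict the least recently used key
        store.erase(history.front());
        history.pop_front();
    }
    history.push_back(k);
    store[k] = std::make_pair(v, std::prev(history.end()));
}

const Value* get(const Key& k)
{
    auto it = store.find(k);
    if (it == store.end())
        return nullptr;
    touch(it);
    return &it->second.first;
}
```

Storing the list iterator inside the map entry is what avoids an O(n) scan of the history list on every access.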

Solution 5

In our production environment we use a C++ doubly linked list which is similar to the Linux kernel linked list. The beauty of it is that you can add an object to as many linked lists as you want, and list operations are fast and simple.
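A minimal sketch of such a kernel-style intrusive list (the names here are illustrative): the link nodes are embedded in the object, so the same object can sit on, say, an LRU list and a hash-bucket chain at once, and a `container_of`-style macro recovers the object from one of its nodes:

```cpp
#include <cstddef>

// Intrusive list node: the links live inside the object itself.
// A self-linked node represents an empty head or a detached node.
struct ListNode {
    ListNode* prev;
    ListNode* next;
    ListNode() : prev(this), next(this) {}
};

inline void list_insert(ListNode* head, ListNode* node)  // insert after head
{
    node->prev = head;
    node->next = head->next;
    head->next->prev = node;
    head->next = node;
}

inline void list_remove(ListNode* node)
{
    node->prev->next = node->next;
    node->next->prev = node->prev;
    node->prev = node->next = node;   // leave the node detached
}

// Recover the enclosing object from a pointer to its embedded node.
#define container_of(ptr, type, member) \
    reinterpret_cast<type*>(reinterpret_cast<char*>(ptr) - offsetof(type, member))

// One object, two independent list memberships.
struct CacheObject {
    int      id;
    ListNode lruLink;    // position in the LRU list
    ListNode hashLink;   // position in a hash bucket chain
};
```

Insertion and removal are a handful of pointer assignments with no allocation, which is where the "fast and simple" claim comes from.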

Author: sud03r

Graduate Student at University of Waterloo

Updated on September 07, 2020

Comments

  • sud03r over 3 years

    I have some C++ code where I need to implement cache replacement using LRU technique.
    So far I know two methods to implement LRU cache replacement:

    1. Using timeStamp for each time the cached data is accessed and finally comparing the timeStamps at time of replacement.
    2. Using a stack of cached items and moving them to the top if they are accessed recently, so finally the bottom will contain the LRU Candidate.

    So, which of these is better to use in production code?
    Are there any other, better methods?