LRU cache implementation in JavaScript

Solution 1

Map operations should be O(1) in the average case in most implementations. Since a Map keeps insertion order, adding a bit of code around it will get you an LRU, and for most uses that should be plenty fast.

I needed a simple LRU cache for a small number of expensive operations (about 1 second each). I felt better about copy-pasting a small piece of code than introducing an external dependency, but since I couldn't find one, I wrote it:

class LRU {
    // Built on Map, which iterates keys in insertion order:
    // the first key is always the oldest.
    constructor(max = 10) {
        this.max = max;
        this.cache = new Map();
    }

    get(key) {
        const item = this.cache.get(key);
        // Compare against undefined so falsy values (0, '', false)
        // still refresh; note a stored undefined is indistinguishable
        // from a miss.
        if (item !== undefined) {
            // refresh key: delete and re-insert so it becomes newest
            this.cache.delete(key);
            this.cache.set(key, item);
        }
        return item;
    }

    set(key, val) {
        // refresh key
        if (this.cache.has(key)) this.cache.delete(key);
        // evict oldest
        else if (this.cache.size === this.max) this.cache.delete(this.first());
        this.cache.set(key, val);
    }

    first() {
        // oldest key = first in the Map's insertion order
        return this.cache.keys().next().value;
    }
}

Usage:

> let cache = new LRU(3)
> [1, 2, 3, 4, 5].forEach(v => cache.set(v, 'v:'+v))
> cache.get(2)
undefined
> cache.get(3)
"v:3"
> cache.set(6, 6)
> cache.get(4)
undefined
> cache.get(3)
"v:3"

Solution 2

This:

https://github.com/monsur/jscache

seems to fit your case, although setItem (i.e. put) is O(N) in the worst case; that happens if the cache is full on insertion. In that case the cache is searched to purge expired items or least-recently-used items. getItem is O(1), and expiry is handled on the getItem operation (i.e. if the item being fetched has expired, it is removed and null is returned).

The code is compact enough to be easily understood.

P.S. It might be useful to add to the constructor the option to specify the fillFactor, which is fixed at 0.75 (meaning that when the cache is purged, its size is reduced to at most 3/4 of the maximum size).
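For illustration, a purge along those lines might look like the sketch below. This is hypothetical code, not taken from jscache; the purge function and the entry fields (lastAccessed, expires) are assumptions for the example.

function purge(entries, maxSize, fillFactor = 0.75) {
    // Hypothetical sketch, not jscache's actual code. Entries are assumed
    // to carry a lastAccessed timestamp and an optional expires timestamp.
    const target = Math.floor(maxSize * fillFactor);
    const now = Date.now();

    // Drop expired items first.
    let live = entries.filter(e => e.expires === undefined || e.expires > now);

    // Still over the target? Evict the least recently used items.
    if (live.length > target) {
        live.sort((a, b) => b.lastAccessed - a.lastAccessed); // newest first
        live = live.slice(0, target);
    }
    return live;
}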

Solution 3

This is worth a mention: https://github.com/rsms/js-lru

The core set of functions is O(1), and the code is heavily commented (with ASCII art, too!)

Solution 4

The monsur.com implementation is O(n) on insertion only because it has items which actually expire in real-world time. It is not a simple LRU. If you only care about maintaining the most recently used items without regard to real-world time, it can be done in O(1).

A queue, implemented as a doubly linked list, is O(1) for insertion or deletion from the end, and this is all you should need for a cache. As for lookup, a hash map, which JavaScript makes pathetically easy, should be good for nearly O(1) lookup (assuming the JavaScript engine uses a good hash map; that's implementation-dependent, of course).

So you have a linked list of items with a hash map pointing to the items. Manipulate the ends of the linked list as needed to put new items and requested items on one end and remove old items from the other end.
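Here is a minimal sketch of that approach (the names and structure are mine, not from any particular library). A doubly linked list tracks usage order, and a Map points at the list nodes, so any entry, including one in the middle of the list, can be unlinked and moved to the front in O(1). That middle removal is exactly the part the comments below call out as the hard bit; the hash map makes it cheap because it hands you the node directly.

class LinkedLRU {
    constructor(max = 10) {
        this.max = max;
        this.nodes = new Map();   // key -> list node, for O(1) lookup
        this.head = null;         // most recently used
        this.tail = null;         // least recently used
    }

    // Detach a node from wherever it sits in the list: O(1) given the node.
    _unlink(node) {
        if (node.prev) node.prev.next = node.next; else this.head = node.next;
        if (node.next) node.next.prev = node.prev; else this.tail = node.prev;
    }

    // Attach a node at the head (most recently used position).
    _pushFront(node) {
        node.prev = null;
        node.next = this.head;
        if (this.head) this.head.prev = node;
        this.head = node;
        if (!this.tail) this.tail = node;
    }

    get(key) {
        const node = this.nodes.get(key);
        if (node === undefined) return undefined;
        this._unlink(node);       // may detach from the middle of the list
        this._pushFront(node);
        return node.val;
    }

    set(key, val) {
        let node = this.nodes.get(key);
        if (node !== undefined) {
            node.val = val;       // existing key: update value and refresh
            this._unlink(node);
        } else {
            if (this.nodes.size === this.max) {
                // evict the least recently used entry at the tail
                this.nodes.delete(this.tail.key);
                this._unlink(this.tail);
            }
            node = { key, val, prev: null, next: null };
            this.nodes.set(key, node);
        }
        this._pushFront(node);
    }
}

Eviction just reads the tail pointer, so nothing ever iterates the list; the Map is only used to jump straight to a node by key.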

Solution 5

This library, runtime-memcache, implements LRU and a few other caching schemes in JavaScript.

It uses a modified doubly linked list to achieve O(1) get, set, and remove. You can check out the implementation, which is pretty simple.

Author: Jason S

Updated on June 14, 2022

Comments

  • Jason S
    Jason S almost 2 years

    Java has LinkedHashMap which gets you 99% there to an LRU cache.

    Is there a JavaScript implementation of an LRU cache, preferably from a reputable source, that is:

    1. understandable
    2. efficient (amortized O(1) get/put/delete)?

    I've been searching the web but couldn't find one; I thought I found one in Ajax Design Patterns, but it glosses over the sendToTail() method and has O(n) performance (presumably, since the queue and associative array are split up).

    I suppose I could write my own, but I've learned the hard way that reinventing the wheel for core algorithms can be hazardous to one's health :/

  • Jason S
    Jason S almost 15 years
    thanks, I did run across that. It seemed like it had too many bells & whistles for my application (not to mention the phrase ASP.NET which is a huge red flag in my mind), but maybe I should give it another look.
  • Sam Saffron
    Sam Saffron almost 15 years
    +1 The implementation has nothing to do with ASP.NET; I think it is worth a look.
  • Jason S
    Jason S almost 14 years
    the linked list needs to be able to delete (but not insert) items from the middle if items are removed from the LRU cache and reinserted. That's the hard part; your answer seems to gloss over that.
  • Eloff
    Eloff about 13 years
    Redundantly, removing from the middle of a doubly linked list is O(n), which you would have to do to maintain the LRU invariant on access.
  • Mörre
    Mörre almost 7 years
    @Eloff, There is the additional hash map to get to any element anywhere in the list with O(1). But you and "Jason S" are right that manipulating the ends isn't enough, any item at any position in the list can be the next one that needs to go back to the front position, so while insertion is at one end removal can be from any position. Still, thanks to the hash map that can be done independent of the length of the list.
  • cnnr
    cnnr almost 4 years
    Please explain your answer with some additional information, not just a link to an external source.
  • Jason S
    Jason S almost 4 years
    interesting... could you comment on any testing / peer review that would help someone justify use of this library? (My question was from 2009 and I have no idea what I was doing at the time, but maybe your answer can help someone else)
  • odinho - Velmont
    odinho - Velmont almost 4 years
    Don't link to an external resource, inline your implementation in the answer.
  • Taran Goel
    Taran Goel almost 4 years
    how is your this.first() working? Map doesn't provide first()
  • odinho - Velmont
    odinho - Velmont almost 4 years
    @TaranGoel The .first() is implemented right there in the code. Just see below the set().
  • Codebling
    Codebling about 3 years
    If I'm not mistaken, this implementation will fail with multiple put/write to the same key -- every element in the cache will be for the same key, which is NOT how LRU should work
  • Codebling
    Codebling about 3 years
    This is the same answer as this other answer. The other answer was posted a year earlier by the original author
  • Kapil Raghuwanshi
    Kapil Raghuwanshi almost 3 years
    Awesome @odinho-Velmont, it seems like Map already possesses many capabilities to help us implement an LRU cache. I was trying with a simple JS object (as a hash table) and a doubly linked list, and it was becoming quite cumbersome.
  • John Vandivier
    John Vandivier about 2 years
    this fails the leetcode LRU problem. see: gist.github.com/Vandivier/ea2228f8f363417aa6392612e5399449
  • odinho - Velmont
    odinho - Velmont about 2 years
    @JohnVandivier Why? I registered on that site and used your example; it says "Accepted" for me with runtime 95ms and all test cases passing.
  • John Vandivier
    John Vandivier about 2 years
    ah well I can't say much other than it doesn't work for me, so I guess we see different results. I posted a screenshot in the gist above
  • Steven Tsao
    Steven Tsao about 2 years
    @odinho-Velmont John's answer works. I assume he's referring to the solution from the top of the thread where get returns undefined instead of -1 as expected by the test suite on LC.
  • odinho - Velmont
    odinho - Velmont about 2 years
    @StevenTsao Yeah, people need to adjust it to what they need, of course. :) Like returning -1 if not found, if that's what they require (like leetcode). Sure, the code won't work directly 1-to-1 copied into tests that require a slightly different API :P