Limiting the max size of a HashMap in Java


Solution 1

Sometimes simpler is better.

import java.util.HashMap;
import java.util.Map;

public class InstrumentedHashMap<K, V> implements Map<K, V> {

    private final int max;
    private final Map<K, V> map;

    public InstrumentedHashMap(int max) {
        this.max = max;
        this.map = new HashMap<K, V>();
    }

    @Override
    public V put(K key, V value) {
        // Refuse new keys once the map is at capacity;
        // updates to existing keys still go through (null signals a refused put).
        if (map.size() >= max && !map.containsKey(key)) {
            return null;
        }
        return map.put(key, value);
    }

    // ... delegate the remaining Map methods to 'map'
}
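
For illustration, the wrapper's behavior at its limit (the max value of 2 is arbitrary):

Map<String, Integer> map = new InstrumentedHashMap<>(2);
map.put("a", 1);    // stored
map.put("b", 2);    // stored; the map is now at capacity
map.put("c", 3);    // refused: new key while full, returns null
map.put("a", 42);   // stored: updating an existing key is still allowed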

Solution 2

You could create a new class like this to limit the size of a HashMap:

import java.util.LinkedHashMap;
import java.util.Map;

public class MaxSizeHashMap<K, V> extends LinkedHashMap<K, V> {
    private final int maxSize;

    public MaxSizeHashMap(int maxSize) {
        this.maxSize = maxSize;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Called by LinkedHashMap after every put; returning true evicts the eldest entry
        return size() > maxSize;
    }
}
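
For example, entries past the limit evict the eldest in insertion order rather than being rejected (the keys and values here are arbitrary):

Map<String, String> map = new MaxSizeHashMap<>(2);
map.put("a", "1");
map.put("b", "2");
map.put("c", "3");  // evicts "a", the eldest entry; size stays at 2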

Solution 3

The simplest solution is often the best: use an unmodifiable or immutable map.

If you cannot add or remove elements, the size is fixed: problem solved.
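
For instance, a minimal sketch with the standard library (Map.copyOf requires Java 10+):

Map<String, Integer> base = new HashMap<>();
base.put("a", 1);
base.put("b", 2);

// A read-only view over 'base': put() and remove() throw UnsupportedOperationException
Map<String, Integer> fixed = Collections.unmodifiableMap(base);

// Or a truly immutable copy
Map<String, Integer> frozen = Map.copyOf(base);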

Solution 4

I tried setting the loadFactor to 0.0f in the constructor (meaning that I don't want the HashMap to grow in size EVER) but the constructor rejects this as invalid

A loadFactor of 1 means "don't grow until the HashMap is 100% full". A loadFactor of 0 would mean "grow on every insertion" if it were accepted, which is why the constructor rejects it.

From the HashMap docs:

The capacity is the number of buckets in the hash table, and the initial capacity is simply the capacity at the time the hash table is created. The load factor is a measure of how full the hash table is allowed to get before its capacity is automatically increased. When the number of entries in the hash table exceeds the product of the load factor and the current capacity, the hash table is rehashed (that is, internal data structures are rebuilt) so that the hash table has approximately twice the number of buckets.

Example: A HashMap initialized with default settings has a capacity of 16 and a load factor of 0.75. Capacity * load factor = 16 * 0.75 = 12. So adding the 13th item to the HashMap will cause it to grow to (approximately) 32 buckets.

Invalid example: A HashMap initialized with a capacity of 16 and a load factor of 0. Capacity * load factor = 16 * 0 = 0. So every attempt to add an item would trigger a rehash and doubling of size, until you ran out of memory.

What you originally wanted:

If the initial capacity is greater than the maximum number of entries divided by the load factor, no rehash operations will ever occur.

If you create a HashMap with a capacity M > N, a load factor of 1, and add N items, it will not grow.
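
For example, a minimal sketch of that sizing rule (the value of n is illustrative):

int n = 1000;  // maximum number of entries you plan to add
Map<String, String> map = new HashMap<>(n + 1, 1.0f);
// capacity (n + 1) > n / 1.0, so adding up to n entries never triggers a rehash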

Solution 5

A hand-rolled cache that wraps a LinkedHashMap behind a ReentrantReadWriteLock and evicts the eldest entry when full:

import java.util.LinkedHashMap;
import java.util.concurrent.locks.Lock;
import java.util.concurrent.locks.ReadWriteLock;
import java.util.concurrent.locks.ReentrantReadWriteLock;

public class Cache {
    private LinkedHashMap<String, String> cache;
    private final int cacheSize;
    private final ReadWriteLock readWriteLock;

    public Cache(LinkedHashMap<String, String> psCacheMap, int size) {
        // Assumes an insertion-ordered LinkedHashMap (the default)
        this.cache = psCacheMap;
        this.cacheSize = size;
        this.readWriteLock = new ReentrantReadWriteLock();
    }

    public void put(String sql, String pstmt) {
        Lock writeLock = readWriteLock.writeLock();
        writeLock.lock();
        try {
            // Evict the eldest entry once the cache is full
            if (cache.size() >= cacheSize && cacheSize > 0) {
                String oldSql = cache.keySet().iterator().next();
                cache.remove(oldSql);
            }
            cache.put(sql, pstmt);
        } finally {
            writeLock.unlock();
        }
    }

    public String get(String sql) {
        Lock readLock = readWriteLock.readLock();
        readLock.lock();
        try {
            return cache.get(sql);
        } finally {
            readLock.unlock();
        }
    }

    public boolean containsKey(String sql) {
        Lock readLock = readWriteLock.readLock();
        readLock.lock();
        try {
            return cache.containsKey(sql);
        } finally {
            readLock.unlock();
        }
    }

    public String remove(String key) {
        Lock writeLock = readWriteLock.writeLock();
        writeLock.lock();
        try {
            return cache.remove(key);
        } finally {
            writeLock.unlock();
        }
    }

    public LinkedHashMap<String, String> getCache() {
        return cache;
    }

    public void setCache(LinkedHashMap<String, String> cache) {
        this.cache = cache;
    }
}
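
For illustration, a possible usage of the class above (the SQL strings are placeholders):

Cache cache = new Cache(new LinkedHashMap<String, String>(), 2);
cache.put("SELECT 1", "stmt1");
cache.put("SELECT 2", "stmt2");
cache.put("SELECT 3", "stmt3");                    // evicts the eldest entry, "SELECT 1"
System.out.println(cache.containsKey("SELECT 1")); // false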
Author: andandandand
Updated on July 09, 2022

Comments

  • andandandand almost 2 years

    I want to limit the maximum size of a HashMap to take metrics on a variety of hashing algorithms that I'm implementing. I looked at the load factor in one of HashMap's overloaded constructors:

    HashMap(int initialCapacity, float loadFactor) 
    

    I tried setting the loadFactor to 0.0f in the constructor (meaning that I don't want the HashMap to grow in size EVER), but the constructor rejects this at runtime:

    Exception in thread "main" java.lang.IllegalArgumentException: Illegal load factor: 0.0
            at java.util.HashMap.<init>(HashMap.java:177)
            at hashtables.CustomHash.<init>(Main.java:20)
            at hashtables.Main.main(Main.java:70) Java Result: 1
    

    Is there another way to limit the size of a HashMap so it never grows?

  • matt burns almost 12 years
    This answer limits the maximum size of your Map. See Margus' answer for a simpler Map that prevents putting or removing entries.
  • Rishi Dua about 10 years
    Not always better: HashMaps are most often used when there is a large amount of data to be handled in memory. Memory consumption can become an issue when using immutable hashmaps.
  • Alaa M. almost 4 years
    Just to clarify, if I understand correctly, this way when you insert a new element, it just removes the eldest element from the map, and inserts the new one instead, thus limiting the size to maxSize. It's not that it doesn't let you add new elements.
  • Sriman almost 3 years
    @mattburns Isn't that the question? Or has the question been rephrased since your comment?
  • matt burns almost 3 years
    @sriman, well, yeah, that matches the question title, but not the detailed question description. OP wanted it to never grow, ever (e.g., be immutable). But 10 years on, people reading this will probably only be here because they searched for limiting the max capacity of hashmaps... Meh