Easy, simple to use LRU cache in java

Note: this page is a translation of a popular StackOverflow question and is provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/224868/

java, caching, lru

Asked by Juraj

I know it's simple to implement, but I want to reuse something that already exists.

The problem I want to solve is that I load configuration (from XML, so I want to cache it) for different pages, roles, ... so the number of input combinations can grow quite large (but in 99% of cases it will not). To handle that 1%, I want some maximum number of items in the cache...

Till now I have found org.apache.commons.collections.map.LRUMap in Apache Commons, and it looks fine, but I also want to check something else. Any recommendations?

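For reference, a rough sketch of what the LRUMap mentioned above (org.apache.commons.collections.map.LRUMap from commons-collections 3.x) looks like in use; the composite key format and the capacity of 100 are just illustrative assumptions:

// LRUMap keeps at most maxSize entries and evicts the least recently used one.
// Note it is not synchronized, so guard it if it is shared between threads.
LRUMap cache = new LRUMap(100);                                  // max 100 entries
cache.put("page:home|role:admin", "<config>...</config>");      // illustrative key/value
Object config = cache.get("page:home|role:admin");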

Accepted answer by Guido

You can use a LinkedHashMap (Java 1.4+):

// Create cache
final int MAX_ENTRIES = 100;
// The third constructor argument (accessOrder = true) orders entries by most
// recent access, which is what gives least-recently-used eviction below.
Map<Object, Object> cache = new LinkedHashMap<Object, Object>(MAX_ENTRIES + 1, .75F, true) {
    // This method is called just after a new entry has been added
    protected boolean removeEldestEntry(Map.Entry<Object, Object> eldest) {
        return size() > MAX_ENTRIES;
    }
};

// Add to cache
Object key = "key";
Object value = "value";
cache.put(key, value);

// Get object
Object o = cache.get(key);
if (o == null && !cache.containsKey(key)) {
    // Object not in cache. If null is not a possible value in the cache,
    // the call to cache.containsKey(key) is not needed
}

// If the cache is to be used by multiple threads,
// the cache must be wrapped with code to synchronize the methods
cache = Collections.synchronizedMap(cache);

Answered by Bobby Powers

This is an old question, but for posterity I wanted to list ConcurrentLinkedHashMap, which is thread safe, unlike LRUMap. Usage is quite easy:

ConcurrentMap<K, V> cache = new ConcurrentLinkedHashMap.Builder<K, V>()
    .maximumWeightedCapacity(1000)
    .build();

And the documentation has some good examples, like how to make the LRU cache size-based instead of number-of-items based.

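As a rough sketch of the size-based approach (not taken from the documentation, so treat the exact builder calls as assumptions and check the project's docs), a Weigher can charge each entry by the size of its value instead of counting entries:

// Evict by total weight (here: bytes of the cached values) rather than by entry count.
ConcurrentMap<String, byte[]> cache = new ConcurrentLinkedHashMap.Builder<String, byte[]>()
    .maximumWeightedCapacity(1024 * 1024)            // illustrative budget of ~1 MiB of values
    .weigher(new Weigher<byte[]>() {
        public int weightOf(byte[] value) {
            return Math.max(1, value.length);        // weight = value size; keep it positive as a safeguard
        }
    })
    .build();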

Answered by Daimon

I also had the same problem and I haven't found any good libraries... so I've created my own.

simplelrucache provides thread-safe, very simple, non-distributed LRU caching with TTL support. It provides two implementations:

  • Concurrent based on ConcurrentLinkedHashMap
  • Synchronized based on LinkedHashMap

You can find it here.

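For illustration, a minimal sketch of the second flavour listed above (a synchronized LinkedHashMap with per-entry TTL). This is not simplelrucache's actual API; the class and method names below are made up for illustration:

import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative sketch only; not the simplelrucache API.
public class TtlLruCache<K, V> {

    // Holds a value together with its absolute expiry time in milliseconds.
    private static class TtlEntry<V> {
        final V value;
        final long expiresAt;
        TtlEntry(V value, long ttlMillis) {
            this.value = value;
            this.expiresAt = System.currentTimeMillis() + ttlMillis;
        }
    }

    private final Map<K, TtlEntry<V>> map;
    private final long ttlMillis;

    public TtlLruCache(final int capacity, long ttlMillis) {
        this.ttlMillis = ttlMillis;
        // accessOrder = true gives least-recently-accessed eviction order.
        this.map = new LinkedHashMap<K, TtlEntry<V>>(capacity + 1, 1.0f, true) {
            protected boolean removeEldestEntry(Map.Entry<K, TtlEntry<V>> eldest) {
                return size() > capacity;
            }
        };
    }

    public synchronized V get(K key) {
        TtlEntry<V> e = map.get(key);
        if (e == null) {
            return null;
        }
        if (e.expiresAt < System.currentTimeMillis()) {
            map.remove(key);    // expired: drop it and report a miss
            return null;
        }
        return e.value;
    }

    public synchronized void put(K key, V value) {
        map.put(key, new TtlEntry<V>(value, ttlMillis));
    }
}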

Answered by Barak

Here is a very simple and easy-to-use LRU cache in Java. Although it is short and simple, it is production quality. The code is explained (look at the README.md) and has some unit tests.

Answered by botek

Here is my implementation which lets me keep an optimal number of elements in memory.

The point is that I do not need to keep track of what objects are currently being used since I'm using a combination of a LinkedHashMap for the MRU objects and a WeakHashMap for the LRU objects. So the cache capacity is no less than MRU size plus whatever the GC lets me keep. Whenever objects fall off the MRU they go to the LRU for as long as the GC will have them.

import java.util.LinkedHashMap;
import java.util.Map;
import java.util.WeakHashMap;

public class Cache<K, V> {
    // Strongly referenced, access-ordered map holding the most recently used entries.
    final Map<K, V> MRUdata;
    // Weakly referenced overflow map (keys are held weakly), so demoted entries
    // survive only until the GC reclaims them.
    final Map<K, V> LRUdata;

    public Cache(final int capacity) {
        LRUdata = new WeakHashMap<K, V>();

        MRUdata = new LinkedHashMap<K, V>(capacity + 1, 1.0f, true) {
            protected boolean removeEldestEntry(Map.Entry<K, V> entry) {
                if (this.size() > capacity) {
                    // Demote the eldest entry to the weak map instead of discarding it.
                    LRUdata.put(entry.getKey(), entry.getValue());
                    return true;
                }
                return false;
            }
        };
    }

    public synchronized V tryGet(K key) {
        V value = MRUdata.get(key);
        if (value != null)
            return value;
        value = LRUdata.get(key);
        if (value != null) {
            // Promote the entry back into the strongly referenced MRU map.
            LRUdata.remove(key);
            MRUdata.put(key, value);
        }
        return value;
    }

    public synchronized void set(K key, V value) {
        LRUdata.remove(key);
        MRUdata.put(key, value);
    }
}
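
For completeness, a quick usage sketch of the class above; the keys, values, and the loadFromXml helper are hypothetical:

Cache<String, String> cache = new Cache<String, String>(100);
cache.set("page:home|role:admin", "<config>...</config>");

String xml = cache.tryGet("page:home|role:admin");
if (xml == null) {
    // Not in the MRU map and already collected from the weak map: reload it.
    xml = loadFromXml("page:home|role:admin");   // hypothetical loader
    cache.set("page:home|role:admin", xml);
}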