A cache simply means keeping data in memory, addressable by some key, so that it can be loaded without hitting the backing store each time. If the application runs in clustered mode, one available implementation is Oracle Coherence, which can isolate interaction with the DB and provide failover and synchronization between the caches of multiple application instances.
Here is an example of how to build an LRU cache in Java using the java.util.concurrent package. An implementation built on the old synchronized idiom would be fragile and blocking. The cache data is backed by a ConcurrentHashMap, so reads are never blocked, and writes lock only the affected bucket rather than the whole map. Note one caveat: because get() does not reorder the eviction queue, entries are actually evicted in insertion (FIFO) order rather than strict least-recently-used order; only put() on an existing key refreshes its position.
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentLinkedQueue;

public class ConcurrentLRUCache<Key, Value> {

    private final int maxSize;
    private final ConcurrentHashMap<Key, Value> map;
    private final ConcurrentLinkedQueue<Key> queue;

    public ConcurrentLRUCache(final int maxSize) {
        this.maxSize = maxSize;
        map = new ConcurrentHashMap<Key, Value>(maxSize);
        queue = new ConcurrentLinkedQueue<Key>();
    }

    /**
     * @param key   - may not be null!
     * @param value - may not be null!
     */
    public void put(final Key key, final Value value) {
        if (map.containsKey(key)) {
            queue.remove(key); // re-adding below refreshes the key's position in the queue
        }
        while (queue.size() >= maxSize) {
            Key oldestKey = queue.poll(); // evict in queue (insertion) order
            if (null != oldestKey) {
                map.remove(oldestKey);
            }
        }
        queue.add(key);
        map.put(key, value);
    }

    /**
     * @param key - may not be null!
     * @return the value associated with the given key, or null
     */
    public Value get(final Key key) {
        return map.get(key); // lock-free read; does not update recency
    }
}
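For comparison, a cache with true least-recently-used eviction can be sketched on top of the standard library's LinkedHashMap with accessOrder=true, which reorders entries on every get(). This is a minimal sketch, not a drop-in replacement for the class above: it trades the lock-free reads of ConcurrentHashMap for a fully synchronized map, and the LruMapDemo class name and lruMap helper are illustrative, not part of any library.

```java
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

public class LruMapDemo {

    // LinkedHashMap with accessOrder=true moves an entry to the tail on each
    // get(), so the head of the iteration order is the least recently used
    // entry; removeEldestEntry() evicts it once the size limit is exceeded.
    static <K, V> Map<K, V> lruMap(final int maxSize) {
        return Collections.synchronizedMap(new LinkedHashMap<K, V>(maxSize, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                return size() > maxSize;
            }
        });
    }

    public static void main(String[] args) {
        Map<String, Integer> cache = lruMap(2);
        cache.put("a", 1);
        cache.put("b", 2);
        cache.get("a");    // touch "a" so it becomes the most recently used entry
        cache.put("c", 3); // evicts "b", now the least recently used entry
        System.out.println(cache.containsKey("b")); // false
        System.out.println(cache.containsKey("a")); // true
    }
}
```

Unlike the FIFO-ordered queue in the ConcurrentLRUCache above, this version updates recency on reads, but every operation takes the single map lock.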