What’s a Cache?

According to Wikipedia,

A cache is a hardware or software component that stores data so future requests for that data can be served faster; the data stored in a cache might be the result of an earlier computation, or the duplicate of data stored elsewhere.

In short, a cache is a technique for storing commonly used resources so that future requests for the same resource can be served immediately. These resources are mostly stored in memory. Caching can significantly improve a system’s performance at the cost of a small increase in storage.

Instead of talking about caches in general, I’ll talk about a specific type of software cache here: the LRU cache.

What’s an LRU Cache?

One of the most common cache systems is the LRU (Least Recently Used) cache. It is a cache whose eviction policy is to evict the least recently used entry. In other words, when an LRU cache is low on memory, it discards the entry that was accessed least recently.

The way an LRU cache works is quite simple. When a user requests a resource R (a minimal sketch of this flow follows the list):

  • If R exists in the cache, it is returned immediately
  • If not and the cache has room for another entry, R is fetched, inserted into the cache, and returned to the client
  • If the cache has no space left, the least recently used entry is determined and removed from the cache, and then R is inserted
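
Here is a minimal sketch of that flow in Java. Note that fetchFromSource is a hypothetical loader standing in for the backing data source, and LRUCache is the class we implement below:

    // Sketch of the request flow for a resource R identified by key.
    // fetchFromSource is a hypothetical loader, not part of the cache itself.
    static <K, V> V getOrLoad(LRUCache<K, V> cache, K key) {
        V cached = cache.get(key);          // R exists in the cache: return it immediately
        if (cached != null)
            return cached;
        V resource = fetchFromSource(key);  // cache miss: fetch R from its source
        cache.put(key, resource);           // put() evicts the LRU entry if the cache is full
        return resource;
    }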

LRU Cache constraints

An LRU Cache meets the following constraints:

  1. Bounded Size: It has a fixed size, as defined by the system or the user
  2. Fast Access: Lookup and insert operations should run in O(1) time
  3. Eviction: When the cache is full (i.e., the size limit is reached), it should evict the least recently used entry

Designing an LRU Cache

An LRU cache should support fast lookup. To achieve fast lookup, we need a hash table (e.g., Java’s HashMap).
An LRU cache also requires that insert and delete operations run in O(1) time. The natural choice for this requirement is a linked list, which supports insert/delete in O(1) time as long as we have a reference to the element.

Building an LRU cache requires thinking in terms of both of these data structures; in practice, the two work together to realize the design.

So we settle on these data structures: a HashMap and a doubly linked list.

The HashMap holds the keys, mapping each key to a reference to its node (entry) in the doubly linked list.

The doubly linked list stores the nodes (entries), with the most recently used node at the head of the list. As nodes are added and accessed, less recently used nodes drift toward the end of the list, so the node at the tail is always the least recently used.
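
As an aside, the JDK’s LinkedHashMap already combines a hash table with a linked list, so a minimal LRU cache can be sketched on top of it. This sketch is shown only for comparison; it is not the hand-rolled implementation below:

    import java.util.LinkedHashMap;
    import java.util.Map;

    class LinkedHashLRU<K, V> extends LinkedHashMap<K, V> {
        private final int capacity;

        LinkedHashLRU(int capacity) {
            super(capacity, 0.75f, true); // accessOrder = true keeps entries in LRU order
            this.capacity = capacity;
        }

        @Override
        protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
            return size() > capacity;     // evict the least recently used entry when full
        }
    }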

Cache Implementation

import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

public class LRUCache<K, V> {

    private final int capacity;
    private final Map<K, Entry<K, V>> cache;

    private Entry<K, V> head, tail;

    public LRUCache() {
        this(16); // default cache capacity of 16
    }

    public LRUCache(int capacity) {
        this.capacity = capacity;
        cache = new HashMap<>(capacity);
    }

    public V get(K key) {
        if (cache.containsKey(key)) {
            Entry<K, V> entry = cache.get(key);
            // unlink the accessed entry from the linked list
            remove(entry);
            // and move it to the top (most recently used position)
            splayOnTop(entry);
            return entry.value;
        }
        return null;
    }

    public void put(K key, V value) {
        if (cache.containsKey(key)) {
            Entry<K, V> entry = cache.get(key);
            entry.value = value;
            remove(entry);
            splayOnTop(entry);
        } else {
            Entry<K, V> entry = new Entry<>();
            entry.key = key;
            entry.value = value;
            // cache size limit reached: evict the least recently used entry (the tail)
            if (cache.size() == capacity) {
                cache.remove(tail.key);
                remove(tail);
            }
            // place the new entry at the top
            splayOnTop(entry);
            cache.put(key, entry);
        }
    }

    public Set<Entry<K, V>> entrySet() {
        return new HashSet<>(cache.values());
    }

    public Set<K> keySet() {
        return cache.keySet();
    }

    @Override
    public String toString() {
        StringBuilder sb = new StringBuilder("{");
        String delimiter = "";
        for (Entry<K, V> entry : entrySet()) {
            sb.append(delimiter).append(entry.key).append("=").append(entry.value);
            delimiter = ", ";
        }
        sb.append("}");
        return sb.toString();
    }

    private void splayOnTop(Entry<K, V> entry) {
        entry.prev = null;      // clear any stale link left over from remove()
        entry.next = head;
        if (head != null)       // when the list is not empty
            head.prev = entry;
        head = entry;
        if (tail == null)       // when this is the first entry
            tail = head;
    }

    private void remove(Entry<K, V> entry) {
        if (entry.prev != null) {
            entry.prev.next = entry.next;
        } else {
            head = entry.next;
        }
        if (entry.next != null) {
            entry.next.prev = entry.prev;
        } else {
            tail = entry.prev;
        }
    }

    private static class Entry<K, V> {
        K key;
        V value;
        Entry<K, V> prev;
        Entry<K, V> next;
    }
}

Testing our LRU Cache

Let’s test the LRU cache we have just implemented.
We’ll start with a small cache size for testing purposes; I’m using a cache size of 4. First, we add a few entries to the cache and check its contents. Once the cache reaches its size limit and we add a new entry, you can see the least recently used entry being evicted.

public static void main(String[] args) {
    LRUCache<Integer, Integer> lruCache = new LRUCache<>(4);
    lruCache.put(1, 11);
    lruCache.put(2, 12);
    lruCache.put(3, 13);
    lruCache.get(2);
    lruCache.put(4, 14);
    lruCache.get(3);

    System.out.println("Cache contents before eviction");
    System.out.println(lruCache);

    lruCache.put(5, 15);

    System.out.println("Cache contents after eviction");
    System.out.println(lruCache);
}

And the output of the run is:

Console Output
Cache contents before eviction
{2=12, 1=11, 4=14, 3=13}
Cache contents after eviction
{2=12, 4=14, 5=15, 3=13}
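
To see why key 1 was evicted, trace the recency order of the keys (most recently used first) after each operation:

    put(1, 11) -> [1]
    put(2, 12) -> [2, 1]
    put(3, 13) -> [3, 2, 1]
    get(2)     -> [2, 3, 1]
    put(4, 14) -> [4, 2, 3, 1]
    get(3)     -> [3, 4, 2, 1]
    put(5, 15) -> [5, 3, 4, 2]   // key 1 was at the tail, so it was evicted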

Check out the complete source code at LRU Cache.