LRU cache implementation in Java
In this blog post, we will explore LRU (Least Recently Used) caching and demonstrate how to implement it in Java using a combination of a HashMap and a LinkedList.
Understanding LRU Caching
LRU caching is a strategy that removes the least recently used items first when the cache reaches its capacity. This is particularly useful in scenarios where you need to keep frequently accessed items in memory, saving time and resources. LRU caches are employed in various domains, including database management systems and web applications.
Data Structures for Implementation
To build an LRU cache, we will use two essential data structures: a HashMap and a LinkedList. The HashMap will provide us with fast key-value lookups, while the LinkedList will help us maintain the order of recently used items.
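As an aside, the JDK already provides most of this machinery out of the box: java.util.LinkedHashMap can keep its entries in access order and evict the eldest one automatically. Here is a minimal sketch of that approach (the class name LruMap is our own, purely for illustration):

import java.util.LinkedHashMap;
import java.util.Map;

// A minimal LRU cache built on LinkedHashMap's access-order mode.
public class LruMap<K, V> extends LinkedHashMap<K, V> {

    private final int capacity;

    public LruMap(int capacity) {
        // accessOrder = true reorders entries on every access (get/put).
        super(capacity, 0.75f, true);
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Returning true tells LinkedHashMap to evict the least recently used entry.
        return size() > capacity;
    }
}

Rolling the cache by hand, as we do below, makes the underlying mechanics explicit, which is the point of this post.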
Designing the LRU Cache
In our Java implementation, we'll create an LRU cache class with the following attributes:
- A HashMap to store key-value pairs.
- A doubly linked list to keep track of usage order.
- A constructor to initialize the cache size.
When the cache is full, we need to remove the least recently used item. The linked list makes this easy: since every access moves an entry to the front, the least recently used entry is always at the tail, and eviction is simply a removal from the end.
Implementing an LRU Cache in Java
We'll add two fundamental cache operations:
- get(key): Retrieves an item from the cache and moves it to the front of the LRU order.
- put(key, value): Adds an item to the cache, evicting the least recently used entry when capacity is reached and updating the LRU order.
Here is the complete implementation:
import java.util.Objects;

public class CacheData<K, V> {

    public K key;
    public V value;

    public CacheData(K key, V value) {
        this.key = key;
        this.value = value;
    }

    @Override
    public String toString() {
        return "{ key: " + key + ", value: " + value + " }";
    }

    @Override
    public boolean equals(Object o) {
        if (o == this)
            return true;
        if (!(o instanceof CacheData<?, ?>))
            return false;
        CacheData<?, ?> other = (CacheData<?, ?>) o;
        // Use Objects.equals so null keys and values are handled safely,
        // and compare values by equals() rather than reference identity.
        return Objects.equals(this.key, other.key)
                && Objects.equals(this.value, other.value);
    }

    @Override
    public int hashCode() {
        // Keep hashCode consistent with equals.
        return Objects.hash(key, value);
    }
}
import java.util.HashMap;
import java.util.LinkedList;
import java.util.Map;

public class LRUCache<K, V> {

    // Head of the list holds the most recently used entry; the tail the least.
    private final LinkedList<CacheData<K, V>> order = new LinkedList<>();
    private final Map<K, V> cache = new HashMap<>();
    private final int capacity;

    public LRUCache() {
        this(5); // default capacity
    }

    public LRUCache(int capacity) {
        this.capacity = capacity;
    }

    public void put(K key, V value) {
        if (cache.containsKey(key)) {
            // Key already cached: drop its old position so it can move to the front.
            order.remove(new CacheData<>(key, cache.get(key)));
        } else if (order.size() == capacity) {
            // Cache full: evict the least recently used entry from the tail.
            CacheData<K, V> dataRemoved = order.removeLast();
            cache.remove(dataRemoved.key);
            System.out.println("LRU data removed from cache: " + dataRemoved);
        }
        cache.put(key, value);
        order.addFirst(new CacheData<>(key, value));
    }

    public V get(K key) {
        V value = cache.get(key);
        if (value != null) {
            // Cache hit: move the entry to the front to mark it most recently used.
            CacheData<K, V> cacheData = new CacheData<>(key, value);
            order.remove(cacheData);
            order.addFirst(cacheData);
        }
        return value;
    }

    public void showCacheOrder() {
        System.out.println(order);
    }
}
In the put(K key, V value) method, we first check whether the key is already present; if so, we remove its old entry from the order list so it can be re-inserted at the front. Otherwise, if the list has reached capacity, we evict the least recently used entry from both the list and the map. Either way, the new entry is then stored in the map and added to the front of the list.
In the get(K key) method, if there is a cache hit, we remove the entry from the list and re-add it at the start, so the usage order of items stays correct.
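A quick way to verify the behaviour is a small driver. The sketch below (the class name LRUCacheDemo is ours, just for illustration) uses a capacity of 2 to force an eviction early:

public class LRUCacheDemo {
    public static void main(String[] args) {
        LRUCache<Integer, String> cache = new LRUCache<>(2);

        cache.put(1, "one");
        cache.put(2, "two");
        cache.get(1);          // touch key 1, so key 2 becomes the LRU entry
        cache.put(3, "three"); // evicts key 2 and prints the eviction message

        System.out.println(cache.get(2)); // null - evicted
        System.out.println(cache.get(1)); // one - still cached
        cache.showCacheOrder();
    }
}

Note that order.remove(...) performs a linear scan, so get and put are O(n) in this implementation. Constant-time LRU caches link each map entry directly to its list node, which is essentially what java.util.LinkedHashMap does internally.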
That is all about how to implement an LRU cache in Java.