Cache replacement refers to the process that takes place when the cache becomes full and old objects must be removed to make space for new ones. Usually, a cache assigns some kind of value to each object and removes the least valuable ones. The actual meaning of “valuable” may vary from one cache to another. Typically, an object’s value is related to the probability that it will be requested again, thus attempting to maximize the hit ratio. Caching researchers and developers have proposed and evaluated numerous replacement algorithms, some of which are described here.
Least Recently Used (LRU)
LRU is certainly the most popular replacement algorithm used by web caches. The algorithm is simple to understand and implement, and it gives very good performance in almost all situations. As the name implies, LRU removes the objects that have not been accessed for the longest time. The algorithm can be implemented with a simple list: every time an object is accessed, it is moved to the top of the list, so the least recently used objects automatically migrate to the bottom. In practice, the list is usually a doubly linked list paired with a hash table, so that both lookup and reordering take constant time.
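The list-based scheme described above can be sketched as follows. This is a minimal illustration, not production cache code; the class and method names (`LRUCache`, `get`, `put`) are my own, and Python's `OrderedDict` plays the role of the list, with the most recently used entries at one end and the least recently used at the other.

```python
from collections import OrderedDict


class LRUCache:
    """Minimal sketch of list-based LRU replacement."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = OrderedDict()  # key -> cached object

    def get(self, key):
        if key not in self.entries:
            return None  # cache miss
        # Move the accessed object to the "top" of the list.
        self.entries.move_to_end(key)
        return self.entries[key]

    def put(self, key, obj):
        if key in self.entries:
            self.entries.move_to_end(key)
        self.entries[key] = obj
        # When the cache is full, evict from the "bottom":
        # the least recently used entry.
        if len(self.entries) > self.capacity:
            self.entries.popitem(last=False)
```

For example, with a capacity of two, inserting `a` and `b`, touching `a`, and then inserting `c` evicts `b`, since `a` was referenced more recently.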
A strict interpretation of LRU would consider time since last reference as the only parameter. In practice, web caches almost always use a variant known as LRU-Threshold, where “threshold” refers to object size. Objects larger than the threshold size are simply not cached. This prevents one very large object from ejecting many smaller ones. This highlights the biggest problem with LRU: ...