Caching strategy
"There are only two hard things in Computer Science: cache invalidation and naming things."
— Phil Karlton
A cache is a component that stores data temporarily so that future requests for that data can be served faster. This temporary storage shortens data access times, reduces latency, and improves I/O. We can improve overall performance by using different types of caches in our microservice architecture. Let's take a look at this subject.
General caching strategy
To maintain the cache, we rely on algorithms that tell us how entries should be kept and which should be evicted. The most common algorithms are as follows:
- Least Frequently Used (LFU): This strategy uses a counter to keep track of how often an entry is accessed and, when the cache is full, evicts the entry with the lowest count first.
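The LFU policy above can be sketched in a few lines: keep a usage counter per key and evict the key with the smallest count when capacity is reached. This is a minimal Python sketch for illustration; the class name and `capacity` parameter are assumptions, and a production LFU would use a more efficient eviction structure than a linear scan.

```python
from collections import defaultdict

class LFUCache:
    """Minimal LFU sketch: evict the least frequently used entry when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = {}
        self.counts = defaultdict(int)  # per-key access counter

    def get(self, key):
        if key not in self.data:
            return None
        self.counts[key] += 1  # track how often the entry is used
        return self.data[key]

    def put(self, key, value):
        if key not in self.data and len(self.data) >= self.capacity:
            # evict the entry with the lowest access count
            victim = min(self.data, key=lambda k: self.counts[k])
            del self.data[victim]
            del self.counts[victim]
        self.data[key] = value
        self.counts[key] += 1
```

With a capacity of two, inserting a third key evicts whichever existing entry has been touched least often.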