8 Caching techniques
This chapter covers
- Caching overview
- HTTP response caching (client-side, intermediate, and server-side)
- In-memory caching
- Distributed caching (using SQL Server or Redis)
In information technology, the term cache describes a hardware component or software mechanism that stores data so that future requests for that data can be served faster and, most important, without having to be retrieved from scratch. Good caching practices often result in performance benefits such as lower latency, less CPU overhead, reduced bandwidth use, and decreased costs.
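Although the details come later in the chapter, a minimal sketch of the in-memory flavor can help make the concept concrete. The following example, which assumes a hypothetical CachedProductService class with a LoadProductNameFromDatabase placeholder standing in for the expensive lookup, uses ASP.NET Core's IMemoryCache to serve repeated requests without retrieving the data from scratch:

```csharp
using System;
using Microsoft.Extensions.Caching.Memory;

// Hypothetical service used only to illustrate the basic cache-aside pattern:
// check the cache first, fall back to the data source on a miss, and store
// the result so that subsequent requests can be served from memory.
public class CachedProductService
{
    private readonly IMemoryCache _cache;

    public CachedProductService(IMemoryCache cache) => _cache = cache;

    public string GetProductName(int id)
    {
        var cacheKey = $"product-name-{id}";

        // Cache hit: serve the value without touching the data source.
        if (_cache.TryGetValue(cacheKey, out string? name) && name is not null)
            return name;

        // Cache miss: retrieve the data "from scratch" (for example, a database
        // query), then keep it around for a limited time.
        name = LoadProductNameFromDatabase(id);
        _cache.Set(cacheKey, name, TimeSpan.FromMinutes(5));
        return name;
    }

    // Placeholder for the expensive lookup; a real service would query a database.
    private static string LoadProductNameFromDatabase(int id) => $"Product #{id}";
}
```

In an ASP.NET Core app, the cache is registered with builder.Services.AddMemoryCache(), after which IMemoryCache can be injected wherever it's needed; the same check-then-store idea also underpins the HTTP response caching and distributed caching (SQL Server, Redis) scenarios listed above.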
Based on this definition, we can see that adopting and implementing a caching strategy can bring many invaluable optimization advantages. These advantages are ...