Target Audience: .NET Core developers, software architects, anyone looking to optimize application performance.
Goal: To provide a practical, in-depth guide on implementing caching in .NET Core applications, covering different caching strategies and best practices.
Is your .NET Core application feeling sluggish? Are frequent database calls slowing down your user experience? You're not alone. Performance bottlenecks are a common challenge in modern web applications. Fortunately, a powerful technique can dramatically improve your application's speed and responsiveness: caching.
Caching involves storing frequently accessed data in a fast-access location, like memory, so that subsequent requests for the same data can be served much more quickly without hitting slower resources such as databases or external APIs. In a .NET Core application, implementing caching can lead to faster response times, reduced load on your database and downstream services, and better scalability under heavy traffic.
This comprehensive guide will walk you through the ins and outs of implementing caching in your .NET Core applications. We'll explore different caching strategies, from simple in-memory caching to robust distributed solutions like Redis, along with best practices to ensure your caching implementation is effective and efficient.
At its core, caching operates on a simple principle: store data you've already fetched so you don't have to fetch it again. When your application requests data, it first checks the cache. If the data is found there, that's a cache hit and it can be returned immediately. If it isn't, that's a cache miss: the application retrieves the data from its original source (e.g., a database) and stores it in the cache for future use.
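In code, the principle looks roughly like the sketch below. It uses a plain ConcurrentDictionary as a stand-in for the cache and a hypothetical LoadOrderFromDatabaseAsync method purely to illustrate the hit/miss flow; the rest of this guide uses the real .NET Core caching abstractions instead.

using System.Collections.Concurrent;
using System.Threading.Tasks;

public class OrderLookup
{
    // A plain dictionary standing in for "a fast-access location" (illustration only).
    private readonly ConcurrentDictionary<int, Order> _cache = new();

    public async Task<Order> GetOrderAsync(int id)
    {
        // Cache hit: the data is already in the fast store, return it immediately.
        if (_cache.TryGetValue(id, out Order cached))
            return cached;

        // Cache miss: fall back to the slow source, then remember the result.
        Order order = await LoadOrderFromDatabaseAsync(id); // hypothetical data access call
        _cache[id] = order;
        return order;
    }

    private Task<Order> LoadOrderFromDatabaseAsync(int id) =>
        Task.FromResult(new Order { Id = id }); // placeholder for a real database query
}

public class Order
{
    public int Id { get; set; }
}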
In-memory caching is the simplest form of caching in .NET Core, where data is stored directly within the application's memory. It's fast, easy to set up, and perfect for many scenarios.
In .NET Core, in-memory caching is primarily managed through the IMemoryCache interface, with MemoryCache being its default implementation.
You need to register the in-memory cache service in your Program.cs (or Startup.cs for older .NET Core versions):
using Microsoft.Extensions.Caching.Memory;

public class Program
{
    public static void Main(string[] args)
    {
        var builder = WebApplication.CreateBuilder(args);

        // Add services to the container.
        builder.Services.AddMemoryCache(); // Register IMemoryCache
        builder.Services.AddControllers();

        // ... other services ...

        var app = builder.Build();

        // ... other middleware ...

        app.Run();
    }
}
Once registered, you can inject IMemoryCache into your controllers or services:
using Microsoft.Extensions.Caching.Memory;

public class ProductService
{
    private readonly IMemoryCache _cache;
    private readonly IProductRepository _productRepository; // Assume this exists

    public ProductService(IMemoryCache cache, IProductRepository productRepository)
    {
        _cache = cache;
        _productRepository = productRepository;
    }

    public async Task<List<Product>> GetProductsAsync()
    {
        const string cacheKey = "AllProducts"; // Define a unique cache key

        // Try to get data from cache
        if (!_cache.TryGetValue(cacheKey, out List<Product> products))
        {
            // Cache miss: Data not found in cache, retrieve from source
            products = await _productRepository.GetAllProductsAsync();

            // Set data in cache with expiration
            var cacheEntryOptions = new MemoryCacheEntryOptions()
                .SetAbsoluteExpiration(TimeSpan.FromMinutes(5)); // Data expires after 5 minutes
                // .SetSlidingExpiration(TimeSpan.FromMinutes(2)); // Data expires if not accessed for 2 minutes

            _cache.Set(cacheKey, products, cacheEntryOptions);
        }

        return products;
    }

    public void ClearProductCache()
    {
        // Manually remove an item from the cache
        _cache.Remove("AllProducts");
    }
}
The key pieces of this pattern:
- _cache.TryGetValue(key, out value) checks the cache for the given key and returns true on a hit.
- _cache.Set(key, value, options) stores an entry under the given key.
- SetAbsoluteExpiration() evicts the entry after a fixed period, regardless of how often it is accessed.
- SetSlidingExpiration() evicts the entry if it hasn't been accessed within the given period.
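For simple cases, the GetOrCreateAsync extension method on IMemoryCache collapses the check-fetch-store dance into a single call. Here is a minimal sketch, reusing the _cache and _productRepository fields from the ProductService above:

public async Task<List<Product>> GetProductsAsync()
{
    // GetOrCreateAsync runs the factory only on a cache miss and stores its result.
    return await _cache.GetOrCreateAsync("AllProducts", async entry =>
    {
        entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5); // same 5-minute policy as above
        return await _productRepository.GetAllProductsAsync();
    });
}

This is functionally equivalent to the TryGetValue/Set pair shown earlier, just more compact.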
For scalable applications, especially those deployed across multiple servers or in microservices architectures, distributed caching is essential. Instead of storing data in the application's memory, it's stored externally in a shared, centralized cache store that all application instances can access.
In .NET Core, IDistributedCache is the interface that abstracts distributed caching operations, allowing you to switch between different implementations easily.
You can use a SQL Server database as a distributed cache store. While convenient if you already have SQL Server, it's generally less performant than dedicated cache servers like Redis.
builder.Services.AddDistributedSqlServerCache(options =>
{
    options.ConnectionString = builder.Configuration.GetConnectionString("CacheConnection");
    options.SchemaName = "dbo";
    options.TableName = "DistributedCache";
});
dotnet tool install --global dotnet-sql-cache
dotnet sql-cache create "Data Source=server;Initial Catalog=CacheDB;Integrated Security=True" dbo DistributedCache
Redis is an open-source, in-memory data structure store, used as a database, cache, and message broker. It's incredibly fast and versatile, making it the most popular choice for distributed caching in .NET Core and beyond.
using Microsoft.Extensions.Caching.StackExchangeRedis;

public class Program
{
    public static void Main(string[] args)
    {
        var builder = WebApplication.CreateBuilder(args);

        // Add Redis Distributed Cache
        builder.Services.AddStackExchangeRedisCache(options =>
        {
            options.Configuration = builder.Configuration.GetConnectionString("RedisConnection");
            options.InstanceName = "MyRedisApp:"; // Optional: Prefix for keys to avoid collisions
        });

        // ... other services ...

        var app = builder.Build();

        // ... other middleware ...

        app.Run();
    }
}
using Microsoft.Extensions.Caching.Distributed;
using System.Text.Json; // For JSON serialization/deserialization

public class UserService
{
    private readonly IDistributedCache _cache;
    private readonly IUserRepository _userRepository; // Assume this exists

    public UserService(IDistributedCache cache, IUserRepository userRepository)
    {
        _cache = cache;
        _userRepository = userRepository;
    }

    public async Task<User> GetUserByIdAsync(int userId)
    {
        string cacheKey = $"user:{userId}"; // Create a unique key

        // Try to get data as a string from the distributed cache
        string userJson = await _cache.GetStringAsync(cacheKey);

        if (!string.IsNullOrEmpty(userJson))
        {
            // Cache hit: Deserialize and return
            return JsonSerializer.Deserialize<User>(userJson);
        }

        // Cache miss: Retrieve from database
        User user = await _userRepository.GetUserAsync(userId);

        if (user != null)
        {
            // Serialize and set data in distributed cache with options
            DistributedCacheEntryOptions options = new DistributedCacheEntryOptions()
                .SetAbsoluteExpiration(TimeSpan.FromMinutes(10)) // Absolute expiration after 10 minutes
                .SetSlidingExpiration(TimeSpan.FromMinutes(3));  // Or expire if idle for 3 minutes

            await _cache.SetStringAsync(cacheKey, JsonSerializer.Serialize(user), options);
        }

        return user;
    }

    public async Task UpdateUserAsync(User user)
    {
        // Update user in DB
        await _userRepository.UpdateUserAsync(user);

        // Invalidate cache for this user
        await _cache.RemoveAsync($"user:{user.Id}");
    }
}

public class User // Example User class
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Email { get; set; }
}
When working with IDistributedCache, you often deal with string or byte array data. This means you'll need to serialize your objects (e.g., using System.Text.Json or Newtonsoft.Json) before storing them and deserialize them upon retrieval.
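Since this serialize/deserialize boilerplate repeats quickly, many teams wrap it in a small extension. The sketch below is one way to do that using System.Text.Json; the DistributedCacheJsonExtensions class and its method names are our own, not part of the framework.

using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Distributed;
using System.Text.Json;

public static class DistributedCacheJsonExtensions
{
    // Store any serializable object as a JSON string under the given key.
    public static Task SetObjectAsync<T>(this IDistributedCache cache, string key, T value,
        DistributedCacheEntryOptions options)
    {
        string json = JsonSerializer.Serialize(value);
        return cache.SetStringAsync(key, json, options);
    }

    // Read an entry back and deserialize it, returning default on a cache miss.
    public static async Task<T?> GetObjectAsync<T>(this IDistributedCache cache, string key)
    {
        string? json = await cache.GetStringAsync(key);
        return json is null ? default : JsonSerializer.Deserialize<T>(json);
    }
}

With a helper like this, GetUserByIdAsync could call _cache.GetObjectAsync<User>(cacheKey) and _cache.SetObjectAsync(cacheKey, user, options) instead of handling JSON inline.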
While Redis is dominant, other options include NCache (commercial, feature-rich), Apache Geode, and more specialized solutions depending on your infrastructure.
Implementing caching effectively goes beyond just adding lines of code. It requires strategic thinking about how and what you cache.
Cache-Aside (also called Lazy Loading) is the most common and recommended caching strategy, and it is the pattern used in the examples above: the application checks the cache first, loads from the data source on a miss, and writes the result back into the cache. The application is responsible for managing both the cache and the data source.
Other strategies such as Write-Through (write to the cache and then immediately to the database) and Write-Behind (write to the cache and then asynchronously to the database) exist, but they are less common for general-purpose application caching and are often handled by the cache provider itself.
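For completeness, here is a rough sketch of what a write-through update could look like at the application level, reusing the _userRepository and _cache fields from the UserService example above; treat it as an illustration of the idea rather than a recommended default.

public async Task SaveUserAsync(User user)
{
    // Write-through: persist to the source of truth and refresh the cache in the same
    // operation, so the next read is a guaranteed hit with fresh data.
    await _userRepository.UpdateUserAsync(user);

    var options = new DistributedCacheEntryOptions()
        .SetAbsoluteExpiration(TimeSpan.FromMinutes(10));

    await _cache.SetStringAsync($"user:{user.Id}", JsonSerializer.Serialize(user), options);
}

The trade-off is slower writes in exchange for reads that never see stale data for that key.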
Ensuring cached data is fresh is critical to prevent serving stale information. Beyond letting entries expire, invalidate them explicitly whenever the underlying data changes:
- Remove() deletes an entry from IMemoryCache by its key, as ClearProductCache does above.
- RemoveAsync() does the same for IDistributedCache, as shown in UpdateUserAsync.
- A consistent key-naming convention, such as "product:123", makes it easy to target exactly the entries affected by a change (see the sketch below).
While caching offers significant benefits, it also introduces complexity. Being aware of common pitfalls can save you a lot of headaches.
As you become more comfortable with basic caching, it's worth exploring more advanced concepts and patterns.
Caching is an indispensable technique for building high-performance, scalable .NET Core applications. Whether you opt for the simplicity of in-memory caching or the robustness of a distributed solution like Redis, understanding the fundamentals and applying best practices will significantly enhance your application's responsiveness and efficiency.
While caching introduces complexity, the performance benefits often far outweigh the challenges. Start by identifying hot spots in your application where data is frequently accessed but rarely changes, and then strategically implement the appropriate caching solution.
Ready to supercharge your .NET Core apps? Start experimenting with caching today!