Caching is one of the most important performance boosters in modern .NET applications.
Done right, it reduces database load, improves response times, cuts costs, and helps applications scale.
In this post, we’ll cover popular caching strategies in .NET with detailed examples and comments.
1️⃣ In-Memory Caching (IMemoryCache)
Best for: Single-server apps, small data, per-instance cache.
Storage: Inside the app process memory.
Downside: Cache is lost if app restarts, and not shared across multiple servers.
Example
// Program.cs
builder.Services.AddMemoryCache(options =>
{
    options.SizeLimit = 1024; // optional global limit (in arbitrary "size" units)
});

public class WeatherService
{
    private readonly IMemoryCache _cache;
    private readonly IHttpClientFactory _http;

    public WeatherService(IMemoryCache cache, IHttpClientFactory http)
    {
        _cache = cache;
        _http = http;
    }

    public async Task<WeatherDto?> GetWeatherAsync(string city)
    {
        var key = $"weather:{city}";
        return await _cache.GetOrCreateAsync(key, async entry =>
        {
            entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10);
            entry.SlidingExpiration = TimeSpan.FromMinutes(2);
            entry.SetSize(1); // required when SizeLimit is set, otherwise the entry is rejected
            entry.RegisterPostEvictionCallback((k, v, reason, state) =>
            {
                Console.WriteLine($"Evicted {k} due to {reason}");
            });

            var client = _http.CreateClient("weather");
            return await client.GetFromJsonAsync<WeatherDto>($"https://api.example.com/{city}");
        });
    }
}
✅ Fastest caching strategy
❌ Not suitable for multi-server environments
2️⃣ Distributed Caching (IDistributedCache)
Best for: Multi-server apps, cloud-native apps, microservices.
Popular providers: Redis (recommended), SQL Server, NCache.
Configure Redis
// NuGet: Microsoft.Extensions.Caching.StackExchangeRedis
builder.Services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = "localhost:6379";
    options.InstanceName = "MyApp_"; // key prefix to avoid collisions
});
Example: Cache-Aside with JSON Serialization
public class ProductService
{
    private readonly IDistributedCache _cache;
    private readonly IProductRepository _repo;

    public ProductService(IDistributedCache cache, IProductRepository repo)
    {
        _cache = cache;
        _repo = repo;
    }

    public async Task<Product?> GetProductAsync(int id)
    {
        var key = $"product:{id}";

        var cachedBytes = await _cache.GetAsync(key);
        if (cachedBytes != null)
        {
            return JsonSerializer.Deserialize<Product>(cachedBytes);
        }

        var product = await _repo.GetByIdAsync(id);
        if (product == null) return null;

        var bytes = JsonSerializer.SerializeToUtf8Bytes(product);
        await _cache.SetAsync(key, bytes, new DistributedCacheEntryOptions
        {
            AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(30)
        });

        return product;
    }
}
✅ Works across multiple servers
❌ Requires external dependency (Redis/SQL)
3️⃣ Cache-Aside Pattern (Lazy Loading)
How it works:
- App checks cache.
- If miss → fetch from DB → store in cache.
- Serve response.
Risk: Thundering herd — when a popular key expires, many concurrent requests miss at once and all hit the database.
Fix: Serialize the rebuild with per-key locking (or a distributed lock across servers).
private static readonly ConcurrentDictionary<string, SemaphoreSlim> _locks = new();

public async Task<T> GetOrCreateAsync<T>(string key, Func<Task<T>> factory, TimeSpan ttl)
{
    var cached = await _cache.GetAsync(key);
    if (cached != null) return JsonSerializer.Deserialize<T>(cached)!;

    // One semaphore per key; note these are never removed, so prefer a
    // bounded key space (or a distributed lock) in long-running apps.
    var sem = _locks.GetOrAdd(key, _ => new SemaphoreSlim(1, 1));
    await sem.WaitAsync();
    try
    {
        // Double-check: another request may have populated the cache while we waited.
        cached = await _cache.GetAsync(key);
        if (cached != null) return JsonSerializer.Deserialize<T>(cached)!;

        var value = await factory();
        await _cache.SetAsync(key, JsonSerializer.SerializeToUtf8Bytes(value),
            new DistributedCacheEntryOptions { AbsoluteExpirationRelativeToNow = ttl });
        return value;
    }
    finally
    {
        sem.Release();
    }
}
4️⃣ Write-Through & Write-Behind Caching
- Write-Through: Writes go to DB and cache immediately.
- Write-Behind (Write-Back): Writes go to cache → persisted asynchronously later.
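Write-Through has no dedicated .NET abstraction, but it falls out naturally from the same building blocks used above. A minimal sketch (assuming the same IDistributedCache and IProductRepository as in the earlier examples; ProductWriteThroughService is a hypothetical name):

```csharp
public class ProductWriteThroughService
{
    private readonly IDistributedCache _cache;
    private readonly IProductRepository _repo;

    public ProductWriteThroughService(IDistributedCache cache, IProductRepository repo)
    {
        _cache = cache;
        _repo = repo;
    }

    public async Task SaveProductAsync(Product product)
    {
        // 1. Persist to the database first (it remains the source of truth).
        await _repo.SaveAsync(product);

        // 2. Update the cache synchronously so subsequent reads see fresh data.
        var bytes = JsonSerializer.SerializeToUtf8Bytes(product);
        await _cache.SetAsync($"product:{product.Id}", bytes, new DistributedCacheEntryOptions
        {
            AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(30)
        });
    }
}
```

The write is slower than write-behind because both stores are updated on the request path, but nothing is lost if the cache crashes.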
Example: Write-Behind (simplified)
public class CacheWriteBehindService : BackgroundService
{
    private readonly Channel<Product> _channel = Channel.CreateUnbounded<Product>();
    private readonly IProductRepository _db;

    public CacheWriteBehindService(IProductRepository db) => _db = db;

    public async Task EnqueueAsync(Product p) => await _channel.Writer.WriteAsync(p);

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        await foreach (var product in _channel.Reader.ReadAllAsync(stoppingToken))
        {
            await _db.SaveAsync(product); // async DB write
        }
    }
}
✅ High performance writes
❌ Risk of losing data if cache crashes before DB save
5️⃣ Output Caching (Whole Response)
ASP.NET Core 7+ has Output Caching Middleware.
// Program.cs
builder.Services.AddOutputCache();
app.UseOutputCache();

// Minimal API
app.MapGet("/time", () => DateTime.UtcNow).CacheOutput();

// Controller
[ApiController]
[Route("api/[controller]")] // [ApiController] requires attribute routing
public class InfoController : ControllerBase
{
    [HttpGet]
    [OutputCache]
    public IActionResult Get() => Ok(new { Now = DateTime.UtcNow });
}
With Redis-backed output cache, multiple servers can share cached responses.
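A minimal configuration sketch for that, assuming the Microsoft.AspNetCore.OutputCaching.StackExchangeRedis package (available from .NET 8):

```csharp
// NuGet: Microsoft.AspNetCore.OutputCaching.StackExchangeRedis (.NET 8+)
// Swaps the default in-memory output-cache store for Redis,
// so all instances behind a load balancer serve the same cached responses.
builder.Services.AddStackExchangeRedisOutputCache(options =>
{
    options.Configuration = "localhost:6379";
    options.InstanceName = "MyApp_";
});
```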
✅ Best for caching whole API responses
❌ Not flexible for per-user personalized data
6️⃣ Response Caching (via Headers)
Use Cache-Control, ETag, and Last-Modified headers.
[HttpGet("{id}")]
public IActionResult GetResource(int id)
{
    var resource = _repo.Get(id);
    var etag = $"W/\"{resource.Version}\"";
    Response.Headers["ETag"] = etag;

    if (Request.Headers["If-None-Match"] == etag)
        return StatusCode(304); // Not Modified: client already has this version
    return Ok(resource);
}
✅ Best Practices
- Always set TTL (avoid infinite caches).
- Use versioned keys, e.g. product:v2:{id}.
- Monitor hit/miss ratio in production.
- Use Redis for distributed cache in real-world apps.
- Prefer System.Text.Json or MessagePack for compact serialization.
- Don’t cache sensitive user data unless encrypted.
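Versioned keys make invalidation cheap: bump the version constant and old entries are simply never read again, aging out via their TTL. A hypothetical key-builder sketch:

```csharp
public static class CacheKeys
{
    // Bump this when the Product schema or serialization format changes;
    // stale "v1" entries are never read again and expire via TTL.
    private const int ProductVersion = 2;

    public static string Product(int id) => $"product:v{ProductVersion}:{id}";
}
```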
🚀 Wrapping Up
We’ve explored:
✔️ In-Memory Cache
✔️ Distributed Cache (Redis, SQL, NCache)
✔️ Cache-Aside pattern
✔️ Write-Through / Write-Behind
✔️ Output & Response caching
✔️ Best practices
Caching is all about balancing speed, consistency, and memory usage.
Pick the right strategy depending on your app’s scale and requirements.