Cache Stampede
💡 Concept Name
Cache Stampede occurs when many requests simultaneously try to recompute and refresh an expired cache entry, potentially overwhelming backend resources.
🧠 Analogy
Imagine a store where the manager is absent and hundreds of customers rush in at opening time, all demanding the same product, causing chaos. A cache stampede overwhelms a system in the same way: the moment a popular cache entry expires, every waiting request rushes to the backend at once.
🔧 Technical Explanation
- 🔧 Occurs when multiple threads detect an expired cache entry and try to regenerate the same data simultaneously (illustrated in the sketch below).
- ⚠️ Leads to heavy load on backend systems such as databases or APIs.
- 📉 Causes severe performance degradation under high concurrency.
- 🔒 Mitigation techniques include locking, single-flight request handling, and cache prewarming.
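To make the failure mode concrete, here is a minimal sketch of the stampede itself, assuming a shared IMemoryCache field `_cache` and a `GetFromDb` helper like the ones in the code example further down (the `_dbCalls` counter is purely illustrative): without any protection, every concurrent request that misses the expired key calls the database.

// Unprotected cache-aside under concurrency: 100 requests miss the same
// expired key at once, so (almost) all of them hit the database.
private static int _dbCalls;

public async Task DemonstrateStampedeAsync()
{
    var tasks = Enumerable.Range(0, 100).Select(_ => Task.Run(() =>
    {
        if (!_cache.TryGetValue("product_42", out string value))
        {
            Interlocked.Increment(ref _dbCalls);   // each miss is one DB call
            value = GetFromDb(42);
            _cache.Set("product_42", value, TimeSpan.FromMinutes(5));
        }
        return value;
    }));

    await Task.WhenAll(tasks);
    Console.WriteLine($"DB calls: {_dbCalls}");    // typically far more than 1
}

With the double-checked locking shown in the Code Example section below, the same experiment would typically report a single database call.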
🛡️ Prevention Strategies
- 🔒 Use locking or memoization so that only one thread recomputes the cache entry.
- ⏳ Implement staggered expiry (TTL jitter) to avoid simultaneous expiration; see the sketch after this list.
- 🔄 Prewarm caches by refreshing data proactively before it expires.
- 📡 Employ background refresh mechanisms to update the cache asynchronously.
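A minimal sketch of the staggered-expiry idea, assuming the same hypothetical `_cache` field used in the code example below (the 60-second jitter range is an illustrative choice): adding a random offset to each entry's TTL spreads out expirations so entries written together do not all expire, and get recomputed, in the same instant.

// Staggered expiry: add random jitter to the TTL so entries cached at the
// same time do not all expire (and trigger recomputation) simultaneously.
public void SetProductWithJitter(int id, string value)
{
    var baseTtl = TimeSpan.FromMinutes(5);
    var jitter  = TimeSpan.FromSeconds(Random.Shared.Next(0, 60));   // spread expiries over ~1 minute
    _cache.Set($"product_{id}", value, baseTtl + jitter);
}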
💻 Code Example
// Cache stampede protection using double-checked locking:
// only the first thread that misses the entry recomputes it;
// concurrent threads wait on the lock and then reuse the cached value.
private static readonly object _lock = new();
private readonly IMemoryCache _cache;   // assumed Microsoft.Extensions.Caching.Memory cache

public string GetProduct(int id)
{
    // Fast path: no locking while the entry is still cached.
    if (!_cache.TryGetValue($"product_{id}", out string result))
    {
        lock (_lock)
        {
            // Re-check inside the lock: another thread may have already
            // refreshed the entry while we were waiting.
            if (!_cache.TryGetValue($"product_{id}", out result))
            {
                result = GetFromDb(id);   // GetFromDb: the app's data-access helper (not shown)
                _cache.Set($"product_{id}", result, TimeSpan.FromMinutes(5));
            }
        }
    }
    return result;
}
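The lock-based version above funnels every product lookup through one global lock. A common refinement is per-key single-flight: concurrent callers for the same key share one computation, while lookups for different keys never block each other. A minimal sketch assuming the same `_cache` and `GetFromDb` as above (the `_inFlight` dictionary is an illustrative addition, not a standard API):

// Per-key single-flight using Lazy<T>: the first caller for a key creates the
// Lazy entry and runs its factory; concurrent callers for that key await the
// same result instead of each hitting the database.
private static readonly ConcurrentDictionary<string, Lazy<string>> _inFlight = new();

public string GetProductSingleFlight(int id)
{
    var key = $"product_{id}";
    if (_cache.TryGetValue(key, out string cached))
        return cached;

    var lazy = _inFlight.GetOrAdd(key, k => new Lazy<string>(() =>
    {
        var value = GetFromDb(id);
        _cache.Set(k, value, TimeSpan.FromMinutes(5));
        return value;
    }));

    try
    {
        return lazy.Value;                    // only one thread executes the factory
    }
    finally
    {
        _inFlight.TryRemove(key, out _);      // allow a fresh recomputation after expiry
    }
}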
❓ Interview Q&A
Q1: What is cache stampede?
A: It is when many requests simultaneously try to update the same expired cache entry.
Q2: What causes cache stampede?
A: Cache expiry causing multiple simultaneous recomputations under load.
Q3: How to prevent cache stampede?
A: Use locking, cache prewarming, or lazy loading.
Q4: What does prewarming mean?
A: Refreshing cache entries proactively before expiry.
Q5: What is single-flight in caching?
A: Allowing only one request to recompute the cache while others wait.
📝 MCQs
Q1. What is a cache stampede?
- Multiple users logging in
- Multiple requests updating expired cache simultaneously
- Cache running out of space
- Cache flushing
Q2. What happens during a stampede?
- System shutdown
- No effect
- Heavy backend load
- Slower disk access
Q3. Which pattern prevents stampede?
- Polling
- Locking or memoization
- Retry loop
- Thread pooling
Q4. What is cache prewarming?
- Purging old cache
- Compressing data
- Refreshing cache before expiry
- Encrypting cache
Q5. What does TTL stand for?
- Token Transfer Limit
- Time To Live
- Traffic Tracking Level
- Temporary Task Layer
Q6. Which backend resource overloads in stampede?
- Web server
- Memory
- Database or API
- Disk cache
Q7. Which lock helps prevent stampede?
- Queue
- Semaphore
- Mutex
- Timer
Q8. What is the cache-aside pattern?
- Cache auto-loads all data
- Application loads data then caches it
- Cache sits beside RAM
- Only works with Redis
Q9. What is lazy loading in caching?
- Eager loading
- Cache always full
- Delay loading until needed
- Disk-first access
Q10. What causes stampede after TTL?
- Log rotation
- Multiple simultaneous fetches
- Redis restart
- Garbage collection
💡 Bonus Insight
In distributed systems, use distributed locks like Redis RedLock to coordinate cache refreshes safely across servers.
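For illustration, the sketch below uses StackExchange.Redis's LockTake/LockRelease to guard the refresh with a single-node Redis lock (a simplification of the full RedLock algorithm, which acquires the lock on a majority of independent Redis nodes); the connection setup, key names, and `GetFromDb` helper are assumptions for the example.

// Simplified distributed lock with StackExchange.Redis (single node, not the
// full multi-node RedLock algorithm): only the server that acquires the lock
// refreshes the cache; everyone else reads whatever value is already cached.
private readonly IDatabase _redis = ConnectionMultiplexer.Connect("localhost").GetDatabase();

public string RefreshProduct(int id)
{
    var cacheKey = $"product_{id}";
    var lockKey  = $"lock:{cacheKey}";
    var token    = Guid.NewGuid().ToString();          // identifies this lock holder

    if (_redis.LockTake(lockKey, token, TimeSpan.FromSeconds(10)))
    {
        try
        {
            var value = GetFromDb(id);
            _redis.StringSet(cacheKey, value, TimeSpan.FromMinutes(5));
            return value;
        }
        finally
        {
            _redis.LockRelease(lockKey, token);        // released only if this token still holds it
        }
    }

    // Another server is already refreshing: serve the current cached value.
    return _redis.StringGet(cacheKey);
}

A real deployment would also cap how long callers wait and fall back gracefully (for example, to a stale value) when the lock cannot be acquired.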
📄 PDF Download
Need a handy summary for your notes? Download this topic as a PDF!