Cache Stampede

💡 Concept Name

Cache Stampede occurs when many requests simultaneously try to recompute and refresh an expired cache entry, potentially overwhelming backend resources.

🧠 Analogy

Imagine a store opening its doors with no manager present: hundreds of customers rush in at once demanding the same product, and chaos follows. A cache stampede does the same thing digitally: when a popular cache entry expires, every waiting request rushes to the backend at the same moment.

🔧 Technical Explanation

  • 🧠 Occurs when multiple threads detect an expired or missing cache entry and try to regenerate the same data simultaneously (see the naive sketch below).
  • ⚠️ Leads to heavy load on backend systems such as databases or APIs.
  • 📉 Causes severe performance degradation during periods of high concurrency.
  • 🔁 Mitigation techniques include locking, single-flight request handling, staggered expiry, and cache prewarming.
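
To see why this hurts, here is a minimal sketch of a naive read-through lookup with no protection. It assumes an IMemoryCache field named _cache and a hypothetical GetFromDb helper (the same names used in the code example further below); every concurrent miss triggers its own backend call.

// Naive read-through lookup with NO stampede protection (illustration only).
// Assumes an IMemoryCache field _cache and a GetFromDb(int) helper.
public string GetProductNaive(int id)
{
    if (!_cache.TryGetValue($"product_{id}", out string result))
    {
        // If 100 requests miss at the same instant, GetFromDb runs 100 times.
        result = GetFromDb(id);
        _cache.Set($"product_{id}", result, TimeSpan.FromMinutes(5));
    }
    return result;
}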

🛡️ Prevention Strategies

  • 🔒 Use locking or memoization so that only one thread recomputes the expired entry.
  • ⏳ Implement staggered (jittered) expiry so that hot entries do not all expire at the same moment (see the sketch after this list).
  • 🔄 Prewarm caches by refreshing data proactively before it expires.
  • 💡 Employ background refresh mechanisms that update the cache asynchronously.
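
As a rough sketch of staggered expiry, the TTL can be jittered per entry so that keys written at the same time do not all expire together. The 5-minute base TTL and 0–30 second jitter below are arbitrary illustrative values, and the helper name SetWithJitter is hypothetical (Random.Shared requires .NET 6+).

// Staggered (jittered) expiry: add a random offset to the base TTL so that
// entries written together do not all expire at the same moment.
private static void SetWithJitter(IMemoryCache cache, string key, string value)
{
    // Illustrative values: 5-minute base TTL plus 0-30 seconds of random jitter.
    var ttl = TimeSpan.FromMinutes(5) + TimeSpan.FromSeconds(Random.Shared.Next(0, 30));
    cache.Set(key, value, ttl);
}

Background refresh can be layered on top of this by scheduling a job that repopulates hot keys shortly before their TTL elapses, so normal requests rarely see a miss at all.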

💻 Code Example

// Cache stampede protection using double-checked locking: only one thread
// recomputes an expired entry; the rest wait and then reuse the cached value.
// Assumes an IMemoryCache field (Microsoft.Extensions.Caching.Memory), e.g.:
// private static readonly IMemoryCache _cache = new MemoryCache(new MemoryCacheOptions());
private static readonly object _lock = new();

public string GetProduct(int id)
{
    // Fast path: no locking needed on a cache hit.
    if (!_cache.TryGetValue($"product_{id}", out string result))
    {
        lock (_lock)
        {
            // Re-check inside the lock: another thread may have already
            // repopulated the entry while this one was waiting.
            if (!_cache.TryGetValue($"product_{id}", out result))
            {
                result = GetFromDb(id); // the expensive backend call
                _cache.Set($"product_{id}", result, TimeSpan.FromMinutes(5));
            }
        }
    }
    return result;
}
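
A finer-grained variant is single-flight per key: concurrent callers that miss on the same key share one Lazy<string>, so the backend is hit at most once per key, and different keys are not serialized behind a single global lock. This is a sketch under the same assumptions as above (an IMemoryCache field _cache and a GetFromDb helper); it also needs System.Collections.Concurrent.

// Single-flight per key: callers that miss on the same key share one Lazy<string>,
// so GetFromDb runs at most once per key while the entry is being (re)computed.
private static readonly ConcurrentDictionary<string, Lazy<string>> _inFlight = new();

public string GetProductSingleFlight(int id)
{
    string key = $"product_{id}";
    if (_cache.TryGetValue(key, out string cached))
        return cached;

    // GetOrAdd hands the same Lazy to every concurrent caller for this key;
    // the factory inside the Lazy executes only once, on first access of .Value.
    var lazy = _inFlight.GetOrAdd(key, _ => new Lazy<string>(() =>
    {
        string value = GetFromDb(id);
        _cache.Set(key, value, TimeSpan.FromMinutes(5));
        return value;
    }));

    try
    {
        return lazy.Value;
    }
    finally
    {
        // Drop the in-flight entry so a future expiry recomputes fresh data.
        _inFlight.TryRemove(key, out _);
    }
}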

โ“ Interview Q&A

Q1: What is cache stampede?
A: It is when many requests simultaneously try to update the same expired cache entry.

Q2: What causes cache stampede?
A: Cache expiry causing multiple simultaneous recomputations under load.

Q3: How to prevent cache stampede?
A: Use locking, cache prewarming, or lazy loading.

Q4: What does prewarming mean?
A: Refreshing cache entries proactively before expiry.

Q5: What is single-flight in caching?
A: Allowing only one request to recompute the cache while others wait.

📝 MCQs

Q1. What is a cache stampede?

  • Multiple users logging in
  • Multiple requests updating expired cache simultaneously
  • Cache running out of space
  • Cache flushing

Q2. What happens during a stampede?

  • System shutdown
  • No effect
  • Heavy backend load
  • Slower disk access

Q3. Which pattern prevents stampede?

  • Polling
  • Locking or memoization
  • Retry loop
  • Thread pooling

Q4. What is cache prewarming?

  • Purging old cache
  • Compressing data
  • Refreshing cache before expiry
  • Encrypting cache

Q5. What does TTL stand for?

  • Token Transfer Limit
  • Time To Live
  • Traffic Tracking Level
  • Temporary Task Layer

Q6. Which backend resource overloads in stampede?

  • Web server
  • Memory
  • Database or API
  • Disk cache

Q7. Which lock helps prevent stampede?

  • Queue
  • Semaphore
  • Mutex
  • Timer

Q8. What is the cache-aside pattern?

  • Cache auto-loads all data
  • Application loads data then caches it
  • Cache sits beside RAM
  • Only works with Redis

Q9. What is lazy loading in caching?

  • Eager loading
  • Cache always full
  • Delay loading until needed
  • Disk-first access

Q10. What causes stampede after TTL?

  • Log rotation
  • Multiple simultaneous fetches
  • Redis restart
  • Garbage collection

💡 Bonus Insight

In distributed systems, a single in-process lock is not enough, because each server would still refresh the entry independently. Use a distributed lock, such as the Redis-based Redlock algorithm, to coordinate cache refreshes safely across servers; a simplified single-node sketch follows.
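
This is a minimal sketch using one Redis node via StackExchange.Redis (SET with NX plus an expiry), not a full Redlock implementation, which would acquire the lock on a majority of independent Redis nodes. The key names, timeouts, and the assumption of an IDatabase instance passed in as redis are all illustrative.

// Simplified distributed lock via Redis SET NX + expiry (StackExchange.Redis).
// NOT full Redlock: this only illustrates coordinating a refresh across app servers.
public string GetProductDistributed(IDatabase redis, int id)
{
    string key = $"product_{id}";
    string lockKey = $"lock:{key}";
    string lockToken = Guid.NewGuid().ToString();   // identifies this lock owner

    string cached = redis.StringGet(key);
    if (cached != null)
        return cached;

    // Only one server wins the lock; the 30-second expiry keeps a crashed
    // owner from blocking refreshes forever.
    if (redis.StringSet(lockKey, lockToken, TimeSpan.FromSeconds(30), When.NotExists))
    {
        try
        {
            string value = GetFromDb(id);
            redis.StringSet(key, value, TimeSpan.FromMinutes(5));
            return value;
        }
        finally
        {
            // Release only if we still own the lock (production code would use a
            // Lua script to make this check-and-delete atomic).
            if (redis.StringGet(lockKey) == lockToken)
                redis.KeyDelete(lockKey);
        }
    }

    // Another server is refreshing; wait briefly and re-read. A real implementation
    // would retry or serve a stale value instead of possibly returning null.
    Thread.Sleep(100);
    return redis.StringGet(key);
}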
