Caching Strategies (APCu, Redis, Memcached)
Also Known As
cache
in-memory cache
application cache
TL;DR
Storing computed results or fetched data so future requests can be served without repeating expensive operations.
Explanation
Caching strategies include: in-process caching (APCu, OPcache for bytecode), distributed caching (Redis, Memcached for shared state across servers), HTTP caching (ETags, Cache-Control headers for browser/CDN caching), and database query result caching. Key cache design decisions are TTL (time-to-live), invalidation strategy (TTL expiry vs. explicit eviction), cache stampede prevention (probabilistic early expiry, locking), and what to cache (expensive queries, rendered fragments, API responses). Cache invalidation is famously one of the two hard problems in computer science.
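One of the stampede-prevention techniques mentioned above, probabilistic early expiry, can be sketched in plain PHP. This is a minimal, in-memory version of the "XFetch" idea: each reader may recompute slightly before the TTL ends, with probability rising as expiry approaches, so readers don't all stampede at the exact expiry moment. The `xfetch()` function and the array-backed store are illustrative, not a real library API:

```php
<?php
// Probabilistic early expiry (XFetch sketch): recompute when the entry is
// missing, or probabilistically just before its true expiry. $store stands
// in for a real cache backend such as APCu or Redis.
function xfetch(string $key, int $ttl, callable $recompute, array &$store): mixed
{
    $now   = microtime(true);
    $entry = $store[$key] ?? null;
    $beta  = 1.0; // tuning knob: >1 favours earlier recomputation

    // rand in (0, 1], so log(rand) <= 0 and the subtracted term is >= 0;
    // the longer the recompute took (delta), the earlier we tend to refresh.
    $rand = mt_rand(1, mt_getrandmax()) / mt_getrandmax();

    if ($entry === null
        || $now - $entry['delta'] * $beta * log($rand) >= $entry['expiresAt']) {
        $start = microtime(true);
        $value = $recompute();
        $entry = [
            'value'     => $value,
            'delta'     => microtime(true) - $start, // recompute duration
            'expiresAt' => microtime(true) + $ttl,
        ];
        $store[$key] = $entry;
    }
    return $entry['value'];
}
```

With a fresh entry and a 60-second TTL, repeated calls return the cached value; as the deadline nears, individual callers start refreshing early instead of all missing at once.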
Diagram
flowchart LR
REQ[Request] --> CHECK{Cache hit?}
CHECK -->|HIT| RETURN["Return cached<br/>response O(1)"]
CHECK -->|MISS| DB[(Database<br/>or API)]
DB --> STORE[Store in cache<br/>with TTL]
STORE --> RETURN2[Return response]
subgraph Cache Layers
L1[Browser cache]
L2[CDN edge cache]
L3[App cache Redis]
L4[DB query cache]
end
style RETURN fill:#238636,color:#fff
style L1 fill:#238636,color:#fff
style L4 fill:#1f6feb,color:#fff
Common Misconception
✗ Caching always makes things faster. Stale cache returning wrong data, cache stampedes on expiry, or over-caching data that changes frequently can make things slower or silently incorrect.
Why It Matters
Caching is the single most effective performance optimisation for most web applications — serving a precomputed result is orders of magnitude faster than recomputing it from scratch on every request. Without caching, every user pays the full cost of every database query.
Common Mistakes
- Caching everything without considering cache invalidation — stale data is often worse than no cache.
- Using the same TTL for all data regardless of how often it changes — user sessions and product catalogues have very different staleness tolerances.
- Not caching at the right layer — caching a slow SQL result but not the rendered HTML misses bigger gains.
- Forgetting to handle cache misses gracefully — if the cache server is down, the app should still work.
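The last point, degrading gracefully when the cache backend is down, can be sketched like this. The `$cache` object is a stand-in for something like a phpredis client (duck-typed here so it can be swapped for a stub), and the `$loadFromDb` callable is a hypothetical loader:

```php
<?php
// Treat the cache as optional: if it throws (e.g. Redis unreachable),
// log and fall back to the source of truth instead of failing the request.
function getUser(object $cache, int $id, callable $loadFromDb): array
{
    $key = "user.$id";
    try {
        $cached = $cache->get($key);
        if ($cached !== false && $cached !== null) {
            return json_decode($cached, true);
        }
    } catch (Throwable $e) {
        error_log('cache read failed: ' . $e->getMessage());
    }

    $user = $loadFromDb($id); // expensive, but always available

    try {
        $cache->setex($key, 300, json_encode($user)); // 5 min TTL
    } catch (Throwable $e) {
        error_log('cache write failed: ' . $e->getMessage());
    }
    return $user;
}
```

The request still succeeds when every cache call throws; the app just runs slower until the cache recovers.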
Code Examples
✗ Vulnerable
// No caching — expensive DB query on every request:
function getDashboardStats(PDO $db): array {
    // Called 1000 times/min — same result every minute
    return $db->query('SELECT COUNT(*), SUM(total) FROM orders WHERE ...')->fetch();
}
// Broken caching attempt: the ?: fallback re-runs the query whenever the
// cached value is falsy (0, [], null), and tap() is a Laravel helper,
// not a PHP built-in:
function getDashboardStatsBroken(): array {
    return apcu_fetch('dashboard_stats', $hit)
        ?: tap(expensiveQuery(), fn ($r) => apcu_store('dashboard_stats', $r, 60));
}
✓ Fixed
// Cache-aside pattern with PSR-6. Note: ':' is a reserved character in
// PSR-6 cache keys, so use '.' or '_' as a separator.
$item = $cache->getItem('user.' . $id);
if (!$item->isHit()) {
    $user = $this->db->fetchUser($id); // expensive — only runs on a miss
    $item->set($user);
    $item->expiresAfter(300); // 5 min TTL
    $cache->save($item);
}
return $item->get();
// Cache invalidation on write — evict immediately rather than waiting for TTL
public function updateUser(User $user): void {
    $this->db->save($user);
    $this->cache->deleteItem('user.' . $user->id); // invalidate
}
Tags
Added
15 Mar 2026
Edited
22 Mar 2026
Related categories
⚡
DEV INTEL
Tools & Severity
🟡 Medium
⚙ Fix effort: Medium
⚡ Quick Fix
Cache the result of any operation taking >50ms with a TTL matching data freshness requirements; use cache-aside pattern with Redis or APCu
📦 Applies To
PHP 5.0+
web
cli
queue-worker
laravel
symfony
🔗 Prerequisites
🔍 Detection Hints
Database query or external API call inside a loop or called >10x per request without caching
Auto-detectable:
✗ No
laravel-debugbar
clockwork
blackfire
⚠ Related Problems
🤖 AI Agent
Confidence: Low
False Positives: High
✗ Manual fix
Fix: Medium
Context: File
Tests: Update