Space Complexity
debt(d7/e5/b7/t7)
Closest to 'only careful code review or runtime testing' (d7). The detection_hints list Blackfire and php-meminfo as tools, but automated detection is explicitly marked 'no' — there is no linter rule that flags O(n) memory patterns. A developer must profile with memory_get_peak_usage() or a specialist profiler under realistic data volumes, which surfaces the problem only at runtime with large datasets.
Closest to 'touches multiple files / significant refactor in one component' (e5). The quick_fix mentions switching to generators, but this is not a one-line swap: converting array-based result-set loading to cursor/generator iteration typically requires changing query logic, the consuming loop, any intermediate transformation steps, and possibly caching strategy — spanning multiple layers within a component. It's not a single-call replacement.
Closest to 'strong gravitational pull' (b7). Space complexity decisions apply across all contexts (web, cli, queue-worker) and the misconception/common_mistakes show this is a pervasive structural choice: once large result-sets or recursive algorithms are baked in, every data-intensive feature is shaped by the same pattern. Memory limits become a ceiling that forces re-examination of multiple workflows simultaneously.
Closest to 'serious trap — contradicts how a similar concept works elsewhere' (t7). The misconception field explicitly states that developers believe space complexity only matters for algorithms, not web code — so they load entire DB result sets into arrays without concern. This is a direct contradiction of the concept's real scope, and the PHP-specific 8-10× array memory overhead amplifies the surprise. Competent developers who think in algorithmic terms will still misjudge the practical web-app impact.
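The runtime-only detection path described under (d7) can be sketched with a minimal profiling harness. This is a sketch, not a real detector: loadAllRows() is a hypothetical function standing in for the code under test, and the row count and payload size are illustrative.

```php
<?php
// Minimal memory-profiling harness (sketch). loadAllRows() is a
// hypothetical function under test; in practice you would point this
// at real code and run it against production-sized data.
function loadAllRows(int $n): array {
    $rows = [];
    for ($i = 0; $i < $n; $i++) {
        $rows[] = ['id' => $i, 'payload' => str_repeat('x', 100)];
    }
    return $rows; // O(n) space: every row stays resident
}

$before = memory_get_usage();
$rows   = loadAllRows(100_000);
$peak   = memory_get_peak_usage();

// Peak usage grows roughly linearly with row count — the signal a
// profiler like Blackfire or php-meminfo surfaces, and that no
// static linter rule will flag.
printf("peak: %.1f MB for %d rows\n", ($peak - $before) / 1048576, count($rows));
```

Run with a small $n and the numbers look harmless; only a production-sized input reveals the linear growth, which is exactly why this debt evades review.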
Also Known As
TL;DR
Explanation
Space complexity is the memory equivalent of time complexity. An algorithm that stores all input in memory is O(n) space; one that uses a fixed amount regardless of input is O(1). In PHP, loading an entire CSV into an array before processing is O(n) space; processing it line-by-line with a generator is O(1). Memory limits (memory_limit in php.ini) make space complexity a practical concern: processing large files or result sets without streaming can cause fatal out-of-memory errors.
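The CSV contrast in the paragraph above can be sketched as follows. A minimal example: the column layout and function names are illustrative, and error handling is omitted for brevity.

```php
<?php
// O(n) space: file() slurps every line into an array before the loop runs.
function sumColumnEager(string $path): float {
    $total = 0.0;
    foreach (file($path) as $line) {           // whole file in memory
        $total += (float) str_getcsv($line)[0];
    }
    return $total;
}

// O(1) space: a generator yields one parsed row at a time.
function readCsv(string $path): Generator {
    $fh = fopen($path, 'rb');
    try {
        while (($row = fgetcsv($fh)) !== false) {
            yield $row;                        // only one row resident
        }
    } finally {
        fclose($fh);                           // runs even if the consumer stops early
    }
}

function sumColumnStreaming(string $path): float {
    $total = 0.0;
    foreach (readCsv($path) as $row) {
        $total += (float) $row[0];
    }
    return $total;
}
```

Both functions return the same result; only their peak memory differs, and the gap grows linearly with file size.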
Common Misconception
Why It Matters
Common Mistakes
- Loading entire result sets into PHP arrays when a generator or cursor would process one row at a time.
- Recursive algorithms with O(n) stack depth — deep recursion on large inputs causes stack overflows.
- Not considering that PHP arrays carry substantial memory overhead vs the raw data size — historically ~8-10× in PHP 5, still several× in PHP 7+ for typical associative data.
- Caching large datasets in APCu or session without measuring memory impact.
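The array overhead mentioned above is easy to measure directly. A sketch: the exact factor depends on PHP version and array shape (packed integer arrays are far cheaper than string-keyed hashtables), so the printed multiplier will vary.

```php
<?php
// Compare raw payload bytes against what a string-keyed PHP array
// actually consumes: each element carries a hashtable bucket, a zval,
// and zend_string headers on top of the data itself.
$n        = 100_000;
$rawBytes = 0;
$before   = memory_get_usage();

$data = [];
for ($i = 0; $i < $n; $i++) {
    $key        = "key_$i";
    $data[$key] = str_pad((string) $i, 10, '0', STR_PAD_LEFT);
    $rawBytes  += strlen($key) + 10;           // bytes of actual payload
}

$used = memory_get_usage() - $before;
printf("raw: %.1f MB, actual: %.1f MB, factor: %.1fx\n",
    $rawBytes / 1048576, $used / 1048576, $used / $rawBytes);
```

The same measurement applied before caching a dataset in APCu or the session is the cheapest way to avoid the last mistake in the list above.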
Code Examples
// Loads entire 500k-row table into memory
function exportAll(PDO $pdo): void {
    $rows = $pdo->query('SELECT * FROM events')->fetchAll(); // could be GBs
    foreach ($rows as $row) {
        writeCsv($row);
    }
}
// O(1) memory in userland — fetch and discard one row at a time.
// Note: the MySQL driver buffers the full result set client-side by
// default; set PDO::MYSQL_ATTR_USE_BUFFERED_QUERY to false on the
// connection for true streaming.
function exportAllStreaming(PDO $pdo): void {
    $stmt = $pdo->query('SELECT * FROM events');
    $stmt->setFetchMode(PDO::FETCH_ASSOC);
    while ($row = $stmt->fetch()) {
        writeCsv($row);
    }
}
// With generators for composable pipelines
function streamEvents(PDO $pdo): Generator {
    $stmt = $pdo->query('SELECT * FROM events');
    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        yield $row;
    }
}