
Space Complexity

performance · Intermediate
DEBT score: d7 / e5 / b7 / t7
d7 Detectability Operational debt — how invisible misuse is to your safety net

Closest to 'only careful code review or runtime testing' (d7). The detection_hints field lists blackfire and php-meminfo as tools, but automated detection is explicitly marked 'no': there is no linter rule that flags O(n) memory patterns. A developer must profile with memory_get_peak_usage() or a specialist profiler under realistic data volumes, which only surfaces the problem at runtime with large datasets.

e5 Effort Remediation debt — work required to fix once spotted

Closest to 'touches multiple files / significant refactor in one component' (e5). The quick_fix mentions switching to generators, but this is not a one-line swap: converting array-based result-set loading to cursor/generator iteration typically requires changing query logic, the consuming loop, any intermediate transformation steps, and possibly caching strategy — spanning multiple layers within a component. It's not a single-call replacement.

b7 Burden Structural debt — long-term weight of choosing wrong

Closest to 'strong gravitational pull' (b7). Space complexity decisions apply across all contexts (web, cli, queue-worker) and the misconception/common_mistakes show this is a pervasive structural choice: once large result-sets or recursive algorithms are baked in, every data-intensive feature is shaped by the same pattern. Memory limits become a ceiling that forces re-examination of multiple workflows simultaneously.

t7 Trap Cognitive debt — how counter-intuitive correct behaviour is

Closest to 'serious trap — contradicts how a similar concept works elsewhere' (t7). The misconception field explicitly states that developers believe space complexity only matters for algorithms, not web code — so they load entire DB result sets into arrays without concern. This is a direct contradiction of the concept's real scope, and the PHP-specific 8-10× array memory overhead amplifies the surprise. Competent developers who think in algorithmic terms will still misjudge the practical web-app impact.

About DEBT scoring →

Also Known As

memory complexity · space efficiency · auxiliary space

TL;DR

A measure of how much memory an algorithm uses relative to its input size, expressed in Big O notation.

Explanation

Space complexity is the memory equivalent of time complexity. An algorithm that stores all input in memory is O(n) space; one that uses a fixed amount regardless of input is O(1). In PHP, loading an entire CSV into an array before processing is O(n) space; processing it line-by-line with a generator is O(1). Memory limits (memory_limit in php.ini) make space complexity a practical concern: processing large files or result sets without streaming can cause fatal out-of-memory errors.
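The line-by-line approach described above can be sketched with a generator. The helper below is a hypothetical example, not part of any library; it assumes a plain CSV file readable with fgetcsv().

```php
<?php
// Hypothetical helper: stream a CSV file row-by-row in O(1) space.
// Only one row is held in memory at any time, regardless of file size.
function readCsvRows(string $path): Generator
{
    $fh = fopen($path, 'r');
    if ($fh === false) {
        throw new RuntimeException("Cannot open $path");
    }
    try {
        while (($row = fgetcsv($fh)) !== false) {
            yield $row;
        }
    } finally {
        fclose($fh); // release the handle even if the consumer aborts
    }
}
```

Consuming code iterates with foreach exactly as it would over an array, so switching from an array-loading approach to streaming need not change the call site.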

Common Misconception

The myth: space complexity only matters for textbook algorithms, not web application code. In reality, loading an entire database result set into a PHP array is O(n) space, while iterating row-by-row with a generator is O(1), and that choice has a direct impact on PHP memory limits and server costs.

Why It Matters

Space complexity determines whether an algorithm is practical at scale — an O(n) space algorithm that loads all records into memory to sort them will exhaust RAM on large datasets regardless of how fast the computation is. Understanding space complexity helps choose between approaches: a streaming algorithm that processes one record at a time uses O(1) space but may be slower; an in-memory approach uses O(n) space but enables random access. In PHP, uncontrolled memory growth is the most common cause of memory_limit fatal errors on large imports and report generation.
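The streaming-vs-in-memory trade-off above can be made concrete. Both functions below are illustrative sketches, not library APIs: a streaming maximum needs constant extra space, while a sort must materialise everything but then supports random access.

```php
<?php
// O(1) extra space: a streaming pass keeps only one value around,
// so it works on a generator that never holds the full dataset.
function maxOfStream(iterable $values): int
{
    $max = PHP_INT_MIN;
    foreach ($values as $v) {
        $max = max($max, $v);
    }
    return $max;
}

// O(n) extra space: sorting forces the whole dataset into memory,
// but the resulting array then supports random access by index.
function sortedCopy(iterable $values): array
{
    $all = [];
    foreach ($values as $v) {
        $all[] = $v; // materialise every element
    }
    sort($all);
    return $all;
}
```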

Common Mistakes

  • Loading entire result sets into PHP arrays when a generator or cursor would process one row at a time.
  • Writing recursive algorithms with O(n) stack depth — deep recursion on large inputs causes stack overflows.
  • Not considering that PHP arrays have ~8-10× memory overhead vs the raw data size.
  • Caching large datasets in APCu or session without measuring memory impact.
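The recursion mistake above is often fixable without changing the result. The sketch below (both functions are hypothetical examples) rewrites an O(n)-stack recursive sum as an O(1)-stack loop.

```php
<?php
// O(n) stack space: one call frame per element, so a large enough
// input can exhaust the call stack.
function sumRecursive(array $xs, int $i = 0): int
{
    if ($i >= count($xs)) {
        return 0;
    }
    return $xs[$i] + sumRecursive($xs, $i + 1);
}

// O(1) stack space: same result, a single frame and one accumulator.
function sumIterative(array $xs): int
{
    $total = 0;
    foreach ($xs as $x) {
        $total += $x;
    }
    return $total;
}
```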

Code Examples

✗ Vulnerable
// Loads entire 500k-row table into memory
function exportAll(PDO $pdo): void {
    $rows = $pdo->query('SELECT * FROM events')->fetchAll(); // could be GBs
    foreach ($rows as $row) { writeCsv($row); }
}
✓ Fixed
// Stream rows one at a time instead of fetchAll().
// Caveat: pdo_mysql buffers result sets client-side by default, so for a
// true O(1) footprint set PDO::MYSQL_ATTR_USE_BUFFERED_QUERY to false.
function exportAll(PDO $pdo): void {
    $stmt = $pdo->query('SELECT * FROM events');
    $stmt->setFetchMode(PDO::FETCH_ASSOC);
    while ($row = $stmt->fetch()) {
        writeCsv($row);
    }
}

// With generators for composable pipelines
function streamEvents(PDO $pdo): Generator {
    $stmt = $pdo->query('SELECT * FROM events');
    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) { yield $row; }
}
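A generator like streamEvents() composes with further lazy stages. The filter below is a hypothetical example of such a stage; the whole pipeline still holds only one row at a time, because each stage pulls rows through on demand.

```php
<?php
// Hypothetical pipeline stage: lazily keep only matching rows.
// Memory stays O(1): no intermediate array is ever built.
function filterRows(iterable $rows, callable $keep): Generator
{
    foreach ($rows as $row) {
        if ($keep($row)) {
            yield $row;
        }
    }
}

// Usage sketch (assuming streamEvents() and writeCsv() from above):
// foreach (filterRows(streamEvents($pdo), fn($r) => $r['level'] === 'error') as $row) {
//     writeCsv($row);
// }
```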

Added 15 Mar 2026
Edited 23 Mar 2026
DEV INTEL Tools & Severity
🟡 Medium ⚙ Fix effort: Medium
⚡ Quick Fix
Analyse space alongside time complexity: a generator that processes one CSV row at a time is O(1) space vs O(n) for loading the entire file. In PHP, memory_get_peak_usage() measures the actual peak memory used.
📦 Applies To
any web cli queue-worker
🔍 Detection Hints
Loading entire large dataset into memory array; recursive algorithm with O(n) call stack depth; building intermediate arrays that could be streamed
Auto-detectable: ✗ No · Tools: blackfire, php-meminfo
🤖 AI Agent
Confidence: Low · False positives: High · ✗ Manual fix · Fix effort: High · Context: Function · Tests: Update
