Out of Memory Errors (memory_limit)
debt(d7/e5/b5/t7)
Closest to 'only careful code review or runtime testing' (d7): profilers like Blackfire/Xdebug can reveal memory issues, but no default linter flags file_get_contents on large files; the problem is often discovered only under load.
Closest to 'touches multiple files / significant refactor in one component' (e5): per the quick_fix, replacing file_get_contents with streaming/generators usually means refactoring data-processing logic across the import/export component, not a one-line swap.
Closest to 'persistent productivity tax' (b5): applies across web/CLI/queue contexts and shapes how data-handling code must be written everywhere large datasets appear.
Closest to 'serious trap' (t7): the misconception explicitly states devs believe raising memory_limit fixes OOM when it only defers it — contradicting the intuitive 'give it more memory' fix.
TL;DR
Exceeding memory_limit kills the script with a fatal error that try/catch cannot intercept. Bound memory use with streaming, generators, and pagination; raise the limit only temporarily for known-heavy operations.
Explanation
memory_limit defaults to 128M (256M in some recent distros). When exceeded, PHP emits a fatal error and terminates. It cannot be caught with try/catch, but register_shutdown_function() + error_get_last() can detect it after the fact. Common causes: loading large files into strings, unbounded array growth, processing huge CSV/Excel imports, missing pagination on DB queries. Increase with ini_set('memory_limit','512M') only for the specific operation, then restore the previous value. Use generators and streaming to avoid loading everything into memory. memory_get_peak_usage(true) reports peak consumption.
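The shutdown-hook detection and the temporary limit raise described above can be sketched as follows (a minimal sketch; the range() call stands in for a real memory-heavy import):

```php
<?php

// OOM fatals cannot be caught with try/catch, but a shutdown hook can
// detect them after the fact via error_get_last().
register_shutdown_function(function (): void {
    $err = error_get_last();
    if ($err !== null && $err['type'] === E_ERROR
            && str_contains($err['message'], 'Allowed memory size')) {
        // The request has already died; all we can do is record it.
        error_log('OOM hit, peak: ' . memory_get_peak_usage(true) . ' bytes');
    }
});

// Raise the limit only around a known-heavy operation, then restore it.
$previous = ini_get('memory_limit');
ini_set('memory_limit', '512M');

$rows = range(1, 100_000); // stand-in for the memory-heavy import
// ... process $rows ...

ini_set('memory_limit', $previous);
echo 'peak: ', memory_get_peak_usage(true), " bytes\n";
```

The restore step matters in long-running workers: without it, one heavy job permanently weakens the safety net for everything that follows.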
Common Misconception
"Raising memory_limit fixes OOM errors." It only defers them: the next larger file or dataset hits the new ceiling. The real fix is bounding memory use (streaming, generators, pagination), not raising the cap.
Why It Matters
OOM fatals bypass try/catch, often surface only under production load, and a single unbounded operation can kill a request or queue worker mid-task.
Common Mistakes
- Setting memory_limit = -1 in production — removes the safety net.
- Not using generators for large dataset iteration.
- Loading entire files with file_get_contents() when streaming suffices.
- Not monitoring peak memory usage.
Code Examples
// Bad: loads the entire 500MB file into a string — with a 128M limit
// this dies before explode() even runs
$data = file_get_contents('/exports/huge.csv'); // Fatal: Allowed memory size exhausted
$rows = explode("\n", $data); // would double the footprint anyway
// Stream line by line — constant memory usage
$handle = fopen('/exports/huge.csv', 'r');
if ($handle === false) {
    throw new RuntimeException('Cannot open /exports/huge.csv');
}
while (($line = fgets($handle)) !== false) {
    processLine($line);
}
fclose($handle);
// Or with generators
function readCsv(string $file): Generator {
    $h = fopen($file, 'r');
    while (($row = fgetcsv($h)) !== false) {
        yield $row;
    }
    fclose($h);
}
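Iterating the generator keeps memory flat because only one row exists at a time; a usage sketch (the path is illustrative, and the counter stands in for real per-row processing):

```php
<?php

// Usage of readCsv() from above: one row in memory at a time,
// even for multi-gigabyte files.
$count = 0;
foreach (readCsv('/exports/huge.csv') as $row) {
    $count++; // replace with real per-row processing
}
echo "$count rows, peak: ", memory_get_peak_usage(true), " bytes\n";
```

Compare memory_get_peak_usage(true) before and after swapping in the generator: with file_get_contents it tracks the file size; with the generator it stays near the baseline.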