Database Query Result Streaming
Explanation
Query result streaming fetches rows incrementally from the database rather than buffering the entire result set in PHP memory. In mysqli this means running an unbuffered query by passing MYSQLI_USE_RESULT as the result mode. The default, MYSQLI_STORE_RESULT, copies all rows into PHP memory before query() returns - fine for 100 rows, fatal for 10 million. With streaming, you fetch one row at a time from the MySQL connection, process it, then discard it before fetching the next, so memory usage stays constant regardless of result size. The trade-off: the connection remains occupied until you finish reading all rows or call mysqli_free_result() - you cannot run another query on the same connection mid-stream. In PDO, the equivalent is setting PDO::MYSQL_ATTR_USE_BUFFERED_QUERY to false on the connection.

This pattern is essential for data exports, ETL pipelines, and batch processing in queue workers. Generators (available since PHP 5.5) combine elegantly with streaming: yield each row from a generator function for memory-efficient iteration with foreach.
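A minimal sketch of the row-at-a-time pattern on the PDO side. SQLite in memory is used here so the example is self-contained and runnable; the table name and row count are made up for illustration. With pdo_mysql you would instead disable buffering on the connection, as shown in the comment.

```php
<?php
declare(strict_types=1);

// With pdo_mysql, unbuffered mode is enabled per connection:
//   $pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);
// SQLite streams by default, which makes it a convenient stand-in here.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE huge_table (id INTEGER PRIMARY KEY, payload TEXT)');
$insert = $pdo->prepare('INSERT INTO huge_table (payload) VALUES (?)');
for ($i = 0; $i < 1000; $i++) {
    $insert->execute(["row-$i"]);
}

// Fetch one row per iteration instead of fetchAll(), which would
// buffer the whole result set in a PHP array.
$stmt = $pdo->query('SELECT * FROM huge_table');
$count = 0;
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    $count++; // process $row here, then let it go out of scope
}
$stmt->closeCursor(); // release the statement before reusing the connection
echo $count, "\n"; // 1000
```

closeCursor() is PDO's counterpart to mysqli_free_result(): it lets the connection run the next query before every row has been fetched.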
Common Mistakes
- Running another query on the same connection before consuming all streamed rows - causes 'Commands out of sync' error.
- Using MYSQLI_USE_RESULT with mysqli_num_rows() - row count is unavailable until all rows are fetched.
- Forgetting to call mysqli_free_result() when stopping iteration early - connection remains locked.
- Streaming in web requests where connection time is limited - streaming is for long-running CLI processes.
- Combining unbuffered queries with transactions that timeout - the transaction may rollback mid-stream.
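Several of these mistakes come down to releasing the result set before touching the connection again. A runnable sketch of the fix, wrapping the cleanup in finally so it fires even when the caller stops early (SQLite in memory via PDO here, since the pattern is driver-agnostic; with mysqli the cleanup call is $result->free()):

```php
<?php
declare(strict_types=1);

// Generator that guarantees the cursor is released even when the
// consumer breaks out of the loop: destroying a suspended generator
// runs its pending finally block.
function streamRows(PDO $pdo, string $sql): Generator
{
    $stmt = $pdo->query($sql);
    try {
        while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
            yield $row;
        }
    } finally {
        $stmt->closeCursor(); // runs on break, return, or exception too
    }
}

$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE t (id INTEGER PRIMARY KEY)');
$pdo->exec('INSERT INTO t (id) VALUES (1), (2), (3), (4), (5)');

$seen = 0;
foreach (streamRows($pdo, 'SELECT id FROM t ORDER BY id') as $row) {
    $seen++;
    if ((int)$row['id'] === 3) {
        break; // early exit - finally still frees the cursor
    }
}
echo $seen, "\n"; // 3
```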
Avoid When
- Web requests with short timeouts - streaming ties up the connection for the entire duration.
- Small result sets under 10,000 rows - buffered queries are simpler and faster.
- When you need mysqli_num_rows() before iteration - row count unavailable in streaming mode.
- Multiple concurrent queries needed on the same connection - streaming locks the connection.
When To Use
- CLI scripts processing millions of rows where memory is constrained.
- Data export jobs that write rows directly to file/CSV without storing in memory.
- ETL pipelines transforming large datasets row by row.
- Queue workers processing batch jobs with large query results.
Code Examples
<?php
// FATAL: Loads 10 million rows into PHP memory before loop starts
$mysqli = new mysqli('localhost', 'user', 'pass', 'db');
$result = $mysqli->query('SELECT * FROM huge_table'); // Default: MYSQLI_STORE_RESULT
// All 10M rows already in memory - too late
while ($row = $result->fetch_assoc()) {
processRow($row);
}
// Memory exhausted before loop even begins:
// Fatal error: Allowed memory size of 134217728 bytes exhausted
<?php
declare(strict_types=1);
$mysqli = mysqli_init();
// Options must be set before connecting, or they are silently ignored
$mysqli->options(MYSQLI_OPT_INT_AND_FLOAT_NATIVE, true);
$mysqli->real_connect('localhost', 'user', 'pass', 'db');
// MYSQLI_USE_RESULT = streaming mode - rows fetched on demand
$result = $mysqli->query(
'SELECT * FROM huge_table',
MYSQLI_USE_RESULT
);
if ($result === false) {
throw new RuntimeException($mysqli->error);
}
// Memory stays constant - one row at a time
while ($row = $result->fetch_assoc()) {
processRow($row);
// Row is discarded after processing, memory freed
}
// CRITICAL: Free result before any other query on this connection
$result->free();
// Generator pattern (PHP 5.5+) for cleaner iteration
function streamQuery(mysqli $db, string $sql): Generator
{
$result = $db->query($sql, MYSQLI_USE_RESULT);
if ($result === false) {
throw new RuntimeException($db->error);
}
try {
while ($row = $result->fetch_assoc()) {
yield $row;
}
} finally {
$result->free();
}
}
// Usage - memory-efficient foreach
foreach (streamQuery($mysqli, 'SELECT * FROM huge_table') as $row) {
processRow($row);
}
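A streaming CSV export, one of the use cases above: each row goes straight from the cursor to the output stream, so memory use is independent of result size. SQLite in memory and php://temp stand in for a real MySQL connection and export file so the sketch is self-contained; with mysqli you would pass MYSQLI_USE_RESULT as in the examples above.

```php
<?php
declare(strict_types=1);

// Illustrative table and data - stand-ins for a real export source
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE export_me (id INTEGER PRIMARY KEY, name TEXT)');
$pdo->exec("INSERT INTO export_me (name) VALUES ('a'), ('b'), ('c')");

$out = fopen('php://temp', 'r+'); // a real export would fopen() a file
fputcsv($out, ['id', 'name'], ',', '"', ''); // header row

$stmt = $pdo->query('SELECT id, name FROM export_me ORDER BY id');
$written = 0;
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    fputcsv($out, $row, ',', '"', ''); // write and discard - nothing accumulates
    $written++;
}
$stmt->closeCursor();

rewind($out);
echo $written, "\n"; // 3
```

The explicit escape argument ('') to fputcsv() avoids PHP's legacy backslash escaping and works on PHP 7.4+.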