
Database Query Result Streaming

database PHP 8.0+ Intermediate

Also Known As

unbuffered query · streaming result set · row-by-row fetch · MYSQLI_USE_RESULT

TL;DR

Processing large result sets row-by-row without loading the entire dataset into memory - essential for PHP CLI scripts handling millions of rows.

Explanation

Query result streaming fetches rows incrementally from the database rather than buffering the entire result set in PHP memory. In mysqli this means using MYSQLI_USE_RESULT mode, i.e. an unbuffered query. The default, MYSQLI_STORE_RESULT, buffers all rows in PHP memory before returning - fine for 100 rows, fatal for 10 million. In PDO, the equivalent is setting PDO::MYSQL_ATTR_USE_BUFFERED_QUERY to false on the mysql driver.

With streaming, you fetch one row at a time from the MySQL connection, process it, then discard it before fetching the next, so memory usage stays roughly constant regardless of result size. The trade-off: the connection remains occupied until you finish reading all rows or call mysqli_free_result() - you cannot run another query on the same connection mid-stream. No special MySQL server configuration is required; unbuffered mode is a client-side choice.

This pattern is essential for data exports, ETL pipelines, and batch processing in queue workers. Generators (available since PHP 5.5) combine elegantly with streaming - yield each row from a generator function and you get memory-efficient iteration with foreach.
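A minimal sketch of the PDO side of this, assuming the pdo_mysql driver; the credentials, database, and table name are placeholders, and processRow() is a hypothetical handler as in the examples below:

```php
<?php
declare(strict_types=1);

// Unbuffered mode in PDO: disable buffered queries on the mysql
// driver (the attribute is mysql-specific).
$pdo = new PDO('mysql:host=localhost;dbname=db', 'user', 'pass', [
    PDO::ATTR_ERRMODE                  => PDO::ERRMODE_EXCEPTION,
    PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => false, // stream rows on demand
]);

$stmt = $pdo->query('SELECT * FROM huge_table');

// Rows arrive one at a time; memory stays flat.
foreach ($stmt as $row) {
    processRow($row);
}

// Release the pending rows before running another query
// on this connection.
$stmt->closeCursor();
```

closeCursor() plays the same role as mysqli_free_result(): it drains the unread rows so the connection is usable again.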

Common Misconception

Unbuffered queries are always faster than buffered ones. They are not: streaming is a memory optimisation, not a speed optimisation. Rows are pulled from the connection one at a time, adding per-fetch overhead, so for small result sets buffered queries are usually faster.

Why It Matters

Without streaming, a query returning 5 million rows buffers all data into PHP memory before your code even executes, causing fatal memory exhaustion errors in CLI scripts and queue workers that process large datasets.
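The memory difference is easy to demonstrate without a database - a self-contained sketch where a generator and an array stand in for streamed and buffered result sets:

```php
<?php
declare(strict_types=1);

// Buffered stand-in: all rows materialised in one array at once.
function bufferedRows(int $n): array
{
    $rows = [];
    for ($i = 0; $i < $n; $i++) {
        $rows[] = ['id' => $i, 'payload' => str_repeat('x', 100)];
    }
    return $rows;
}

// Streamed stand-in: rows produced one at a time by a generator.
function streamedRows(int $n): Generator
{
    for ($i = 0; $i < $n; $i++) {
        yield ['id' => $i, 'payload' => str_repeat('x', 100)];
    }
}

$before = memory_get_usage();
foreach (streamedRows(50_000) as $row) {
    // each row is discarded after this iteration
}
$streamedDelta = memory_get_usage() - $before;

$before = memory_get_usage();
$all = bufferedRows(50_000);
$bufferedDelta = memory_get_usage() - $before;

// Streamed iteration stays near-constant; buffering grows with row count.
printf("streamed: ~%d bytes, buffered: ~%d bytes\n", $streamedDelta, $bufferedDelta);
```

The exact byte counts vary by PHP version, but the buffered delta grows linearly with the row count while the streamed delta does not.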

Common Mistakes

  • Running another query on the same connection before consuming all streamed rows - causes 'Commands out of sync' error.
  • Using MYSQLI_USE_RESULT with mysqli_num_rows() - row count is unavailable until all rows are fetched.
  • Forgetting to call mysqli_free_result() when stopping iteration early - connection remains locked.
  • Streaming in web requests where connection time is limited - streaming is for long-running CLI processes.
  • Combining unbuffered queries with transactions that time out - the transaction may roll back mid-stream.
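The first and third mistakes interact: if you break out of the loop early, unread rows are still pending on the connection. A short mysqli sketch of the safe pattern (shouldStop(), processRow(), and the table names are hypothetical):

```php
<?php
declare(strict_types=1);

$mysqli = new mysqli('localhost', 'user', 'pass', 'db');
$result = $mysqli->query('SELECT * FROM huge_table', MYSQLI_USE_RESULT);

while ($row = $result->fetch_assoc()) {
    if (shouldStop($row)) {
        break; // unread rows are still pending on the connection
    }
    processRow($row);
}

// Free the pending result set first, otherwise the next query fails
// with 'Commands out of sync; you can't run this command now'.
$result->free();

$mysqli->query("UPDATE jobs SET status = 'done'"); // now safe
```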

Avoid When

  • Web requests with short timeouts - streaming ties up the connection for the entire duration.
  • Small result sets under 10,000 rows - buffered queries are simpler and faster.
  • When you need mysqli_num_rows() before iteration - row count unavailable in streaming mode.
  • Multiple concurrent queries needed on the same connection - streaming locks the connection.

When To Use

  • CLI scripts processing millions of rows where memory is constrained.
  • Data export jobs that write rows directly to file/CSV without storing in memory.
  • ETL pipelines transforming large datasets row by row.
  • Queue workers processing batch jobs with large query results.
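For the export use case, the writing side can be kept streaming-friendly by accepting any iterable - this hypothetical helper works equally with a streaming generator like streamQuery() under Code Examples or, as in the demo, a plain array standing in for a streamed result:

```php
<?php
declare(strict_types=1);

/**
 * Write rows from any iterable to a CSV stream without holding
 * more than one row in memory. Returns the number of data rows.
 */
function exportCsv(iterable $rows, $stream): int
{
    $count = 0;
    $headerWritten = false;
    foreach ($rows as $row) {
        if (!$headerWritten) {
            // header row taken from the first row's keys
            fputcsv($stream, array_keys($row), ',', '"', '\\');
            $headerWritten = true;
        }
        fputcsv($stream, array_values($row), ',', '"', '\\');
        $count++;
    }
    return $count;
}

// Demo: an array stands in for a streamed result set.
$rows = [
    ['id' => 1, 'name' => 'alpha'],
    ['id' => 2, 'name' => 'beta'],
];
$out = fopen('php://temp', 'r+');
$written = exportCsv($rows, $out);
rewind($out);
echo stream_get_contents($out);
// id,name
// 1,alpha
// 2,beta
```

Because the function never accumulates rows, piping a streamed query through it keeps the whole export at constant memory.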

Code Examples

✗ Vulnerable
<?php
// FATAL: Loads 10 million rows into PHP memory before loop starts
$mysqli = new mysqli('localhost', 'user', 'pass', 'db');
$result = $mysqli->query('SELECT * FROM huge_table'); // Default: MYSQLI_STORE_RESULT

// All 10M rows already in memory - too late
while ($row = $result->fetch_assoc()) {
    processRow($row);
}

// Memory exhausted before loop even begins:
// Fatal error: Allowed memory size of 134217728 bytes exhausted
✓ Fixed
<?php
declare(strict_types=1);

$mysqli = new mysqli('localhost', 'user', 'pass', 'db');
$mysqli->options(MYSQLI_OPT_INT_AND_FLOAT_NATIVE, true);

// MYSQLI_USE_RESULT = streaming mode - rows fetched on demand
$result = $mysqli->query(
    'SELECT * FROM huge_table',
    MYSQLI_USE_RESULT
);

if ($result === false) {
    throw new RuntimeException($mysqli->error);
}

// Memory stays constant - one row at a time
while ($row = $result->fetch_assoc()) {
    processRow($row);
    // Row is discarded after processing, memory freed
}

// CRITICAL: Free result before any other query on this connection
$result->free();

// Generator pattern (available since PHP 5.5) for cleaner iteration
function streamQuery(mysqli $db, string $sql): Generator
{
    $result = $db->query($sql, MYSQLI_USE_RESULT);
    if ($result === false) {
        throw new RuntimeException($db->error);
    }
    
    try {
        while ($row = $result->fetch_assoc()) {
            yield $row;
        }
    } finally {
        $result->free();
    }
}

// Usage - memory-efficient foreach
foreach (streamQuery($mysqli, 'SELECT * FROM huge_table') as $row) {
    processRow($row);
}

DEV INTEL Tools & Severity
🟠 High ⚙ Fix effort: Low
⚡ Quick Fix
Pass MYSQLI_USE_RESULT as second argument to mysqli::query() and always call free() before running another query on the same connection
📦 Applies To
PHP 8.0+ cli queue-worker
🔍 Detection Hints
mysqli->query\([^)]+\)(?!.*MYSQLI_USE_RESULT)
Auto-detectable: ✓ Yes phpstan psalm
