Pagination Strategies
Also Known As
SQL pagination
LIMIT OFFSET
keyset pagination
cursor pagination
TL;DR
Techniques for splitting large result sets across pages to avoid loading unbounded data in a single query.
Explanation
Offset pagination (LIMIT x OFFSET y) is simple but degrades at large offsets because the database must scan and discard all preceding rows. Keyset (cursor) pagination (WHERE id > :last_id ORDER BY id LIMIT x) seeks directly to the resume point with an O(log n) index lookup regardless of page depth, making it suitable for deep pagination and infinite scroll. For PHP applications with large datasets, keyset pagination is the default recommendation. Total count queries for offset pagination are expensive; consider approximate counts or omitting page counts for large datasets.
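For the approximate-count option, most engines keep a cheap row estimate in their catalog. A minimal sketch, assuming MySQL/InnoDB and a PDO connection in $pdo; table statistics lag behind writes, so treat the value as an estimate only:
// Approximate row count from InnoDB table statistics: fast, but not exact
$stmt = $pdo->prepare(
    'SELECT TABLE_ROWS FROM information_schema.TABLES
     WHERE TABLE_SCHEMA = DATABASE() AND TABLE_NAME = ?'
);
$stmt->execute(['posts']);
$approxTotal = (int) $stmt->fetchColumn();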
Common Misconception
✗ LIMIT/OFFSET pagination is efficient for all pages. OFFSET N scans and discards the first N rows on every query: page 1,000 at 20 items per page touches roughly 20,000 rows. Keyset (cursor) pagination using WHERE id > last_seen_id seeks straight to the resume point through the index, so per-page cost stays effectively constant regardless of page depth.
Why It Matters
Loading all records at once is a common performance anti-pattern — pagination limits result sets to manageable chunks, keeping memory usage flat and response times consistent regardless of total data size.
Common Mistakes
- OFFSET-based pagination on large datasets — OFFSET 100000 still scans 100K rows before returning results.
- Not using cursor-based (keyset) pagination for high-volume feeds — use WHERE id > :last_id LIMIT 20.
- Running COUNT(*) on every page request — either skip the total entirely (infinite scroll) or cache it; counting a large table on each request is expensive (a caching sketch follows this list).
- Allowing unlimited page sizes via the API — ?per_page=999999 loads the entire table (clamping shown in the sketch after this list).
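The last two mistakes have a short fix. A minimal sketch, assuming a PDO connection in $pdo and the APCu extension; the cache key, the 1–100 range, and the 600-second TTL are illustrative choices, not requirements:
// Clamp the client-supplied page size to a sane range
$perPage = max(1, min((int) ($_GET['per_page'] ?? 20), 100));

// Cache the expensive COUNT(*) instead of running it on every request
$total = apcu_entry('posts_total', function () use ($pdo) {
    return (int) $pdo->query('SELECT COUNT(*) FROM posts')->fetchColumn();
}, 600);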
Avoid When
- Offset pagination on large tables — OFFSET 10000 scans and discards 10,000 rows; use keyset pagination instead.
- Paginating tiny result sets — if the total is always under 20 rows, returning all at once is simpler.
- Using page numbers in APIs that other services consume — page numbers are unstable when data is inserted or deleted between requests.
When To Use
- Any list endpoint that can return more than ~50 items — returning everything at once risks memory exhaustion and slow responses.
- Cursor or keyset pagination for feeds and timelines where consistent ordering and stability matter (a composite-key sketch follows this list).
- User-facing tables and search results where navigation by page number is expected.
- Infinite scroll UIs where the next page is loaded as the user approaches the bottom.
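When a feed is ordered by a non-unique column such as created_at, the keyset needs a unique tiebreaker so rows are never skipped or duplicated across pages. A minimal sketch, assuming a PDO connection in $pdo and a composite index on (created_at, id); the row-value comparison works in PostgreSQL and recent MySQL, while older MySQL versions need the expanded OR form:
// Composite keyset: resume strictly after the last (created_at, id) pair seen
$lastCreatedAt = $_GET['last_created_at'] ?? '9999-12-31 23:59:59'; // illustrative cursor fields
$lastId        = (int) ($_GET['last_id'] ?? PHP_INT_MAX);

$stmt = $pdo->prepare(
    'SELECT * FROM posts
     WHERE (created_at, id) < (?, ?)
     ORDER BY created_at DESC, id DESC
     LIMIT 20'
);
$stmt->execute([$lastCreatedAt, $lastId]);
$rows = $stmt->fetchAll(PDO::FETCH_OBJ);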
Code Examples
✗ Vulnerable
// OFFSET pagination: slow on large tables because the engine reads and discards every preceding row
$page = max(1, (int) ($_GET['page'] ?? 1));
$rows = $pdo->query('SELECT * FROM posts ORDER BY id DESC LIMIT 20 OFFSET ' . (($page - 1) * 20))
            ->fetchAll(PDO::FETCH_OBJ);
✓ Fixed
// Cursor/keyset pagination: the index seek keeps per-page cost flat regardless of depth
$cursor = (int) ($_GET['cursor'] ?? PHP_INT_MAX);
$stmt = $pdo->prepare('SELECT * FROM posts WHERE id < ? ORDER BY id DESC LIMIT 21');
$stmt->execute([$cursor]);
$rows = $stmt->fetchAll(PDO::FETCH_OBJ); // fetch one extra row to detect a next page

$hasMore    = count($rows) === 21;
$items      = array_slice($rows, 0, 20);
$nextCursor = $hasMore ? end($items)->id : null;
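Many APIs additionally encode the cursor so clients treat it as opaque rather than as a raw id. One common choice is URL-safe base64 over a small JSON payload; a minimal sketch (the payload shape here is an assumption, not a standard):
// Encode: hide the cursor's internal shape from clients
$encoded = $nextCursor !== null
    ? rtrim(strtr(base64_encode(json_encode(['id' => $nextCursor])), '+/', '-_'), '=')
    : null;

// Decode on the next request; fall back to the first page on malformed input
$decoded = json_decode(base64_decode(strtr((string) ($_GET['cursor'] ?? ''), '-_', '+/')), true);
$cursor  = is_array($decoded) && isset($decoded['id']) ? (int) $decoded['id'] : PHP_INT_MAX;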
Tools & Severity
🟡 Medium
⚙ Fix effort: Medium
⚡ Quick Fix
Use keyset/cursor pagination (WHERE id > :last_id LIMIT 20) instead of OFFSET for large datasets — OFFSET scans and discards rows, keyset uses the index directly
📦 Applies To
PHP 5.0+
web
api
laravel
symfony
🔍 Detection Hints
OFFSET pagination on large tables: LIMIT 20 OFFSET 10000 — full scan to row 10020 then discard 10000
Auto-detectable:
✓ Yes
laravel-debugbar
mysql-slow-query-log
clockwork
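One manual check, assuming MySQL: run EXPLAIN on the paginated query and read the rows estimate; a deep OFFSET makes the engine expect to walk thousands of rows for a 20-row page.
EXPLAIN SELECT * FROM posts ORDER BY id DESC LIMIT 20 OFFSET 10000;
-- a rows estimate near 10020 for a 20-row page is the offset-scan signature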
🤖 AI Agent
Confidence: Medium
False Positives: Medium
✗ Manual fix
Fix: Medium
Context: File
Tests: Update