JavaScript Generators
debt(d7/e3/b3/t7)
Closest to 'only careful code review or runtime testing' (d7). The detection_hints note automated=no, and while ESLint and TypeScript are listed tools, there is no standard rule that catches misuse of generators (e.g., building a full array instead of using a lazy generator, or incorrectly calling .next()). These mistakes are silent in normal execution — memory bloat only surfaces under load, and the wrong mental model (treating generators like arrays) produces code that runs but wastes resources, requiring careful review or profiling to catch.
Closest to 'simple parameterised fix' (e3). The quick_fix indicates replacing eager array construction with generator functions using function* and yield. This is typically a localised refactor within one component — replacing a data-producing function and updating call sites to use for...of — not a single one-liner, but not a cross-cutting change either.
Closest to 'localised tax' (b3). Generators apply to web and CLI contexts but are a local choice: adopting them in one module or utility does not impose a systematic cost on the rest of the codebase. Callers consume via for...of or .next(), which is standard iterable protocol. The burden is confined to the specific code paths using them.
Closest to 'serious trap' (t7). The misconception field explicitly states developers believe generators are just a different way to write arrays, when they are fundamentally lazy and potentially infinite. Additionally, common_mistakes highlight that calling the generator function returns a generator object (not the first value), and the final .next() returns {done: true} — both contradict intuitions from similar constructs in other languages or from array/promise patterns, making this a serious cognitive trap.
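Both traps described above are easy to see in a minimal sketch: calling the generator function returns a generator object without running any body code, and the final .next() carries the return value with done: true.

```javascript
function* twoValues() {
  yield "a";
  yield "b";
  return "finished"; // becomes the value of the final next()
}

const gen = twoValues();      // returns a generator object; no body code has run
console.log(typeof gen.next); // "function" — an iterator, not the first value

gen.next(); // { value: "a", done: false }
gen.next(); // { value: "b", done: false }
gen.next(); // { value: "finished", done: true }
gen.next(); // { value: undefined, done: true }
```

Note that for...of discards the return value entirely: it stops as soon as done is true, so "finished" is only visible to callers driving .next() by hand.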
Also Known As
TL;DR
Generator functions (function*) produce lazy iterators: each next() call runs the body only as far as the next yield, so values are computed on demand instead of all at once.
Explanation
A generator function (function*) returns a generator object. Calling next() runs the function until the next yield, returns the yielded value, then pauses. Generators are lazy — they compute values on demand rather than all at once. They underlie async/await (which was originally implemented as a syntax transform over generators + Promises). Use cases: infinite sequences, paginated API iteration, streaming data processing, and coroutines.
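The laziness described above is what makes infinite sequences practical: a generator can describe an unbounded stream, and only the values actually pulled are ever computed. A minimal sketch (take is a hypothetical helper, not part of the language):

```javascript
// Infinite sequence — never terminates on its own, yet costs nothing until pulled.
function* naturals() {
  let n = 0;
  while (true) yield n++;
}

// Hypothetical helper: pull the first k values from any iterable.
function* take(iterable, k) {
  let i = 0;
  for (const value of iterable) {
    if (i++ >= k) return;
    yield value;
  }
}

console.log([...take(naturals(), 5)]); // [0, 1, 2, 3, 4]
```

Spreading naturals() directly ([...naturals()]) would loop forever, which is exactly the array-shaped mental model the misconception section warns against.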
Diagram
flowchart TD
  subgraph Generator_Function
    GF[function* generator]
    GF -->|call| ITER[Returns iterator<br/>not executed yet]
    ITER -->|next call| Y1[Run until yield<br/>return value + suspend]
    Y1 -->|next call| Y2[Resume from yield<br/>next value]
    Y2 -->|next call| DONE[done: true<br/>value: undefined]
  end
  subgraph Use_Cases
    LAZY[Lazy sequences<br/>infinite lists]
    ASYNC2[Async orchestration<br/>co-routines]
    ITER2[Custom iterators<br/>Symbol.iterator]
  end
  style GF fill:#6e40c9,color:#fff
  style ITER fill:#1f6feb,color:#fff
  style DONE fill:#d29922,color:#fff
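The Use_Cases panel mentions custom iterators via Symbol.iterator. A minimal sketch, using an assumed range object for illustration: any object whose Symbol.iterator method is a generator becomes consumable by for...of, spread, and destructuring.

```javascript
// A plain object made iterable by implementing Symbol.iterator as a generator.
const range = {
  from: 1,
  to: 3,
  *[Symbol.iterator]() {
    for (let n = this.from; n <= this.to; n++) yield n;
  },
};

console.log([...range]); // [1, 2, 3]
for (const n of range) console.log(n); // 1, then 2, then 3
```

Each consumption calls Symbol.iterator again, so unlike a bare generator object, range can be iterated more than once.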
Common Misconception
"Generators are just a different way to write arrays." They are not: a generator is a lazy, potentially infinite stream that computes each value only when next() is called, and a generator object can be consumed only once.
Why It Matters
Treating generators like arrays produces code that runs but wastes resources: eager construction pulls entire datasets into memory, and the bloat only surfaces under load, where careful review or profiling is the only way to catch it.
Common Mistakes
- Calling the generator function itself — it returns a generator object, not the first value; call .next() to start.
- Not handling generator return values — the final .next() returns {value: returnValue, done: true}.
- Using generators where async/await is clearer — generators are the right tool for lazy sequences, not general async code.
- Not using yield* to delegate to another iterable — manually iterating inside a generator is verbose.
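The last mistake above, manual iteration versus yield* delegation, looks like this in a minimal sketch:

```javascript
function* inner() {
  yield 1;
  yield 2;
}

// Verbose: manually re-yielding every value from the inner generator.
function* manual() {
  for (const v of inner()) yield v;
  yield 3;
}

// Idiomatic: yield* delegates to any iterable, not only generators.
function* delegated() {
  yield* inner();
  yield* [3]; // arrays, strings, Maps, etc. all work here
}

console.log([...manual()]);    // [1, 2, 3]
console.log([...delegated()]); // [1, 2, 3]
```

yield* also forwards .next() arguments and propagates the inner generator's return value, which the manual for...of loop silently drops.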
Code Examples
// Eager: loads all pages into memory before processing.
async function getAllUsers() {
  const users = [];
  let page = 1;
  while (true) {
    const batch = await fetchPage(page++);
    if (!batch.length) break;
    users.push(...batch); // Could be millions of users in memory
  }
  return users;
}

// Generator: holds at most one page in memory at a time.
async function* paginatedUsers() {
  let page = 1;
  while (true) {
    const batch = await fetchPage(page++);
    if (!batch.length) return;
    yield* batch; // Yield each user one at a time
  }
}

// Caller:
for await (const user of paginatedUsers()) {
  await processUser(user); // Never holds all users at once
}