Event Loop Blocking — Long Tasks
debt(d7/e5/b5/t7)
Closest to 'only careful code review or runtime testing' (d7). The detection_hints list Lighthouse and Chrome DevTools, both of which are specialist profiling tools that require manual invocation and interpretation — not automated linters. The code_pattern hints (.sort, .filter, .map) could theoretically be flagged, but whether they block depends on input size, which static analysis cannot see; the metadata flags no automated signal.
Closest to 'touches multiple files / significant refactor in one component' (e5). The quick_fix involves moving CPU-intensive work to Web Workers or worker_threads — this is not a single-line swap. It requires introducing a new worker file, message-passing wiring, and restructuring the calling code. Breaking loops into setTimeout(0) chunks is simpler but also requires non-trivial refactoring of the computation logic.
Closest to 'persistent productivity tax' (b5). The term applies_to both web and cli contexts broadly. Any codebase doing significant data processing must continuously reason about whether computation stays within event-loop-safe bounds. It doesn't define the system's shape, but it imposes an ongoing discipline tax across many work streams whenever new data-processing features are added.
Closest to 'serious trap — contradicts how a similar concept works elsewhere' (t7). The misconception is explicit: developers believe async/await prevents blocking because it looks like asynchronous code, but async/await only yields on I/O. CPU-bound synchronous code inside async functions still fully blocks the event loop. This directly contradicts the intuition built from using async/await for I/O, making it a serious and well-documented gotcha.
TL;DR
JavaScript runs on a single thread; any synchronous task longer than ~50ms blocks rendering, input, and callbacks. Break heavy work into chunks that yield to the event loop, or move it to Web Workers (browser) / worker_threads (Node.js).
Explanation
The browser and Node.js run JavaScript on a single thread driven by an event loop. Synchronous operations hold that thread exclusively — while a 100k-element array is being sorted, no clicks, animations, or API callbacks can run. The 50ms threshold (from Google's RAIL model) is the point beyond which users perceive delay. Solutions: setTimeout(fn, 0) or requestAnimationFrame() to yield, Web Workers for CPU-bound tasks in browsers, and worker_threads for CPU-bound work in Node.js. For data processing: work in chunks with setImmediate() between batches. Identify blocking tasks with Lighthouse, the Chrome DevTools Performance tab, or the Long Tasks API.
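The core trap can be seen directly. A minimal Node.js sketch (busyWork is an illustrative name, not from the source): the function is declared async, yet its CPU-bound loop still holds the thread and delays a timer that was due immediately.

```javascript
// Looks asynchronous, but contains no await on I/O:
// the while loop is CPU-bound and holds the thread.
async function busyWork(ms) {
  const end = Date.now() + ms;
  while (Date.now() < end) {} // blocks the event loop despite `async`
}

const start = Date.now();
setTimeout(() => {
  // Scheduled for 0 ms, but cannot fire until busyWork releases the thread.
  console.log(`timer fired after ${Date.now() - start} ms`); // ~100 ms, not ~0
}, 0);

busyWork(100); // returns a promise, yet the caller is blocked for ~100 ms
```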
Common Misconception
"async/await makes my code non-blocking." It doesn't: await only yields while waiting on asynchronous operations such as I/O. CPU-bound synchronous code inside an async function still blocks the event loop completely.
Why It Matters
A blocked event loop freezes the UI in the browser and stalls every pending callback in Node.js. Any codebase that processes significant data must continuously keep computation within event-loop-safe bounds, so this is an ongoing tax on every new data-processing feature.
Common Mistakes
- Running heavy synchronous computation (sorting, filtering large arrays) in the main thread.
- Parsing large JSON synchronously: JSON.parse(hugeString).
- Not using Web Workers for CPU-intensive client-side operations.
Code Examples
// Blocks the main thread for hundreds of ms on large arrays:
const sorted = bigArray.sort((a, b) => a.value - b.value); // sorts in place, synchronously
const results = sorted.filter(complexPredicate);
// UI frozen the whole time: no clicks, no paints, no callbacks
// Chunk-based processing that yields between batches:
async function processInChunks(items, chunkSize = 100) {
  for (let i = 0; i < items.length; i += chunkSize) {
    const chunk = items.slice(i, i + chunkSize);
    processChunk(chunk); // app-specific synchronous work per batch
    await new Promise(r => setTimeout(r, 0)); // Yield to event loop
  }
}
// Web Worker for CPU-intensive work in browsers:
// new Worker('worker.js') plus postMessage/onmessage moves the sort off the main thread