
Real User Monitoring (RUM)

observability ES2015 Intermediate

Also Known As

RUM, field data, user experience monitoring

TL;DR

Collecting performance and error data from real users' browsers in production — capturing real-world conditions that synthetic lab testing cannot reproduce.

Explanation

RUM measures what users actually experience: page load times on 3G in Australia, JavaScript errors on older devices, the full distribution of Core Web Vitals. It uses browser APIs such as Navigation Timing and PerformanceObserver to collect data, then sends it to an analytics backend. Common tools include Datadog RUM, Sentry (for errors), and web-vitals.js (for building custom RUM). Synthetic monitoring (Lighthouse, WebPageTest) tests idealised conditions; RUM reveals the p75/p95 tail that real users hit.
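The p75/p95 tail mentioned above can be made concrete with a small sketch. The sample LCP values below are invented for illustration; the percentile helper is the standard nearest-rank method:

```javascript
// Nearest-rank percentile: sort, then pick the value at ceil(p% * n) - 1.
function percentile(values, p) {
    const sorted = [...values].sort((a, b) => a - b);
    return sorted[Math.max(0, Math.ceil((p / 100) * sorted.length) - 1)];
}

// Hypothetical LCP samples (ms) from ten real page loads.
const lcpSamples = [1200, 1400, 1500, 1600, 1800, 2100, 2600, 3400, 4800, 7200];

const mean = lcpSamples.reduce((a, b) => a + b, 0) / lcpSamples.length;
console.log(mean);                       // 2760 — the average looks borderline
console.log(percentile(lcpSamples, 75)); // 3400 — p75 is already "needs improvement"
console.log(percentile(lcpSamples, 95)); // 7200 — the tail real users actually hit
```

The average hides the tail entirely, which is why Core Web Vitals assessments use p75 rather than the mean.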

Common Misconception

Lighthouse scores represent real user experience — Lighthouse runs on a high-spec laptop with throttled network simulation; real users on budget phones see 3-5× worse performance.

Why It Matters

A 90 Lighthouse score with poor CrUX (Chrome User Experience Report) data means real users are suffering while your lab metrics look green — RUM closes this gap.

Common Mistakes

  • Optimising only for Lighthouse scores without measuring real user data via CrUX or RUM.
  • Not segmenting RUM data by device type, connection speed, or geography — global averages hide local problems.
  • Not setting performance budgets based on RUM p75 — 50th percentile metrics look fine while the p90 is unacceptable.
  • Letting the RUM script itself block page load: load the beacon script asynchronously (e.g. `defer` or a dynamic import) so measurement never slows the pages it measures.
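The segmentation point above can be sketched in a few lines. The samples are hypothetical: a fast-4g majority plus a slow-3g minority, grouped by the `connection` field a RUM beacon might record:

```javascript
// Nearest-rank p75 over a list of metric values.
function p75(values) {
    const sorted = [...values].sort((a, b) => a - b);
    return sorted[Math.ceil(0.75 * sorted.length) - 1];
}

// Hypothetical LCP samples (ms): eight 4g loads, two 3g loads.
const samples = [
    { lcp: 1100, connection: '4g' }, { lcp: 1200, connection: '4g' },
    { lcp: 1300, connection: '4g' }, { lcp: 1400, connection: '4g' },
    { lcp: 1500, connection: '4g' }, { lcp: 1600, connection: '4g' },
    { lcp: 1700, connection: '4g' }, { lcp: 1800, connection: '4g' },
    { lcp: 3900, connection: '3g' }, { lcp: 4600, connection: '3g' },
];

// Global p75 looks "good" (≤ 2500 ms)...
console.log(p75(samples.map(s => s.lcp))); // 1800

// ...but grouping by connection type exposes a "poor" 3g experience.
const bySegment = {};
for (const { lcp, connection } of samples) {
    (bySegment[connection] ??= []).push(lcp);
}
console.log(p75(bySegment['3g'])); // 4600
```

The same grouping works for device type or country: compute percentiles per segment, never only on the global pool.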

Code Examples

✗ Vulnerable
// Synthetic only — misses real user conditions:
npx lighthouse https://example.com
// Score: 92 — team celebrates
// CrUX data: LCP p75 = 4.2s (Poor) on mobile
// Reason: Lighthouse uses cable throttling; real users on 4G are slower
✓ Fixed
// RUM with web-vitals.js:
import { onLCP, onINP, onCLS } from 'web-vitals';

// sendBeacon queues the payload even during page unload, unlike a plain fetch/XHR;
// `id` lets the backend deduplicate repeated reports of the same metric.
function sendToAnalytics({ name, value, rating, id }) {
    navigator.sendBeacon('/analytics', JSON.stringify({
        metric: name, value, rating, id,
        url: location.href, connection: navigator.connection?.effectiveType
    }));
}
onLCP(sendToAnalytics);
onINP(sendToAnalytics);
onCLS(sendToAnalytics);
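web-vitals already attaches a `rating` to each report, but if a backend prefers to re-derive it from the raw value, the published Core Web Vitals thresholds can be applied server-side. A minimal sketch (the `rate` helper and `THRESHOLDS` table are our names, not part of the library):

```javascript
// Published Core Web Vitals thresholds: [good-up-to, poor-above].
const THRESHOLDS = {
    LCP: [2500, 4000], // ms
    INP: [200, 500],   // ms
    CLS: [0.1, 0.25],  // unitless layout-shift score
};

function rate(metric, value) {
    const [good, poor] = THRESHOLDS[metric];
    if (value <= good) return 'good';
    if (value <= poor) return 'needs-improvement';
    return 'poor';
}

console.log(rate('LCP', 1800)); // 'good'
console.log(rate('CLS', 0.2));  // 'needs-improvement'
console.log(rate('INP', 600));  // 'poor'
```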

Added 15 Mar 2026
Edited 22 Mar 2026
DEV INTEL Tools & Severity
🟡 Medium ⚙ Fix effort: Medium
⚡ Quick Fix
Add the web-vitals.js library to your frontend and POST Core Web Vitals (LCP, INP, CLS) to a PHP endpoint — real users on real devices reveal issues that Lighthouse lab tests miss
📦 Applies To
javascript ES2015 web
🔍 Detection Hints
Only Lighthouse synthetic testing; no field data from real users; poor CrUX scores not reflected in lab tests.
Auto-detectable: ✗ No
Related tooling: web-vitals-js, datadog-rum, newrelic-browser, cloudflare-rum
