Event Storming
debt(d7/e5/b5/t3)
Closest to 'only careful code review or runtime testing' (d7). The detection_hints list Miro, Mural, and Figma — collaborative whiteboarding tools with no automated analysis capability. The code_pattern cites 'New project starting without domain modeling; unclear bounded contexts; business experts and developers speaking different languages' — all of which are organizational and architectural signals invisible to any tool, surfacing only through careful review of project structure, communication breakdowns, or retrospective pain. No automated detection is possible (automated: no).
Closest to 'touches multiple files / significant refactor in one component' (e5). The quick_fix describes running a 2-hour workshop session involving domain experts and developers, mapping events, commands, and actors, and identifying bounded contexts. This is neither a one-line patch (e1) nor a simple parameter swap (e3): it requires coordinating multiple stakeholders, preparing materials, facilitating the session, and then acting on the outputs. However, it stops short of a full architectural rewrite (e7+), making e5 the best fit.
Closest to 'persistent productivity tax' (b5). The applies_to field covers web and cli contexts broadly, and the common_mistakes reveal ongoing costs: failing to document hotspots, treating the output as final, and needing iterative revisits as understanding deepens. The workshop shapes bounded context decisions that ripple across the codebase, imposing a moderate but sustained gravitational pull on architecture decisions. It doesn't redefine the entire system shape (b9), but it does influence many work streams (b5).
Closest to 'minor surprise (one edge case)' (t3). The misconception field identifies a single, well-scoped wrong belief: that Event Storming requires a physical room with sticky notes and cannot be done remotely. This is a modest trap — remote tooling works well and the correction is straightforward — rather than a deep behavioral contradiction or catastrophic misunderstanding. Common mistakes around facilitation and sequencing add minor additional confusion but don't elevate to t5.
Also Known As
TL;DR
A collaborative workshop in which domain experts and developers map a business domain as a timeline of events on coloured sticky notes, surfacing bounded contexts, a shared ubiquitous language, and the questions nobody can yet answer.
Explanation
Event Storming, created by Alberto Brandolini, brings domain experts and developers together to map a business domain using coloured sticky notes: orange (domain events), blue (commands that cause them), yellow (actors), pink (external systems), red (hotspots/questions). The result is a shared understanding of the domain timeline, natural service boundaries, and the ubiquitous language. Big Picture storming explores the entire domain; Process Level drills into specific flows; Design Level produces aggregate designs.
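The sticky-note convention above can be sketched as a small data model. This is a minimal illustration only — the names (StickyNote, noteColour) are assumptions, not any standard Event Storming API:

```typescript
// The five sticky-note kinds used in the workshop convention
type NoteKind = "domain-event" | "command" | "actor" | "external-system" | "hotspot";

interface StickyNote {
  kind: NoteKind;
  text: string;
}

// Colour mapping from the convention described above
const noteColour: Record<NoteKind, string> = {
  "domain-event": "orange",
  "command": "blue",
  "actor": "yellow",
  "external-system": "pink",
  "hotspot": "red",
};

// A fragment of a Big Picture timeline, ordered left to right:
// a command causes an event; a red note marks an open question
const timeline: StickyNote[] = [
  { kind: "command", text: "Place order" },
  { kind: "domain-event", text: "OrderPlaced" },
  { kind: "hotspot", text: "Who owns stock reservation?" },
];

console.log(timeline.map(n => `${noteColour[n.kind]}: ${n.text}`).join("\n"));
```

Keeping the timeline as ordered data rather than prose preserves the left-to-right causality that the physical (or virtual) wall encodes.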
Common Misconception
That Event Storming requires a physical room and paper sticky notes, and cannot be done remotely. Remote sessions on collaborative whiteboarding tools such as Miro, Mural, or Figma work well.
Why It Matters
Without a shared domain model, developers model the domain from technical artefacts such as the database schema, and the mismatch surfaces weeks later as a fundamental redesign. A two-hour storming session instead produces bounded context decisions and a ubiquitous language that ripple across the codebase.
Common Mistakes
- Facilitating without a genuine domain expert — developers storming alone produces a technical model, not a domain model.
- Going straight to Design Level — Big Picture first reveals the bounded contexts before drilling into aggregates.
- Not documenting hotspots — the red sticky notes marking confusion are the most valuable output of the session.
- Treating the output as final — the Event Storm is a starting point; it evolves as understanding deepens.
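Since undocumented hotspots are called out above as a common mistake, one way to carry them out of the session is as structured data rather than photographs of the wall. A hedged sketch — the field names are illustrative assumptions:

```typescript
// One possible record for a red sticky note captured during the session
interface Hotspot {
  question: string;   // what the red sticky said
  raisedBy: string;   // who flagged it
  context: string;    // where on the timeline it appeared
  resolved: boolean;
}

const hotspots: Hotspot[] = [
  {
    question: "Who owns stock reservation?",
    raisedBy: "warehouse lead",
    context: "InventoryReserved",
    resolved: false,
  },
  {
    question: "Can an order be refunded after shipping?",
    raisedBy: "support team",
    context: "OrderShipped",
    resolved: true,
  },
];

// The follow-up list for the next session: every unresolved hotspot
const followUps = hotspots.filter(h => !h.resolved).map(h => h.question);
console.log(followUps);
```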
Code Examples
// Typical outcome without Event Storming:
// Developer models the domain from database schema
class Order extends Model { /* columns mapped directly */ }
class User extends Model { /* columns mapped directly */ }
// Two weeks in: domain expert says 'that is not how orders work'
// Model requires fundamental redesign
// Event Storming output guides the model:
// Domain events discovered:
// OrderPlaced → InventoryReserved → PaymentCaptured → OrderFulfilled → OrderShipped
// Bounded contexts identified:
// Ordering (Order, OrderLine, Customer)
// Inventory (Product, StockLevel, Reservation)
// Payments (Payment, Refund)
// Ubiquitous language agreed:
// 'Order' in Ordering ≠ 'Order' in Fulfillment — separate concepts
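The last line — 'Order' meaning different things in different contexts — can be made concrete in code. A sketch of the bounded-context split, with two deliberately separate Order types and an explicit translation at the boundary; the type shapes and the toFulfillment helper are illustrative assumptions:

```typescript
namespace Ordering {
  // 'Order' as the Ordering context understands it: what the customer bought
  export interface Order {
    orderId: string;
    customerId: string;
    lines: { productId: string; quantity: number }[];
  }
}

namespace Fulfillment {
  // A different concept sharing the name 'Order': what the warehouse ships
  export interface Order {
    orderId: string;
    pickList: string[];
    shippedAt?: Date;
  }
}

// Translation happens explicitly at the context boundary,
// rather than forcing one shared Order model on both contexts
function toFulfillment(o: Ordering.Order): Fulfillment.Order {
  return { orderId: o.orderId, pickList: o.lines.map(l => l.productId) };
}
```

Keeping the two types separate means a change to how Ordering prices an order cannot silently break how Fulfillment picks it.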