API Composition Pattern
debt(d8/e7/b7/t5)
Closest to 'silent in production until users hit it' (d9), -1. The term's detection_hints explicitly state automated detection is 'no'. There are no tools listed that can detect missing or incorrect API composition. However, performance testing or latency monitoring could surface the issue before end users feel it (e.g., noticing 800ms response times), so d8 rather than d9.
Closest to 'cross-cutting refactor across the codebase' (e7). The quick_fix is vague ('see documentation'), indicating no simple one-line fix. Introducing or restructuring an API composition layer touches multiple services, requires new infrastructure (BFF/gateway), changes client contracts, and involves adding parallel fan-out, partial response handling, and caching strategies. This is a cross-cutting architectural change spanning multiple files and services.
Closest to 'strong gravitational pull' (b7). Once an API composition layer is introduced (BFF, gateway aggregator, GraphQL resolver), every new feature that aggregates data flows through it. The common_mistakes show ongoing maintenance costs: managing per-service TTLs, preventing business logic creep into the composition layer, and handling partial failures. It applies across web and CLI contexts and shapes how all client-facing endpoints are designed. Not quite b9 (you can incrementally refactor individual endpoints), but it strongly shapes system architecture.
Closest to 'notable trap (a documented gotcha most devs eventually learn)' (t5). The misconception is that API composition is always the API gateway's responsibility, when in fact any layer can compose (BFF, GraphQL resolver, client). Additionally, common_mistakes reveal non-obvious traps: developers naturally write sequential calls instead of parallel, neglect partial response handling, and let business logic accumulate in the composition layer. These are learned-the-hard-way gotchas but well-documented in microservice literature.
Also Known As
API Aggregation; Gateway Aggregation (when the composition happens at the API gateway).
TL;DR
One intermediate layer calls multiple microservices in parallel and returns a single assembled response, trading a thin extra hop for fewer client round trips and a stable client contract.
Explanation
When a client needs data from multiple microservices, an intermediate layer (API gateway, BFF, or dedicated composition service) calls all services in parallel and assembles the response. Benefits: fewer client round trips (1 vs N), parallel service calls reduce total latency to the slowest single call, stable API despite backend service changes. Implementation: ReactPHP or Guzzle async for parallel HTTP calls. Distinguished from BFF: composition focuses on data aggregation; BFF focuses on client-specific concerns.
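The latency claim is simple arithmetic: serial client calls add up, while a parallel fan-out is bounded by the slowest single call. A minimal sketch (the 100ms figures are illustrative, matching the Code Examples section):

```javascript
// Five downstream calls, each ~100ms (illustrative figures).
const latenciesMs = [100, 100, 100, 100, 100];

// Serial client calls: total latency is the sum of all calls.
const serialTotal = latenciesMs.reduce((sum, ms) => sum + ms, 0);

// Parallel fan-out in the composer: total is bounded by the slowest call.
const parallelTotal = Math.max(...latenciesMs);

console.log(serialTotal, parallelTotal); // 500 vs 100
```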
Watch Out
There is no automated detection for missing or sequential composition; the problem stays silent in production until users feel the latency (e.g., an 800ms dashboard response). Surface it proactively through performance testing or latency monitoring.
Common Misconception
That API composition is always the API gateway's responsibility. In fact, any layer can compose: a BFF, an API gateway, a GraphQL resolver, or even the client itself.
Why It Matters
Without composition, every client re-implements the fan-out: N round trips instead of 1, serial latency instead of parallel, and a client contract coupled to internal service topology.
Common Mistakes
- Sequential service calls where parallel is possible — adds unnecessary latency
- No partial response handling — one slow service blocks the entire response
- Composition layer accumulating business logic — keep it thin, business logic stays in services
- Caching composed responses without considering different per-service TTLs
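The second mistake above (no partial response handling) can be mitigated by settling each call independently rather than failing the whole fan-out. A hedged JavaScript sketch using Promise.allSettled; the service-client callbacks and fallback shapes are hypothetical stand-ins:

```javascript
// Compose a dashboard response that degrades gracefully when one
// downstream service fails. fetchUser / fetchOrders are hypothetical
// service-client calls, injected here to keep the sketch self-contained.
async function composeDashboard(fetchUser, fetchOrders) {
  const [user, orders] = await Promise.allSettled([
    fetchUser(),
    fetchOrders(),
  ]);

  return {
    // A rejected call yields a fallback field instead of failing the response.
    user: user.status === 'fulfilled' ? user.value : null,
    orders: orders.status === 'fulfilled' ? orders.value : [],
    // Flag partial responses so clients can decide how to render them.
    partial: user.status === 'rejected' || orders.status === 'rejected',
  };
}
```

With Promise.allSettled, a single failing service produces a degraded field rather than a 500 for the entire composed response.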
Avoid When
- Avoid composition when services have strong transactional requirements — aggregating across services does not give you atomicity.
- Do not compose across services with very different SLAs — the slowest dependency sets the latency of the whole response.
- Avoid deep composition chains (composer calling composer) — cascading failures become difficult to trace and retry.
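Where composition across different SLAs is unavoidable, a per-call deadline keeps the slowest dependency from setting the latency of the whole response. A minimal sketch; the withTimeout helper and fallback value are illustrative, not a library API:

```javascript
// Race a service call against a deadline and resolve with a fallback
// instead of waiting indefinitely. The timer is not cancelled on the
// fast path; a production version would clear it.
function withTimeout(promise, ms, fallback) {
  const deadline = new Promise((resolve) =>
    setTimeout(() => resolve(fallback), ms)
  );
  return Promise.race([promise, deadline]);
}
```

The composer then calls withTimeout around each fan-out promise, so a slow dependency yields its fallback while fast dependencies return normally.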
When To Use
- Use API composition in a BFF (Backend for Frontend) or API gateway to aggregate data from multiple services into one client response.
- Apply parallel fan-out when the downstream calls are independent, collecting all results concurrently rather than sequentially.
- Use composition to shield clients from internal service topology changes — the client calls one stable endpoint.
Code Examples
// Client makes 5 serial requests — 500ms latency:
const user = await fetch('/api/users/42'); // 100ms
const orders = await fetch('/api/orders?userId=42'); // 100ms
const reviews = await fetch('/api/reviews?userId=42'); // 100ms
const balance = await fetch('/api/wallet/42'); // 100ms
const settings = await fetch('/api/settings/42'); // 100ms
// Total: 500ms serial latency
// Composition service: parallel server-side calls.
// Note: Guzzle's Promise\all() / Utils::all() returns a promise, so the
// results must be resolved with ->wait() before they can be used.
use GuzzleHttp\Promise\Utils;

class DashboardComposer {
    public function compose(int $userId): array {
        // Fire all 5 calls in parallel via Guzzle async promises:
        $promises = [
            'user'     => $this->users->get($userId),
            'orders'   => $this->orders->byUser($userId),
            'reviews'  => $this->reviews->byUser($userId),
            'balance'  => $this->wallet->balance($userId),
            'settings' => $this->settings->get($userId),
        ];
        // Client: 1 request, ~100ms (parallel) vs 500ms (serial).
        // Utils::all() rejects if any call fails; wait() resolves them all.
        return Utils::all($promises)->wait();
    }
}