Frontend for Backend Engineers

Streaming UIs and Suspense

Ravinder · 6 min read
Frontend · React · TypeScript · Streaming · Server Components · React 19

HTTP chunked transfer encoding is not new to you. You have streamed large datasets, piped file downloads, and sent server-sent events. React's streaming SSR (introduced in React 18 and central to React 19's model) applies exactly the same principle to HTML: send what you have, stream the rest as it becomes available.

This is the convergence point where your backend intuitions are most directly applicable. The problem being solved is identical — latency hiding for slow data dependencies — and the solution is structurally the same.

The Problem: Waterfall Rendering

Traditional SSR blocks on all data before sending a byte of HTML. If your dashboard has five data dependencies and one takes 800ms, the user stares at a blank screen for 800ms before seeing anything.

sequenceDiagram
    participant B as Browser
    participant S as Server
    participant DB as Database
    B->>S: GET /dashboard
    S->>DB: fetch user (10ms)
    DB-->>S: user data
    S->>DB: fetch orders (800ms) ← slow
    DB-->>S: orders data
    S->>DB: fetch notifications (50ms)
    DB-->>S: notification data
    Note over S: All data ready — NOW render HTML
    S-->>B: Full HTML (860ms TTFB)

The fix: start streaming HTML immediately, and slot in the slow parts as they resolve.
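The send-early, fill-later flow can be sketched without React at all. A minimal simulation (section names, delays, and the HTML stubs are illustrative): an async generator that yields the shell immediately, then emits each section in completion order rather than declaration order.

```typescript
// Framework-free sketch of streaming rendering: flush the shell at once,
// then flush each section the moment its data resolves.
async function* streamPage(
  sections: Record<string, Promise<string>>
): AsyncGenerator<string> {
  // The shell has no data dependency: send it before anything resolves
  yield "<header>shell</header>";

  // Tag each section's promise with its name and index so we can
  // race them and emit whichever finishes first
  const tagged = Object.entries(sections).map(([name, p], i) =>
    p.then((html) => ({ name, html, i }))
  );
  const settled = new Set<number>();
  while (settled.size < tagged.length) {
    // Race only the sections that have not flushed yet
    const next = await Promise.race(tagged.filter((_, i) => !settled.has(i)));
    settled.add(next.i);
    yield `<section id="${next.name}">${next.html}</section>`;
  }
}
```

Yield order tracks data readiness, not declaration order: with `orders` at 80ms and `profile` at 10ms, the profile section flushes first. This is the essence of what React's streaming renderer does per Suspense boundary, minus the inline scripts that slot each chunk into place.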

React Suspense: The Client-Side Primitive

Suspense lets you declare a loading boundary. When a child component is "suspended" (waiting for async data), React renders the fallback instead:

import { Suspense } from "react";
 
function Dashboard() {
  return (
    <div className="dashboard">
      {/* Renders immediately — no async dependency */}
      <Header />
 
      {/* Suspended until user data resolves */}
      <Suspense fallback={<ProfileSkeleton />}>
        <UserProfile />
      </Suspense>
 
      {/* Independently suspended — slow orders don't block notifications */}
      <Suspense fallback={<OrdersSkeleton />}>
        <OrderHistory />
      </Suspense>
 
      <Suspense fallback={<NotificationsSkeleton />}>
        <NotificationList />
      </Suspense>
    </div>
  );
}

Each Suspense boundary is an independent loading unit. Slow components do not block fast siblings — analogous to parallel database queries in your server code.

React Server Components (RSC)

Server Components run on the server (or at build time) and never ship their JavaScript to the browser. They can be async, fetch data directly, and pass the result to client components as props.

// app/dashboard/page.tsx — Server Component (no "use client" directive)
// This file runs on the server. No JS bundle cost.
export default async function DashboardPage() {
  // Direct DB/API call — no fetch overhead, no API route needed
  const user = await db.users.findById(getCurrentUserId());
 
  return (
    <div>
      {/* Server component: renders to HTML, no client JS */}
      <ServerRenderedHeader user={user} />
 
      {/* Async server component: streams in through the Suspense boundary */}
      <Suspense fallback={<OrdersSkeleton />}>
        <OrdersSection userId={user.id} />
      </Suspense>
    </div>
  );
}
 
// app/dashboard/orders-section.tsx — also a Server Component, can be async
export async function OrdersSection({ userId }: { userId: string }) {
  const orders = await fetchOrders(userId); // server-side, no client exposure
  return <OrderList orders={orders} />;
}
// components/interactive-button.tsx — Client Component
"use client"; // this directive makes it a client component
 
import { useState } from "react";
 
export function ToggleButton({ label }: { label: string }) {
  const [active, setActive] = useState(false);
  return (
    <button onClick={() => setActive(a => !a)}>
      {active ? "Active" : label}
    </button>
  );
}

Streaming SSR with Suspense Boundaries

In the Next.js App Router, Suspense boundaries become streaming flush points. The server sends the shell HTML immediately, then, as each boundary's data resolves, streams its HTML along with a small inline <script> tag that swaps it in place of the fallback:

sequenceDiagram
    participant B as Browser
    participant S as Server
    B->>S: GET /dashboard
    S-->>B: Initial HTML chunk (shell: header, skeletons) ← immediate
    Note over B: User sees layout instantly
    S-->>B: Chunk: UserProfile HTML (10ms dependency resolved)
    Note over B: Profile appears
    S-->>B: Chunk: Notifications HTML (50ms dependency resolved)
    Note over B: Notifications appear
    S-->>B: Chunk: Orders HTML (800ms dependency resolved)
    Note over B: Orders appear — full page loaded

The browser's TCP connection stays open (HTTP/1.1 chunked or HTTP/2 stream). The server flushes HTML as each Suspense boundary's data resolves. TTFB drops dramatically; the slow dependency no longer blocks the fast ones.
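None of these mechanics are React-specific. A minimal Node sketch of the same two-flush response (the handler, delay, and markup are illustrative; React additionally emits inline scripts to move each late chunk into its boundary):

```typescript
import { createServer } from "node:http";

// Node uses Transfer-Encoding: chunked automatically when no
// Content-Length is set, so each res.write() is a flushable chunk.
export const server = createServer((_req, res) => {
  res.writeHead(200, { "Content-Type": "text/html" });
  // Flush 1: the shell, sent before any slow data is ready
  res.write('<header>shell</header><div id="orders">loading…</div>');
  // setTimeout stands in for a slow data dependency
  setTimeout(() => {
    // Flush 2: the resolved section, streamed over the same open connection
    res.write('<template data-for="orders">42 orders</template>');
    res.end();
  }, 50);
});
```

The connection stays open between the two writes, exactly as in the diagram above; the client can start parsing and painting the shell while the server is still waiting on data.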

// Next.js App Router: loading.tsx is an automatic Suspense fallback
// app/dashboard/loading.tsx
export default function DashboardLoading() {
  return <DashboardSkeleton />;
}
 
// app/dashboard/page.tsx
// Async components suspend automatically — no explicit Suspense needed at page level
export default async function DashboardPage() {
  // These run in parallel via Promise.all or independent resolution
  const [user, config] = await Promise.all([
    fetchUser(),
    fetchConfig(),
  ]);
 
  return <Dashboard user={user} config={config} />;
}

React 19: use() Hook and Actions

React 19 formalizes the patterns that emerged from RSC. The use() hook lets you read a Promise or Context inside a component, integrating with Suspense:

import { use, Suspense } from "react";
 
// Pass a Promise from server to client — resolves with Suspense
function PostPage({ postPromise }: { postPromise: Promise<Post> }) {
  const post = use(postPromise); // suspends until resolved
  return <article>{post.content}</article>;
}
 
// app/actions.ts — Server Action: form submission handled server-side
// ("use server" must be the first statement of the file or function body)
"use server";
import { revalidatePath } from "next/cache";
 
export async function submitComment(formData: FormData) {
  const comment = formData.get("comment") as string;
  await db.comments.create({ content: comment });
  revalidatePath("/posts");
}
 
// components/comment-form.tsx — client component uses the server action directly
"use client";
import { submitComment } from "@/app/actions";
 
export function CommentForm({ postId }: { postId: string }) {
  return (
    <form action={submitComment}>
      {/* postId travels with the submission; the action can read it via formData.get("postId") */}
      <input type="hidden" name="postId" value={postId} />
      <input name="comment" required />
      <button type="submit">Post</button>
    </form>
  );
}

Server Actions are the framework-integrated equivalent of your API endpoints — form submissions go directly to server-side logic without a separately defined API route.
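For intuition, here is roughly what such an action reduces to once the framework wiring is stripped away: parse the form body, validate, persist — the same code you would otherwise put behind a hand-written POST route. The in-memory `db` stand-in and validation details below are illustrative, not part of any real API.

```typescript
// In-memory stand-in for the database a real action would call
export const db = { comments: [] as { content: string }[] };

// The core of a comment-submission Server Action, framework-free
export async function handleSubmitComment(formData: FormData): Promise<void> {
  const comment = formData.get("comment");
  if (typeof comment !== "string" || comment.trim() === "") {
    // a real action would surface this to the form as a validation error
    throw new Error("comment is required");
  }
  db.comments.push({ content: comment.trim() });
  // a real Next.js action would now call revalidatePath("/posts")
  // so cached pages pick up the new comment
}
```

The framework contributes the transport: serializing the form submission, routing it to this function on the server, and revalidating affected pages — the parts you would otherwise build as an API endpoint plus a client-side fetch.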

Parallel Data Fetching

The most common mistake with async server components is sequential awaits:

// SLOW: sequential — total time = t1 + t2 + t3
async function SlowPage() {
  const user    = await fetchUser();     // 100ms
  const orders  = await fetchOrders();   // 200ms
  const reviews = await fetchReviews();  // 150ms
  // Total: 450ms
}
 
// FAST: parallel — total time = max(t1, t2, t3)
async function FastPage() {
  const [user, orders, reviews] = await Promise.all([
    fetchUser(),     // 100ms
    fetchOrders(),   // 200ms
    fetchReviews(),  // 150ms
  ]);
  // Total: 200ms
}

For independent data that should stream independently (so a slow dependency does not block a fast one), split into separate async child components each wrapped in their own Suspense boundary.

Key Takeaways

  • React streaming SSR applies HTTP chunked transfer semantics to HTML: send the shell immediately, flush slow sections as their data resolves.
  • Suspense boundaries are independent streaming flush points — wrap each slow section in its own Suspense rather than a single outer boundary.
  • React Server Components run on the server with zero client JS cost; mark only interactive components with "use client".
  • Prefer Promise.all for independent parallel fetches in server components — sequential await creates the same waterfall problem you avoid on the backend.
  • React 19's use() hook and Server Actions integrate promises and form submissions directly into the React model without explicit API route indirection.
  • The mental model is identical to backend streaming: start responding early, fill in slow data as it becomes available, and keep dependencies independent.