Streaming & Suspense
Stream server-rendered content progressively using React Suspense boundaries and `loading.tsx`.
Recipe
Quick-reference recipe card -- copy-paste ready.
```tsx
// app/dashboard/loading.tsx -- instant loading UI for the entire route
export default function Loading() {
  return <div className="animate-pulse">Loading dashboard...</div>;
}
```

```tsx
// app/dashboard/page.tsx -- granular streaming with Suspense
import { Suspense } from "react";

export default function DashboardPage() {
  return (
    <main>
      <h1>Dashboard</h1>
      <Suspense fallback={<p>Loading stats...</p>}>
        <SlowStats />
      </Suspense>
      <Suspense fallback={<p>Loading chart...</p>}>
        <SlowChart />
      </Suspense>
    </main>
  );
}

async function SlowStats() {
  const stats = await fetchStats(); // 2s delay
  return <StatsGrid data={stats} />;
}
```

When to reach for this: You have a page with slow data sources and want to show content progressively instead of blocking the entire page on the slowest query.
Working Example
```tsx
// app/dashboard/page.tsx
import { Suspense } from "react";
import { RecentOrders } from "./recent-orders";
import { RevenueChart } from "./revenue-chart";
import { TopProducts } from "./top-products";

export default function DashboardPage() {
  return (
    <main className="grid grid-cols-2 gap-6 p-6">
      <h1 className="col-span-2 text-2xl font-bold">Dashboard</h1>
      {/* Each section streams independently */}
      <Suspense fallback={<ChartSkeleton />}>
        <RevenueChart />
      </Suspense>
      <Suspense fallback={<ListSkeleton rows={5} />}>
        <TopProducts />
      </Suspense>
      <div className="col-span-2">
        <Suspense fallback={<TableSkeleton />}>
          <RecentOrders />
        </Suspense>
      </div>
    </main>
  );
}

function ChartSkeleton() {
  return <div className="h-64 bg-gray-100 rounded animate-pulse" />;
}

function ListSkeleton({ rows }: { rows: number }) {
  return (
    <div className="space-y-3">
      {Array.from({ length: rows }).map((_, i) => (
        <div key={i} className="h-8 bg-gray-100 rounded animate-pulse" />
      ))}
    </div>
  );
}

function TableSkeleton() {
  return <div className="h-48 bg-gray-100 rounded animate-pulse" />;
}
```

```tsx
// app/dashboard/revenue-chart.tsx (Server Component)
import { db } from "@/lib/db";
import { ChartClient } from "./chart-client";

export async function RevenueChart() {
  // Two queries over the last 30 days -- slow enough to benefit from streaming
  const revenue = await db.order.aggregate({
    _sum: { total: true },
    where: { createdAt: { gte: new Date(Date.now() - 30 * 86400000) } },
  });
  const dailyData = await db.$queryRaw`
    SELECT DATE(created_at) as date, SUM(total) as total
    FROM orders
    WHERE created_at > NOW() - INTERVAL '30 days'
    GROUP BY DATE(created_at)
    ORDER BY date
  `;
  return <ChartClient data={dailyData} total={revenue._sum.total ?? 0} />;
}
```

```tsx
// app/dashboard/loading.tsx
// This file creates an automatic Suspense boundary around the page
export default function DashboardLoading() {
  return (
    <div className="grid grid-cols-2 gap-6 p-6">
      <h1 className="col-span-2 text-2xl font-bold">Dashboard</h1>
      <div className="h-64 bg-gray-100 rounded animate-pulse" />
      <div className="h-64 bg-gray-100 rounded animate-pulse" />
      <div className="col-span-2 h-48 bg-gray-100 rounded animate-pulse" />
    </div>
  );
}
```

What this demonstrates:
- `loading.tsx` as a route-level Suspense boundary for instant navigation feedback
- Granular `<Suspense>` boundaries so each dashboard widget streams independently
- Skeleton components that match the layout of the real content to prevent layout shift
- Server Components fetching data asynchronously -- each one resolves and streams when ready
Deep Dive
How It Works
- Streaming SSR: Instead of waiting for all data to resolve before sending any HTML, Next.js sends the initial shell immediately and streams additional HTML chunks as each Suspense boundary resolves.
- `loading.tsx`: Next.js automatically wraps the page component in a `<Suspense>` boundary using `loading.tsx` as the fallback. This provides an instant loading state during navigation.
- Nested Suspense: You can nest `<Suspense>` boundaries at any granularity. Each boundary independently resolves and replaces its fallback with the real content.
- Order of resolution: Components stream in the order they resolve, not the order they appear in the tree. A widget at the bottom of the page can appear before one at the top if its data arrives first.
- Client-side navigation: During client-side navigation (using `<Link>`), React renders the `loading.tsx` fallback immediately while fetching the RSC payload for the new route.
- HTTP streaming: The response uses `Transfer-Encoding: chunked` to send HTML progressively. This requires a runtime that supports streaming (Node.js, Edge).
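The resolution-order behavior can be sketched outside React with plain promises: each boundary is a pending promise, and the stream emits whichever settles first. A minimal simulation (the widget names and delays are invented for illustration):

```typescript
// Three "Suspense boundaries" with invented delays; the stream order
// follows resolution order, not the order they appear in the tree.
function widget(name: string, delayMs: number): Promise<string> {
  return new Promise((resolve) => setTimeout(() => resolve(name), delayMs));
}

async function streamOrder(): Promise<string[]> {
  const order: string[] = [];
  // Tree order: header, chart, footer -- but the chart is slowest.
  const boundaries = [
    widget("header", 30),
    widget("chart", 90),
    widget("footer", 10),
  ];
  await Promise.all(boundaries.map((p) => p.then((name) => order.push(name))));
  return order;
}
```

Here `streamOrder()` resolves to `["footer", "header", "chart"]`: the footer, last in the tree, streams first because its data landed first.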
Variations
Streaming with a passed promise (defer pattern):

```tsx
// Server Component passes a promise without awaiting
export default async function Page() {
  const analyticsPromise = fetchAnalytics(); // not awaited
  return (
    <Suspense fallback={<p>Loading analytics...</p>}>
      <Analytics dataPromise={analyticsPromise} />
    </Suspense>
  );
}
```

```tsx
// Client Component consumes the promise with use()
"use client";
import { use } from "react";

export function Analytics({ dataPromise }: { dataPromise: Promise<Data> }) {
  const data = use(dataPromise); // suspends until resolved
  return <Chart data={data} />;
}
```

Sequential vs parallel streaming:
```tsx
// Sequential -- each awaits in order (waterfall)
async function Sequential() {
  const a = await fetchA(); // blocks
  const b = await fetchB(); // waits for a
  return <>{a}{b}</>;
}

// Parallel -- separate Suspense boundaries
function Parallel() {
  return (
    <>
      <Suspense fallback={<p>A...</p>}><AsyncA /></Suspense>
      <Suspense fallback={<p>B...</p>}><AsyncB /></Suspense>
    </>
  );
}
```

TypeScript Notes
```tsx
// loading.tsx must be a default export returning ReactNode
export default function Loading(): React.ReactNode {
  return <Skeleton />;
}

// Suspense fallback accepts ReactNode
<Suspense fallback={<div>Loading...</div>}>
  <AsyncComponent />
</Suspense>

// use() hook type inference
const data: Data = use(dataPromise); // infers from Promise<Data>
```

Gotchas
- **Suspense boundary too high** -- Wrapping your entire page in a single Suspense boundary means nothing shows until all data resolves. Fix: Use multiple granular Suspense boundaries around each independent data source.
- **Suspense boundary too low** -- Wrapping every tiny component in Suspense creates excessive loading spinners and visual noise. Fix: Group related components under a single boundary for a cohesive loading experience.
- **Layout shift from skeletons** -- If your skeleton does not match the dimensions of the real content, the page jumps when data arrives. Fix: Make skeletons the same height/width as the resolved content using fixed dimensions or aspect ratios.
- **`loading.tsx` applies to navigations only** -- On a hard refresh (full page load), `loading.tsx` is rendered as part of the initial HTML, but it does not create a true streaming boundary for the initial SSR in all cases. Fix: Use explicit `<Suspense>` boundaries inside your page for reliable streaming behavior.
- **Static routes do not stream** -- If a route is fully static (no dynamic data), it is prerendered at build time and served as a complete HTML file. Streaming only applies to dynamic routes. This is expected behavior; no fix needed.
- **Error boundaries and Suspense** -- If a Suspense child throws, the error bubbles up. Without an `error.tsx` or `<ErrorBoundary>`, the entire page fails. Fix: Pair Suspense boundaries with error boundaries at the same level.
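The first two gotchas are a timing trade-off you can reason about with plain promises: one boundary around everything paints at the speed of the slowest data source, while granular boundaries paint as each source lands. A sketch with invented section names and delays:

```typescript
// Invented per-section fetch delays (ms) for a three-widget dashboard.
const delays: Record<string, number> = { stats: 20, orders: 50, chart: 80 };

const fetchSection = (name: string): Promise<string> =>
  new Promise((resolve) => setTimeout(() => resolve(name), delays[name]));

async function compareBoundaries() {
  const sections = Object.keys(delays);

  // One boundary around everything: first paint waits for ALL sections (~80ms).
  const t0 = Date.now();
  await Promise.all(sections.map(fetchSection));
  const singleBoundaryMs = Date.now() - t0;

  // Granular boundaries: the fastest section paints on its own (~20ms).
  const t1 = Date.now();
  const firstSection = await Promise.race(sections.map(fetchSection));
  const granularFirstPaintMs = Date.now() - t1;

  return { singleBoundaryMs, firstSection, granularFirstPaintMs };
}
```

The flip side is the "too low" gotcha: granular boundaries here mean three separate spinners instead of one, so group sections that belong together.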
Alternatives
| Alternative | Use When | Don't Use When |
|---|---|---|
| `loading.tsx` | You want a simple route-level loading state | You need granular control over which parts stream |
| Nested `<Suspense>` | Each section has independent data sources | All data comes from a single fast query |
| Client-side fetching (SWR) | You need real-time updates after the initial load | Server-side streaming is sufficient |
| Static generation | Data does not change between deployments | Data is user-specific or frequently updated |
| Partial Prerendering (PPR) | You want a static shell with dynamic holes | Your entire page is dynamic |
Real-World Example
From a production Next.js 15 / React 19 SaaS application (SystemsArchitect.io).
```ts
// Production example: SSE streaming with markdown buffering
// File: src/hooks/use-stream-content.ts
const reader = response.body?.getReader();
if (!reader) return; // no body to stream
const decoder = new TextDecoder();
let buffer = '';
let accumulated = '';

const hasIncompleteMarkdown = (text: string): boolean => {
  const boldCount = (text.match(/\*\*/g) || []).length;
  if (boldCount % 2 !== 0) return true;
  const codeBlockCount = (text.match(/```/g) || []).length;
  if (codeBlockCount % 2 !== 0) return true;
  const openBrackets = (text.match(/\[/g) || []).length;
  const closeBrackets = (text.match(/\]/g) || []).length;
  if (openBrackets !== closeBrackets) return true;
  return false;
};

while (true) {
  if (options.signal?.aborted) {
    setState({ isStreaming: false, streamedContent: accumulated, error: 'Stream aborted' });
    return;
  }
  const { done, value } = await reader.read();
  if (done) {
    if (buffer) { accumulated += buffer; }
    break;
  }
  const chunk = decoder.decode(value, { stream: true });
  const lines = chunk.split('\n');
  for (const line of lines) {
    if (line.startsWith('data: ')) {
      try {
        const data = JSON.parse(line.slice(6));
        if (data.chunk) {
          buffer += data.chunk;
          const { complete, remaining } = extractCompleteUnits(buffer);
          if (complete) {
            accumulated += complete;
            buffer = remaining;
            setState({ isStreaming: true, streamedContent: accumulated, error: null });
            options.onChunk?.(complete);
          }
        }
      } catch { /* skip malformed SSE lines */ }
    }
  }
}
```

What this demonstrates in production:
- `response.body.getReader()` returns a `ReadableStreamDefaultReader` for processing data as it arrives
- `decoder.decode(value, { stream: true })` handles multi-byte UTF-8 characters that may be split across chunks
- `hasIncompleteMarkdown()` prevents rendering partial markdown (unmatched bold markers, unclosed code blocks, incomplete links) which would flash broken formatting
- `options.signal?.aborted` checks the AbortController signal, allowing the user to cancel mid-stream
- `accumulated` builds the full content for the final save, while `setState` triggers incremental UI renders
- The buffer pattern ensures only complete sentences and paragraphs are rendered, not partial tokens
- `JSON.parse` on each SSE line could fail on malformed data; the silent catch is correct behavior for streaming
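The excerpt calls `extractCompleteUnits` without showing it. A plausible reconstruction, assuming it emits everything up to the last paragraph break at which the accumulated markdown is balanced (this is a hypothetical sketch, not the production implementation):

```typescript
// Hypothetical sketch of extractCompleteUnits. It reuses the same balance
// check as hasIncompleteMarkdown in the excerpt.
const isBalanced = (text: string): boolean => {
  if ((text.match(/\*\*/g) || []).length % 2 !== 0) return false;
  if ((text.match(/```/g) || []).length % 2 !== 0) return false;
  return (text.match(/\[/g) || []).length === (text.match(/\]/g) || []).length;
};

function extractCompleteUnits(buffer: string): { complete: string; remaining: string } {
  // Walk paragraph breaks from the end, looking for a balanced prefix to emit.
  let cut = buffer.lastIndexOf("\n\n");
  while (cut > -1) {
    const prefix = buffer.slice(0, cut + 2);
    if (isBalanced(prefix)) {
      return { complete: prefix, remaining: buffer.slice(cut + 2) };
    }
    cut = cut > 0 ? buffer.lastIndexOf("\n\n", cut - 1) : -1;
  }
  return { complete: "", remaining: buffer }; // nothing safe to emit yet
}
```

On `"Done paragraph.\n\n**partial"` this emits the finished paragraph and keeps the unbalanced bold run buffered; a buffer that is still inside a code fence emits nothing.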
FAQs
What is the difference between `loading.tsx` and an explicit `<Suspense>` boundary?
- `loading.tsx` creates an automatic route-level Suspense boundary around the entire page
- Explicit `<Suspense>` boundaries give granular control over which parts stream independently
- Use `loading.tsx` for a quick loading state; use `<Suspense>` for fine-grained streaming
Do components stream in the order they appear in the JSX tree?
- No. Components stream in the order they resolve, not their position in the tree
- A widget at the bottom can appear before one at the top if its data arrives first
- Each Suspense boundary resolves independently
What happens if a Suspense boundary is too high in the component tree?
- Nothing shows until all data within that boundary resolves
- You lose the benefit of progressive rendering
- Use multiple granular Suspense boundaries around each independent data source
What happens if a Suspense child throws an error?
- The error bubbles up past the Suspense boundary
- Without an `error.tsx` or `<ErrorBoundary>`, the entire page fails
- Always pair Suspense boundaries with error boundaries at the same level
Do static routes use streaming?
- No. Fully static routes are prerendered at build time and served as complete HTML
- Streaming only applies to dynamic routes with async data
- This is expected behavior and requires no fix
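If a page you expect to stream ships as static HTML, nothing on it is opting into dynamic rendering. The route segment config can make the choice explicit (a minimal fragment; whether you want this depends on your caching strategy):

```typescript
// app/dashboard/page.tsx -- route segment config
// "force-dynamic" renders the route per request, so Suspense boundaries stream.
// The default ("auto") prerenders statically when no dynamic APIs
// (cookies(), headers(), uncached fetch) are used.
export const dynamic = "force-dynamic";
```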
How do you pass a promise from a Server Component to a Client Component for streaming?
```tsx
// Server Component: do NOT await the promise
export default async function Page() {
  const dataPromise = fetchAnalytics();
  return (
    <Suspense fallback={<p>Loading...</p>}>
      <AnalyticsClient dataPromise={dataPromise} />
    </Suspense>
  );
}
```

```tsx
// Client Component: consume with use()
"use client";
import { use } from "react";

function AnalyticsClient({ dataPromise }: { dataPromise: Promise<Data> }) {
  const data = use(dataPromise);
  return <Chart data={data} />;
}
```

How do you prevent layout shift when streaming replaces skeletons with real content?
- Make skeletons match the height and width of the resolved content
- Use fixed dimensions or aspect ratios on skeleton components
- Mismatched dimensions cause the page to jump when data arrives
What is the TypeScript return type for `loading.tsx`?
```tsx
export default function Loading(): React.ReactNode {
  return <Skeleton />;
}
```

- Must be a default export returning `React.ReactNode`
What HTTP mechanism enables streaming in Next.js?
- The response uses `Transfer-Encoding: chunked` to send HTML progressively
- This requires a runtime that supports streaming (Node.js or Edge)
- Static hosting without streaming support cannot use this feature
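The chunked mechanics are observable with the web streams API (global in Node 18+), which is what the framework writes the response body to. A toy body that flushes a shell first and a slow section later:

```typescript
// Toy streamed HTML body: the shell is enqueued immediately, the slow
// section ~20ms later -- each enqueue becomes one chunk on the wire.
function htmlStream(): ReadableStream<Uint8Array> {
  const encoder = new TextEncoder();
  return new ReadableStream({
    async start(controller) {
      controller.enqueue(encoder.encode("<main><p>shell</p>"));
      await new Promise((r) => setTimeout(r, 20)); // slow data source
      controller.enqueue(encoder.encode("<section>stats</section></main>"));
      controller.close();
    },
  });
}

async function readChunks(stream: ReadableStream<Uint8Array>): Promise<string[]> {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  const chunks: string[] = [];
  while (true) {
    const { done, value } = await reader.read();
    if (done) return chunks;
    chunks.push(decoder.decode(value, { stream: true }));
  }
}
```

A single slow query that gated the whole body would instead delay the very first chunk, which is exactly the effect of a too-high Suspense boundary.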
How does the `use()` hook infer types from a promise?
```ts
const data: Data = use(dataPromise);
// TypeScript infers Data from Promise<Data> automatically
```

- `use()` unwraps the promise type, so `use(Promise<T>)` returns `T`
Why does `loading.tsx` not always create a true streaming boundary on full page loads?
- On hard refresh, `loading.tsx` is rendered as part of the initial HTML
- It does not always create a true streaming boundary for the initial SSR
- Use explicit `<Suspense>` boundaries inside your page for reliable streaming on initial load
What does the `hasIncompleteMarkdown` function in the real-world example protect against?
- It prevents rendering partial markdown like unmatched bold markers (`**`) or unclosed code blocks
- Without this check, incomplete tokens would flash broken formatting to the user
- The buffer pattern ensures only complete sentences and paragraphs are rendered
Related
- Fetching -- Setting up async data fetching
- Caching -- How caching interacts with streaming
- Partial Prerendering -- Static shells with streamed dynamic content
- Server Components -- Async components that power streaming