Performance Basics
13 examples to get you started with React Performance -- 9 basic and 4 intermediate.
Prerequisites
All examples assume a Next.js 15+ App Router project with React 19 and TypeScript. A few examples use extra tooling:
- React DevTools (browser extension) for profiling.
- `@next/bundle-analyzer` for bundle inspection: `npm install --save-dev @next/bundle-analyzer`.
- `web-vitals` for Core Web Vitals monitoring: `npm install web-vitals`.
- `zustand` for the state-performance example: `npm install zustand`.
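If you enable the analyzer, wiring it into `next.config.ts` looks roughly like this (a sketch, assuming the conventional `ANALYZE` environment-variable toggle):

```typescript
// next.config.ts -- hypothetical wiring for @next/bundle-analyzer
import type { NextConfig } from "next";
import bundleAnalyzer from "@next/bundle-analyzer";

// Only generate the report when explicitly requested
const withBundleAnalyzer = bundleAnalyzer({
  enabled: process.env.ANALYZE === "true",
});

const config: NextConfig = {};

export default withBundleAnalyzer(config);
```

Run `ANALYZE=true npm run build` to produce the treemap report.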
Two guiding rules for every example below:
- Measure first. Profile, don't guess -- the React DevTools Profiler and Lighthouse will tell you where the real cost is.
- Server Components and the React Compiler do a lot of this work automatically. Reach for manual `memo`, `useMemo`, `useCallback` only after profiling proves the win.
Looking for a systematic review? See the Performance Checklist -- a 30-point audit you can run in CI.
Basic Examples
1. Profile Before You Optimize
Use React DevTools to record a real interaction and see which components actually re-render.
```tsx
import { Profiler, type ProfilerOnRenderCallback } from "react";

const onRender: ProfilerOnRenderCallback = (id, phase, actualDuration) => {
  console.log(`[${id}] ${phase} took ${actualDuration.toFixed(1)}ms`);
};

export default function Page() {
  return (
    <Profiler id="Dashboard" onRender={onRender}>
      <Dashboard />
    </Profiler>
  );
}

function Dashboard() {
  return <p>Content</p>;
}
```

- The built-in `<Profiler>` component logs render timings for its subtree -- great for spot measurements.
- For interactive diagnosis, open the React DevTools Profiler tab and record -- it shows a flame chart of every commit.
- Note `actualDuration` vs `baseDuration` -- large deltas between them highlight wasted re-renders you can memoize away.
- Never ship a `<Profiler>` to production -- it has measurable overhead. Wrap its use in a dev-only check if it lives in the tree.
Related: React DevTools Profiler -- flame charts, commit inspection, interactions | Performance Checklist -- what to measure in CI
2. Stop Unnecessary Re-renders
A parent's re-render does not have to cascade -- split state and keep stable references so children skip work.
```tsx
"use client";

import { memo, useState } from "react";

const Row = memo(function Row({ label }: { label: string }) {
  console.log("Render", label);
  return <li>{label}</li>;
});

export default function List({ items }: { items: string[] }) {
  const [count, setCount] = useState(0);
  return (
    <>
      <button onClick={() => setCount((c) => c + 1)}>{count}</button>
      <ul>
        {items.map((i) => (
          <Row key={i} label={i} />
        ))}
      </ul>
    </>
  );
}
```

- `memo(Component)` skips re-renders when props are referentially equal to the last render.
- Clicking the counter re-renders `List`, but `Row` props (`label`) have not changed, so every row is skipped.
- Inline objects (`style={{ ... }}`) and inline functions break `memo` -- their reference changes every render.
- Use `key` on lists to let React reuse DOM nodes when items move around -- a `key={index}` on a reordered list is a frequent bug source.
Related: Preventing Unnecessary Re-renders -- split state, lift down, keys | Memoization -- the primitives behind this pattern
3. useMemo, useCallback, and React.memo
Stabilize expensive values and function references so memoized children actually benefit.
```tsx
"use client";

import { memo, useCallback, useMemo, useState } from "react";

const ExpensiveChart = memo(function ExpensiveChart({
  points,
  onPick,
}: {
  points: number[];
  onPick: (n: number) => void;
}) {
  return <p>{points.length} points</p>;
});

export default function Dashboard({ raw }: { raw: number[] }) {
  const [picked, setPicked] = useState<number | null>(null);
  const points = useMemo(() => raw.filter((n) => n > 0), [raw]);
  const onPick = useCallback((n: number) => setPicked(n), []);
  return (
    <>
      <p>Picked: {picked ?? "none"}</p>
      <ExpensiveChart points={points} onPick={onPick} />
    </>
  );
}
```

- `useMemo(fn, deps)` caches the value; `useCallback(fn, deps)` caches the function reference -- both keep prop identity stable for memoized children.
- Without these, `raw.filter(...)` and `(n) => setPicked(n)` would be new references every render, invalidating `memo`.
- Do not wrap every value -- premature memoization adds code and has its own overhead.
- Once the React Compiler ships in your project, you can usually delete these calls -- the compiler memoizes automatically.
Related: Memoization -- when memoization actually helps | useMemo / useCallback -- the hook APIs
4. React Compiler (Auto-memoization)
Turn on the React Compiler and let it insert memoization for you -- delete most manual memo/useMemo/useCallback.
```ts
// next.config.ts
import type { NextConfig } from "next";

const config: NextConfig = {
  experimental: {
    reactCompiler: true,
  },
};

export default config;
```

```tsx
// After the compiler is on, write plain components:
function ProductList({ products }: { products: Product[] }) {
  const total = products.reduce((sum, p) => sum + p.price, 0);
  const handleClick = (id: string) => console.log(id);
  return <p>{total} ({products.length} items)</p>;
}
```

- The compiler analyzes your components at build time and inserts memoization where it is safe.
- No API changes -- just opt in and write idiomatic React. Existing `useMemo`/`useCallback` keep working.
- Requires React 19 and a compatible version of Next/Babel. `npx react-compiler-healthcheck` validates your codebase.
- Does not replace profiling -- it only fixes reference-identity problems, not algorithmic ones.
Related: React Compiler -- setup, bailouts, debugging | React Compiler (React 19) -- the React 19 feature page
5. Dynamic Import for Code Splitting
Defer loading a heavy component until the user actually needs it.
```tsx
"use client";

import dynamic from "next/dynamic";
import { useState } from "react";

const Chart = dynamic(() => import("./Chart"), {
  loading: () => <p>Loading chart...</p>,
  ssr: false,
});

export default function Dashboard() {
  const [open, setOpen] = useState(false);
  return (
    <>
      <button onClick={() => setOpen(true)}>Show chart</button>
      {open && <Chart />}
    </>
  );
}
```

- `next/dynamic` returns a component that code-splits into a separate chunk; it downloads only when rendered.
- Pair it with a conditional (`open && <Chart />`) so the chunk is fetched on demand -- ideal for modals, charts, and rich text editors.
- `ssr: false` opts out of server rendering when the component depends on browser-only APIs (`window`, `document`).
- For big libraries (chart, markdown, WYSIWYG), dynamic import alone can cut your initial bundle by hundreds of KB.
Related: Bundle Size Optimization -- analyzers, tree-shaking, package cost | Image & Font Performance -- other ways to shave bytes
6. Server Components Ship Less JS
Default to Server Components; drop "use client" only where interactivity starts.
```tsx
// app/dashboard/page.tsx -- a Server Component
import { ClientChart } from "./chart";

interface Product {
  id: number;
  name: string;
  price: number;
}

export default async function DashboardPage() {
  const res = await fetch("https://api.example.com/products");
  const products: Product[] = await res.json();
  const total = products.reduce((sum, p) => sum + p.price, 0);
  return (
    <>
      <h1>Total: ${total}</h1>
      <ul>
        {products.map((p) => (
          <li key={p.id}>{p.name}</li>
        ))}
      </ul>
      <ClientChart data={products} />
    </>
  );
}
```

- Server Components render only on the server -- their code never ships to the browser.
- Fetching and computing on the server keeps the client bundle lean; the interactive `<ClientChart>` is the only JS shipped.
- Typical savings: a 30-70% reduction in client JS after moving list/detail pages to Server Components.
- You can only pass serializable props across the server/client boundary -- no functions, no class instances.
Related: Server Component Performance -- boundaries, patterns, measurements | Server Components (React 19) -- the primitive
7. next/image for Fast LCP
Use Next.js's <Image> so the browser ships the right size, lazy-loads below the fold, and reserves space to avoid layout shift.
```tsx
import Image from "next/image";

export default function Hero() {
  return (
    <Image
      src="/hero.jpg"
      alt="Landing hero"
      width={1200}
      height={630}
      priority
      sizes="(max-width: 768px) 100vw, 1200px"
    />
  );
}
```

- `next/image` serves modern formats (AVIF/WebP), generates multiple sizes, and lazy-loads automatically.
- The `width` and `height` reserve space -- zero layout shift even before the image arrives.
- `priority` disables lazy-loading for above-the-fold images so they start loading immediately (big LCP win).
- `sizes` tells the browser how wide the image will render at each breakpoint -- needed to pick the right source.

Related: Image & Font Performance -- font loading, preload, OG images | next/image -- full `<Image>` API
8. Measure Core Web Vitals
Send real-user LCP, INP, and CLS metrics to analytics so you know what the real world sees.
```tsx
// app/web-vitals.tsx
"use client";

import { useReportWebVitals } from "next/web-vitals";

export function WebVitals() {
  useReportWebVitals((metric) => {
    console.log(metric.name, metric.value);
    // fetch("/api/metrics", { method: "POST", body: JSON.stringify(metric) });
  });
  return null;
}

// app/layout.tsx
// <WebVitals />
```

- Targets: LCP < 2.5s (largest contentful paint), INP < 200ms (interaction latency; replaced FID in 2024), CLS < 0.1 (layout shift).
- `useReportWebVitals` fires once per metric per page -- forward them to your analytics, Sentry, or Vercel Analytics.
- Lab scores (Lighthouse) are a ceiling; field metrics from real users are what Google ranks and what you should watch.
- Set budgets in CI (Lighthouse CI, Speed Insights) so regressions fail the build, not your users.
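A CI budget can be expressed with Lighthouse CI assertions -- a hypothetical `lighthouserc.js` sketch (the URL and thresholds are placeholders; TBT stands in for INP, which lab runs cannot measure):

```javascript
// lighthouserc.js -- fail the build when lab metrics regress past the budget
module.exports = {
  ci: {
    collect: { url: ["http://localhost:3000/"] },
    assert: {
      assertions: {
        "largest-contentful-paint": ["error", { maxNumericValue: 2500 }],
        "cumulative-layout-shift": ["error", { maxNumericValue: 0.1 }],
        "total-blocking-time": ["error", { maxNumericValue: 300 }],
      },
    },
  },
};
```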
Related: Core Web Vitals Optimization -- per-metric fixes and measurement | Performance Checklist -- CI gates and budgets
9. Suspense for Faster Perceived Load
Break a route into streaming boundaries so fast panels render while slow ones are still fetching.
```tsx
// app/dashboard/page.tsx
import { Suspense } from "react";
import { FastStats } from "./fast-stats";
import { SlowChart } from "./slow-chart";

export default function Dashboard() {
  return (
    <div>
      <Suspense fallback={<p>Loading stats...</p>}>
        <FastStats />
      </Suspense>
      <Suspense fallback={<p>Loading chart...</p>}>
        <SlowChart />
      </Suspense>
    </div>
  );
}
```

- Each `<Suspense>` streams independently -- the user sees the fast panel immediately instead of waiting for the whole route.
- Skeleton fallbacks should match the final layout so content arrival does not shift the page.
- Use `loading.tsx` for route-level streaming, and nested `<Suspense>` for granular per-panel streaming.
- Pair with parallel fetches inside each boundary (`Promise.all`) so each panel is fully loaded the moment its JS streams in.
Related: Suspense & Streaming Performance -- boundary strategy | Suspense (patterns) -- the primitive | Streaming (Next.js Data) -- route-level streaming
Intermediate Examples
10. Zustand Selector to Scope Re-renders
Subscribe to a slice of the store so only components that read that slice re-render.
```tsx
"use client";

import { create } from "zustand";

interface CartStore {
  items: { id: string; qty: number }[];
  count: number;
  addItem: (id: string) => void;
}

const useCart = create<CartStore>((set) => ({
  items: [],
  count: 0,
  addItem: (id) =>
    set((s) => {
      const items = [...s.items, { id, qty: 1 }];
      return { items, count: items.length };
    }),
}));

// Only re-renders when `count` changes, not when any other field does
function CartBadge() {
  const count = useCart((s) => s.count);
  return <span>{count}</span>;
}

// Only re-renders when `addItem` reference changes (never, unless store recreates)
function AddButton({ id }: { id: string }) {
  const addItem = useCart((s) => s.addItem);
  return <button onClick={() => addItem(id)}>Add</button>;
}
```

- Passing a selector function to the hook tells Zustand to compare only that slice across renders.
- Components that only read actions (no state) effectively never re-render from the store -- actions have stable identity.
- Returning a new object from a selector every call defeats the optimization -- use `shallow` equality or return primitives.
- Context has no built-in selector model; reach for Zustand/Jotai when context re-renders become the bottleneck.
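To see why an object-returning selector defeats the comparison, here is a library-free sketch of the reference check Zustand performs by default (the store shape is hypothetical):

```typescript
// Zustand compares the selected slice against the previous one with Object.is.
const state = { items: [{ id: "a", qty: 1 }], count: 1 };

const objectSelector = (s: typeof state) => ({ count: s.count }); // fresh object each call
const primitiveSelector = (s: typeof state) => s.count;           // stable primitive

// Fresh object: the check fails on every render, so the component always re-renders.
console.log(Object.is(objectSelector(state), objectSelector(state))); // false
// Primitive: the check passes while `count` is unchanged, so the render is skipped.
console.log(Object.is(primitiveSelector(state), primitiveSelector(state))); // true
```

When a component genuinely needs several fields, Zustand's `useShallow` wrapper (from `zustand/react/shallow`) swaps in a shallow comparison instead of `Object.is`.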
Related: State Management Performance -- context splitting, derived state | Zustand Selectors -- deeper selector patterns
11. Parallel Data Fetching to Eliminate Waterfalls
Kick off independent fetches in parallel so total wait time is the slowest, not the sum.
```tsx
// app/dashboard/page.tsx
interface User { name: string; }
interface Stats { totalSales: number; }
interface Activity { events: string[]; }

async function getUser(): Promise<User> {
  return (await fetch("https://api.example.com/me")).json();
}

async function getStats(): Promise<Stats> {
  return (await fetch("https://api.example.com/stats")).json();
}

async function getActivity(): Promise<Activity> {
  return (await fetch("https://api.example.com/activity")).json();
}

export default async function Dashboard() {
  const [user, stats, activity] = await Promise.all([
    getUser(),
    getStats(),
    getActivity(),
  ]);
  return (
    <p>
      {user.name} — ${stats.totalSales} — {activity.events.length} events
    </p>
  );
}
```

- Sequential `await`s create a waterfall -- each request waits for the one before to resolve.
- `Promise.all` fires all requests at once; total time equals the slowest request.
- Use `Promise.allSettled` when one failure should not reject the whole set (e.g., optional sidebar data).
- For parallel work across the tree, prefer per-panel Suspense boundaries so fast data streams in first.
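The sum-versus-slowest difference is easy to demonstrate with simulated requests -- a self-contained sketch where 100ms timers stand in for network latency:

```typescript
// A fake "request" that resolves after the given delay
const fakeFetch = (ms: number) =>
  new Promise<number>((resolve) => setTimeout(() => resolve(ms), ms));

async function sequential(): Promise<number> {
  const start = Date.now();
  await fakeFetch(100); // user
  await fakeFetch(100); // stats
  await fakeFetch(100); // activity
  return Date.now() - start; // roughly the sum: ~300ms
}

async function parallel(): Promise<number> {
  const start = Date.now();
  await Promise.all([fakeFetch(100), fakeFetch(100), fakeFetch(100)]);
  return Date.now() - start; // roughly the slowest: ~100ms
}

async function main() {
  console.log("sequential:", await sequential(), "ms");
  console.log("parallel:", await parallel(), "ms");
}

main();
```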
Related: Data Fetching Performance -- waterfalls, parallelism, caching | Parallel Promises -- Promise.all, allSettled patterns
12. Cleanup to Prevent Memory Leaks
Always tear down subscriptions, timers, and listeners in the useEffect cleanup.
```tsx
"use client";

import { useEffect, useState } from "react";

export default function LiveCounter() {
  const [n, setN] = useState(0);
  useEffect(() => {
    const controller = new AbortController();
    const id = setInterval(() => setN((x) => x + 1), 1000);
    window.addEventListener("resize", () => console.log("resize"), {
      signal: controller.signal,
    });
    return () => {
      clearInterval(id);
      controller.abort(); // removes the listener
    };
  }, []);
  return <p>Ticks: {n}</p>;
}
```

- Every `setInterval`, `setTimeout`, `addEventListener`, and subscription must have a matching cleanup, or the browser holds onto the component forever.
- `AbortController` is the modern way to remove listeners -- one `abort()` tears down every listener registered with the signal.
- Aborting in-flight `fetch` calls with the same signal avoids "setState after unmount" warnings.
- Use the Chrome DevTools Memory tab to take heap snapshots before and after navigation -- detached DOM nodes are the tell.
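The "one `abort()` removes everything" behavior can be verified outside React too -- `EventTarget` and `AbortController` are standard globals in modern browsers and Node 18+ (a minimal sketch):

```typescript
// Listeners registered with a signal are detached the moment that signal aborts.
const controller = new AbortController();
const target = new EventTarget();
const received: string[] = [];

target.addEventListener("ping", () => received.push("ping"), {
  signal: controller.signal,
});

target.dispatchEvent(new Event("ping")); // listener fires
controller.abort();                      // detaches the listener
target.dispatchEvent(new Event("ping")); // no listener left -- nothing happens

console.log(received); // received is ["ping"] -- the second dispatch was a no-op
```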
Related: Memory Leaks -- detection, WeakMap patterns, long-lived refs | useEffect -- cleanup rules and timing
13. Next.js Cache + Tag-Based Revalidation
Cache expensive fetches across requests and invalidate them surgically when data changes.
```ts
// app/lib/data.ts — server-only helper
import { unstable_cache } from "next/cache";

interface Post {
  id: number;
  title: string;
}

export const getPosts = unstable_cache(
  async (): Promise<Post[]> => {
    const res = await fetch("https://api.example.com/posts");
    return res.json();
  },
  ["posts"],
  { tags: ["posts"], revalidate: 300 },
);
```

```ts
// app/posts/actions.ts
"use server";

import { revalidateTag } from "next/cache";

export async function createPost(title: string) {
  await fetch("https://api.example.com/posts", {
    method: "POST",
    body: JSON.stringify({ title }),
  });
  revalidateTag("posts");
}
```

- `unstable_cache(fn, keys, options)` memoizes the result across requests -- subsequent callers get a cache hit.
- `tags: ["posts"]` lets you invalidate every cached entry with that tag in one call via `revalidateTag("posts")`.
- `revalidate: 300` adds a 5-minute time-based ceiling in case you forget to tag-invalidate somewhere.
- For request-scoped memoization (dedup within one render tree), plain `fetch` already memoizes -- reserve `unstable_cache` for cross-request sharing.
Related: Next.js Caching Deep Dive -- four-layer model, cache lifecycles | Caching (Next.js Data) -- fetch-level cache options | Revalidation -- revalidatePath vs revalidateTag vs ISR