Async Generators & Iterators
Process large datasets, paginated APIs, and streaming data with async generators in Server Components and Route Handlers.
Recipe
Quick-reference for async generator patterns.
```ts
// Basic async generator
async function* fetchAllPages(baseUrl: string) {
  let page = 1;
  let hasMore = true;
  while (hasMore) {
    const res = await fetch(`${baseUrl}?page=${page}`);
    const data = await res.json();
    yield data.items;
    hasMore = data.hasNextPage;
    page++;
  }
}
```
```ts
// Consuming with for await...of
for await (const batch of fetchAllPages("/api/products")) {
  processBatch(batch);
}
```

When to reach for this: You need to process paginated API responses, stream large datasets without loading everything into memory, or produce data incrementally in a Route Handler.
Working Example
```ts
// lib/paginated-fetch.ts
interface PaginatedResponse<T> {
  items: T[];
  nextCursor: string | null;
}

async function* fetchAllItems<T>(
  url: string,
  options?: RequestInit
): AsyncGenerator<T[], void, unknown> {
  let cursor: string | null = null;
  do {
    const fetchUrl = cursor ? `${url}?cursor=${cursor}` : url;
    const res = await fetch(fetchUrl, options);
    const data: PaginatedResponse<T> = await res.json();
    yield data.items;
    cursor = data.nextCursor;
  } while (cursor !== null);
}
```
```ts
// app/admin/export/route.ts - Stream CSV export
import { NextResponse } from "next/server";

interface Order {
  id: string;
  customer: string;
  total: number;
  date: string;
}

export async function GET() {
  const encoder = new TextEncoder();
  const stream = new ReadableStream({
    async start(controller) {
      // CSV header
      controller.enqueue(encoder.encode("id,customer,total,date\n"));
      // Stream rows from paginated API
      for await (const batch of fetchAllItems<Order>(
        "https://api.example.com/orders"
      )) {
        for (const order of batch) {
          const row = `${order.id},${order.customer},${order.total},${order.date}\n`;
          controller.enqueue(encoder.encode(row));
        }
      }
      controller.close();
    },
  });
  return new NextResponse(stream, {
    headers: {
      "Content-Type": "text/csv",
      "Content-Disposition": 'attachment; filename="orders.csv"',
    },
  });
}
```

What this demonstrates:
- Generic async generator for cursor-based pagination
- Streaming CSV export via Route Handler
- Processing data in batches without loading all into memory
- Typed generator with TypeScript generics
Deep Dive
How It Works
- An async generator function uses `async function*` syntax and `yield` to produce values
- `yield` pauses the generator and returns a value to the consumer
- The consumer uses `for await...of` to iterate over yielded values
- Each iteration resumes the generator until the next `yield` or `return`
- Memory efficient: only one batch is in memory at a time
- The generator function body runs lazily (only when values are requested)
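The pause/resume cycle is easiest to see by driving a generator manually with `.next()`. A minimal sketch (the `countTo` generator is a made-up example, not from the recipe above):

```ts
// A tiny async generator: yields 1..limit, then finishes.
async function* countTo(limit: number): AsyncGenerator<number, void, unknown> {
  for (let i = 1; i <= limit; i++) {
    yield i; // pauses here until the consumer calls .next() again
  }
}

const gen = countTo(3);
// Nothing has executed yet -- the body runs lazily on the first .next().
console.log(await gen.next()); // { value: 1, done: false }
console.log(await gen.next()); // { value: 2, done: false }
console.log(await gen.next()); // { value: 3, done: false }
console.log(await gen.next()); // { value: undefined, done: true }
```

`for await...of` does exactly this under the hood, stopping when `done` becomes `true`.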
Parameters & Return Values
| Concept | Syntax | Description |
|---|---|---|
| Async generator function | `async function* name()` | Defines a generator that produces values asynchronously |
| `yield` | `yield value` | Pauses and produces a value |
| `yield*` | `yield* otherGenerator()` | Delegates to another generator |
| `for await...of` | `for await (const x of gen())` | Consumes an async iterable |
| `return` | `return value` | Ends the generator |
| `.next()` | `gen.next()` | Manually advances the generator |
| `.return()` | `gen.return()` | Terminates the generator early |
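The `yield*` row deserves a concrete illustration, since it is the main tool for composing generators. A minimal sketch (the `inner`/`outer` names are illustrative, not from the recipe above):

```ts
async function* inner(): AsyncGenerator<number> {
  yield 1;
  yield 2;
}

async function* outer(): AsyncGenerator<number> {
  yield 0;
  yield* inner(); // delegates: yields 1, then 2, from inner()
  yield 3;
}

const values: number[] = [];
for await (const v of outer()) values.push(v);
console.log(values); // [0, 1, 2, 3]
```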
Variations
Paginated API with offset:
```ts
async function* paginateWithOffset<T>(
  fetcher: (offset: number, limit: number) => Promise<T[]>,
  limit = 100
): AsyncGenerator<T[]> {
  let offset = 0;
  while (true) {
    const batch = await fetcher(offset, limit);
    if (batch.length === 0) break;
    yield batch;
    if (batch.length < limit) break; // Last page
    offset += limit;
  }
}
```
```tsx
// Usage in a Server Component
export default async function AllProductsPage() {
  const allProducts: Product[] = [];
  for await (const batch of paginateWithOffset(
    (offset, limit) => db.product.findMany({ skip: offset, take: limit }),
    50
  )) {
    allProducts.push(...batch);
  }
  return <ProductGrid products={allProducts} />;
}
```

Streaming Server-Sent Events (SSE):
```ts
// app/api/events/route.ts
import { NextResponse } from "next/server";

export async function GET() {
  const encoder = new TextEncoder();

  async function* eventStream() {
    let id = 0;
    while (true) {
      const data = await getLatestUpdate();
      yield `id: ${id++}\ndata: ${JSON.stringify(data)}\n\n`;
      await new Promise((resolve) => setTimeout(resolve, 1000));
    }
  }

  const stream = new ReadableStream({
    async start(controller) {
      for await (const event of eventStream()) {
        controller.enqueue(encoder.encode(event));
      }
    },
  });

  return new NextResponse(stream, {
    headers: {
      "Content-Type": "text/event-stream",
      "Cache-Control": "no-cache",
      Connection: "keep-alive",
    },
  });
}
```

Transform generator (pipeline pattern):
```ts
// Chain generators for data transformation pipelines
async function* map<T, U>(
  source: AsyncIterable<T>,
  transform: (item: T) => U | Promise<U>
): AsyncGenerator<U> {
  for await (const item of source) {
    yield await transform(item);
  }
}

async function* filter<T>(
  source: AsyncIterable<T>,
  predicate: (item: T) => boolean | Promise<boolean>
): AsyncGenerator<T> {
  for await (const item of source) {
    if (await predicate(item)) yield item;
  }
}

async function* take<T>(
  source: AsyncIterable<T>,
  count: number
): AsyncGenerator<T> {
  let taken = 0;
  for await (const item of source) {
    yield item;
    if (++taken >= count) break;
  }
}

// Pipeline: fetch all user pages -> drop pages with no active users
// -> keep only the active users in each page -> take the first 10 pages
const pipeline = take(
  map(
    filter(
      fetchAllItems<User>("https://api.example.com/users"),
      (users) => users.filter((u) => u.active).length > 0
    ),
    (users) => users.filter((u) => u.active)
  ),
  10
);
```

Collecting all results:
```ts
// Helper to collect an async iterable into an array
async function collect<T>(iterable: AsyncIterable<T[]>): Promise<T[]> {
  const results: T[] = [];
  for await (const batch of iterable) {
    results.push(...batch);
  }
  return results;
}

// Usage
const allOrders = await collect(
  fetchAllItems<Order>("https://api.example.com/orders")
);
```

Rate-limited generator:
```ts
async function* rateLimited<T>(
  source: AsyncIterable<T>,
  delayMs: number
): AsyncGenerator<T> {
  for await (const item of source) {
    yield item;
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
}

// Fetch pages with 200ms delay between requests
for await (const batch of rateLimited(fetchAllPages(url), 200)) {
  processBatch(batch);
}
```

TypeScript Notes
```ts
// Typing async generators
async function* counter(): AsyncGenerator<number, void, unknown> {
  // AsyncGenerator<Yield, Return, Next>
  // Yield = type of values produced by yield
  // Return = type of the return value
  // Next = type of values passed to .next()
  let i = 0;
  while (true) {
    yield i++;
  }
}

// AsyncIterable is the consumer-side type
async function processItems(source: AsyncIterable<string[]>) {
  for await (const batch of source) {
    console.log(batch.length);
  }
}

// Generic paginated fetcher type
type PaginatedFetcher<T> = (
  cursor: string | null
) => Promise<{ items: T[]; nextCursor: string | null }>;
```

Gotchas
- Generators are lazy. Nothing runs until you consume the generator with `for await...of` or `.next()`. If you create a generator but never iterate it, the function body never executes. Fix: This is a feature, not a bug. Just remember to consume it.
- Error handling in `for await...of`. If a `yield` throws, the loop terminates and the error propagates. Unconsumed items are lost. Fix: Wrap the loop in try/catch, or handle errors inside the generator with try/catch around `yield`.
- Memory leaks with infinite generators. A generator that never returns keeps its closure alive. Fix: Use `break` in the consumer or `.return()` on the generator to allow cleanup. `for await...of` calls `.return()` automatically on `break`.
- Cannot use generators in Client Components. Generators run on the server. Client Components need the final data or a streaming pattern (SSE, WebSocket). Fix: Use generators in Server Components, Route Handlers, or Server Actions only.
- No backpressure by default. The generator produces as fast as the consumer requests, but if you yield into a stream, the stream might buffer. Fix: Add delays with `rateLimited` or use `ReadableStream` with pull-based backpressure.
- Generators are single-use. Once consumed, you cannot iterate a generator again. Calling the generator function creates a new iterator. Fix: Call the generator function again for a fresh iterator.
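The cleanup gotcha is worth seeing in code: a `try/finally` inside the generator runs even when the consumer exits early, because `for await...of` calls `.return()` on `break`. A minimal sketch (`withCleanup` and the flag are illustrative names):

```ts
let cleanedUp = false;

async function* withCleanup(): AsyncGenerator<number> {
  try {
    let i = 0;
    while (true) yield i++; // infinite producer
  } finally {
    // Resumed here when the consumer breaks: for await...of calls
    // .return(), which drives execution into this finally block.
    cleanedUp = true; // close connections, release cursors, etc.
  }
}

for await (const n of withCleanup()) {
  if (n >= 2) break; // early exit still triggers the generator's finally
}
console.log(cleanedUp); // true
```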
Alternatives
| Alternative | Use When | Don't Use When |
|---|---|---|
| `Promise.all` | Fixed number of parallel fetches | Unknown number of pages |
| Array + loop | Simple sequential fetches with known count | Large or unbounded datasets |
| `ReadableStream` | Streaming HTTP responses (SSE, file downloads) | Simple data aggregation |
| Database cursors | Direct database pagination (Prisma, SQL) | External API pagination |
| Web Streams API | Browser-compatible streaming | Server-only processing |
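To make the first row of the table concrete: with a fixed, known page count, `Promise.all` fetches everything in parallel and a generator adds no value. A minimal sketch with a simulated fetcher (`fetchPage` is a stand-in for a real API call):

```ts
// Simulated page fetcher standing in for a real API call.
async function fetchPage(page: number): Promise<number[]> {
  return [page * 2, page * 2 + 1];
}

// Fixed page count: fetch all three pages in parallel.
const pages = await Promise.all([1, 2, 3].map(fetchPage));
console.log(pages.flat()); // [2, 3, 4, 5, 6, 7]
```

Reach for a generator only when the page count is unknown or the dataset is too large to hold at once.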
FAQs
What is the difference between a regular async function and an async generator?
- A regular async function returns a single `Promise` that resolves to one value
- An async generator uses `async function*` and `yield` to produce multiple values over time
- Generators are lazy -- they only execute code when the consumer requests the next value
Why are generators useful for paginated API responses?
- Only one page of data is in memory at a time
- The generator pauses between pages, keeping memory usage constant
- The consumer controls when to fetch the next page via `for await...of`
Can you use async generators in Client Components?
- No. Generators run on the server only
- Client Components need the final data or a streaming pattern like SSE or WebSocket
- Use generators in Server Components, Route Handlers, or Server Actions
What happens if you create a generator but never iterate it?
- Nothing runs. Generators are lazy -- the function body never executes until consumed
- You must use `for await...of` or manually call `.next()` to start execution
- This is a feature, not a bug
How do you type an async generator function in TypeScript?
```ts
async function* counter(): AsyncGenerator<number, void, unknown> {
  // AsyncGenerator<Yield, Return, Next>
  // Yield = type of values produced
  // Return = type of the final return value
  // Next = type of values passed to .next()
  let i = 0;
  while (true) yield i++;
}
```

How do you handle errors inside a for await...of loop?
- If a `yield` throws, the loop terminates and the error propagates
- Unconsumed items are lost
- Wrap the loop in try/catch, or handle errors inside the generator around each `yield`
What is the difference between yield and yield*?
- `yield value` pauses the generator and produces a single value
- `yield* otherGenerator()` delegates to another generator, yielding all of its values
- `yield*` is useful for composing generators
How do you prevent memory leaks with infinite generators?
- Use `break` in the consumer loop or call `.return()` on the generator
- `for await...of` automatically calls `.return()` when you `break`
- Without cleanup, an infinite generator keeps its closure alive indefinitely
How do you stream a CSV export using an async generator in a Route Handler?
```ts
export async function GET() {
  const encoder = new TextEncoder();
  const stream = new ReadableStream({
    async start(controller) {
      controller.enqueue(encoder.encode("id,name\n"));
      for await (const batch of fetchAllItems<Item>(url)) {
        for (const item of batch) {
          controller.enqueue(encoder.encode(`${item.id},${item.name}\n`));
        }
      }
      controller.close();
    },
  });
  return new NextResponse(stream, {
    headers: { "Content-Type": "text/csv" },
  });
}
```

What does AsyncIterable vs AsyncGenerator mean in TypeScript?
- `AsyncIterable<T>` is the consumer-side type -- anything you can use with `for await...of`
- `AsyncGenerator<T>` is the producer-side type returned by `async function*`
- Accept `AsyncIterable` in function parameters for flexibility
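A sketch of why accepting `AsyncIterable` is more flexible: the same function then works with generators and with any object implementing `[Symbol.asyncIterator]` (the `sum`/`nums` names are illustrative):

```ts
// Accepts anything usable with for await...of.
async function sum(source: AsyncIterable<number>): Promise<number> {
  let total = 0;
  for await (const n of source) total += n;
  return total;
}

// A generator satisfies AsyncIterable...
async function* nums(): AsyncGenerator<number> {
  yield 1;
  yield 2;
  yield 3;
}

// ...and so does a hand-rolled object with an async iterator method.
const custom: AsyncIterable<number> = {
  async *[Symbol.asyncIterator]() {
    yield 10;
    yield 20;
  },
};

console.log(await sum(nums())); // 6
console.log(await sum(custom)); // 30
```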
How do you add rate limiting between generator iterations?
```ts
async function* rateLimited<T>(
  source: AsyncIterable<T>,
  delayMs: number
): AsyncGenerator<T> {
  for await (const item of source) {
    yield item;
    await new Promise((r) => setTimeout(r, delayMs));
  }
}
```

Are generators single-use or reusable?
- Generators are single-use. Once consumed, you cannot iterate them again
- Calling the generator function again creates a fresh iterator
- If you need to iterate the same data twice, call the generator function twice
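The single-use behavior can be verified directly: a second loop over the same iterator yields nothing, because the generator is already exhausted. A minimal sketch (the `letters` name is illustrative):

```ts
async function* letters(): AsyncGenerator<string> {
  yield "a";
  yield "b";
}

const it = letters();

const first: string[] = [];
for await (const ch of it) first.push(ch); // ["a", "b"]

const second: string[] = [];
for await (const ch of it) second.push(ch); // [] -- iterator is exhausted

console.log(first, second);
```

Calling `letters()` again would return a fresh iterator that starts from the beginning.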
Related
- Parallel Promises - Promise.all for fixed parallel fetches
- Streaming - Suspense-based streaming SSR
- Server Actions - mutations and form handling
- SWR Pagination - client-side infinite scrolling