
How to Use Async Generators and Async Iteration in JavaScript

Regular generators pause and resume synchronously. They produce values on demand, but each value must be available immediately when next() is called. In the real world, many data sources are inherently asynchronous: API endpoints that return paginated results, WebSocket streams that deliver messages over time, file systems that read chunks on demand, or databases that stream rows from a query. You need a way to combine the lazy, on-demand nature of generators with the asynchronous nature of these data sources.

Async generators solve exactly this problem. An async function* can use both yield and await, pausing to wait for asynchronous operations and then yielding results as they arrive. Combined with the for await...of loop, they create a clean, readable pattern for consuming asynchronous data streams one piece at a time.

This guide covers the syntax and mechanics of async generators, the async iteration protocol they implement, and the real-world problems they solve, from paginated API consumption to streaming data processing.

async function*: Async Generator Functions

An async generator function combines two concepts you already know: async functions (which can await promises) and generator functions (which can yield values). The syntax is simply async function*.

Basic Syntax

async function* asyncCounter(start, end, delay) {
for (let i = start; i <= end; i++) {
await new Promise(resolve => setTimeout(resolve, delay));
yield i;
}
}

This function counts from start to end, but waits delay milliseconds between each number. It could not exist as a regular generator (which cannot await) or as a regular async function (which can only return once, not yield multiple values).

Calling an Async Generator

Like regular generators, calling an async generator function does not execute its body. It returns an async generator object:

async function* greetSlowly() {
await new Promise(r => setTimeout(r, 500));
yield "Hello";

await new Promise(r => setTimeout(r, 500));
yield "World";
}

const gen = greetSlowly();
console.log(gen); // Object [AsyncGenerator] {}

The key difference from regular generators: calling next() on an async generator returns a Promise that resolves to { value, done }, instead of returning { value, done } directly.

async function* simple() {
yield 1;
yield 2;
yield 3;
}

const gen = simple();

// Each next() returns a Promise
const result1 = gen.next();
console.log(result1); // Promise { <pending> }

// Await the promise to get the value
console.log(await result1); // { value: 1, done: false }
console.log(await gen.next()); // { value: 2, done: false }
console.log(await gen.next()); // { value: 3, done: false }
console.log(await gen.next()); // { value: undefined, done: true }

Mixing await and yield

Inside an async generator, await pauses until a promise resolves, and yield pauses to send a value out. They work together naturally:

async function* fetchUserNames(userIds) {
for (const id of userIds) {
// await pauses for the async operation
const response = await fetch(`https://jsonplaceholder.typicode.com/users/${id}`);
const user = await response.json();

// yield pauses to emit the result
yield user.name;
}
}

// Usage (inside an async context)
const gen = fetchUserNames([1, 2, 3]);

console.log(await gen.next()); // { value: "Leanne Graham", done: false }
console.log(await gen.next()); // { value: "Ervin Howell", done: false }
console.log(await gen.next()); // { value: "Clementine Bauch", done: false }
console.log(await gen.next()); // { value: undefined, done: true }

Each user is fetched only when the consumer asks for the next value. No unnecessary requests are made.

Error Handling Inside Async Generators

Async generators support try...catch for both synchronous errors and rejected promises:

async function* resilientFetcher(urls) {
for (const url of urls) {
try {
const response = await fetch(url);

if (!response.ok) {
throw new Error(`HTTP ${response.status} for ${url}`);
}

const data = await response.json();
yield { url, data, error: null };
} catch (error) {
yield { url, data: null, error: error.message };
}
}
}

const urls = [
"https://jsonplaceholder.typicode.com/posts/1",
"https://invalid-url.example/fail",
"https://jsonplaceholder.typicode.com/posts/2"
];

for await (const result of resilientFetcher(urls)) {
if (result.error) {
console.log(`Failed: ${result.url} - ${result.error}`);
} else {
console.log(`Success: ${result.url} - "${result.data.title}"`);
}
}

The generator does not crash when a fetch fails. It catches the error, yields an error result, and continues to the next URL.

throw() and return() on Async Generators

Async generators support the same throw() and return() methods as regular generators, but they return promises:

async function* managed() {
try {
yield 1;
yield 2;
yield 3;
} catch (e) {
console.log("Generator caught:", e.message);
yield "recovered";
} finally {
console.log("Cleanup in finally");
}
}

const gen = managed();

console.log(await gen.next()); // { value: 1, done: false }

// Inject an error
console.log(await gen.throw(new Error("external failure")));
// Generator caught: external failure
// { value: "recovered", done: false }

// Force completion
console.log(await gen.return("done"));
// Cleanup in finally
// { value: "done", done: true }

for await...of: Async Iteration

The for await...of loop is the primary way to consume async generators and async iterables. It awaits each value automatically, making the consumption code clean and sequential.

Basic Usage

async function* countdown(from, delay) {
for (let i = from; i > 0; i--) {
await new Promise(r => setTimeout(r, delay));
yield i;
}
await new Promise(r => setTimeout(r, delay));
yield "Go!";
}

// for await...of handles all the awaiting automatically
async function run() {
for await (const value of countdown(3, 1000)) {
console.log(value);
}
console.log("Finished!");
}

run();
// (1 second) 3
// (1 second) 2
// (1 second) 1
// (1 second) Go!
// Finished!

Without for await...of, the equivalent code would be verbose:

// Manual iteration: what for await...of does under the hood
async function runManual() {
const gen = countdown(3, 1000);

while (true) {
const { value, done } = await gen.next();
if (done) break;
console.log(value);
}

console.log("Finished!");
}

The for await...of loop handles calling next(), awaiting the returned promise, extracting value, and checking done automatically.

Error Handling with for await...of

Errors thrown inside the async generator or from rejected promises propagate out of the for await...of loop and can be caught with try...catch:

async function* unstable() {
yield 1;
yield 2;
throw new Error("Generator exploded at item 3");
yield 4; // Never reached
}

async function consume() {
try {
for await (const value of unstable()) {
console.log("Received:", value);
}
} catch (e) {
console.log("Caught:", e.message);
}
}

consume();
// Received: 1
// Received: 2
// Caught: Generator exploded at item 3

Early Termination with break

When you break out of a for await...of loop, it automatically calls return() on the async generator, triggering any finally blocks:

async function* infiniteStream() {
let i = 0;
try {
while (true) {
await new Promise(r => setTimeout(r, 100));
yield i++;
}
} finally {
console.log("Stream closed, cleanup performed");
}
}

async function consumePartially() {
for await (const value of infiniteStream()) {
console.log(value);
if (value >= 4) break; // Triggers return() → finally block
}
}

consumePartially();
// 0
// 1
// 2
// 3
// 4
// Stream closed, cleanup performed

This cleanup guarantee is essential for streams that manage resources like network connections or file handles.

for await...of with Regular Iterables of Promises

for await...of is not limited to async generators. It also works with regular iterables that contain promises:

// Array of promises
const promises = [
fetch("https://jsonplaceholder.typicode.com/posts/1").then(r => r.json()),
fetch("https://jsonplaceholder.typicode.com/posts/2").then(r => r.json()),
fetch("https://jsonplaceholder.typicode.com/posts/3").then(r => r.json())
];

async function processSequentially() {
for await (const post of promises) {
console.log(post.title);
}
}

processSequentially();
Caution:

When using for await...of with an array of promises, all promises start executing immediately (when the array is created). The loop just awaits them in order. This is different from an async generator, where each async operation starts only when the next value is requested. For true lazy sequential execution, use an async generator.

Async Iterables and the Symbol.asyncIterator Protocol

Just as regular iterables implement Symbol.iterator, async iterables implement Symbol.asyncIterator. This protocol defines how objects expose an asynchronous sequence of values.

The Async Iteration Protocol

An object is async iterable if it has a [Symbol.asyncIterator]() method that returns an async iterator. An async iterator is an object with a next() method that returns a Promise resolving to { value, done }.

// Manual implementation of the async iterable protocol
const asyncRange = {
from: 1,
to: 5,

[Symbol.asyncIterator]() {
let current = this.from;
const last = this.to;

return {
async next() {
// Simulate async work
await new Promise(r => setTimeout(r, 300));

if (current <= last) {
return { value: current++, done: false };
}
return { value: undefined, done: true };
}
};
}
};

async function run() {
for await (const num of asyncRange) {
console.log(num);
}
}

run();
// (300ms) 1
// (300ms) 2
// (300ms) 3
// (300ms) 4
// (300ms) 5

Using Generators to Simplify Async Iterables

Writing the protocol manually is verbose. Async generators make it trivial:

// Manual protocol: lots of boilerplate
const manualAsyncIterable = {
[Symbol.asyncIterator]() {
let i = 0;
return {
next() {
if (i < 3) {
return new Promise(resolve => {
setTimeout(() => {
resolve({ value: i++, done: false });
}, 100);
});
}
return Promise.resolve({ value: undefined, done: true });
},
return() {
console.log("Cleanup");
return Promise.resolve({ value: undefined, done: true });
}
};
}
};

// Async generator: clean and readable
const generatorAsyncIterable = {
async *[Symbol.asyncIterator]() {
for (let i = 0; i < 3; i++) {
await new Promise(r => setTimeout(r, 100));
yield i;
}
}
};

// Both work identically with for await...of
async function test() {
for await (const val of generatorAsyncIterable) {
console.log(val);
}
}

The generator version handles all the protocol details (next, return, promise wrapping, done tracking) automatically.

Making a Class Async Iterable

class EventStream {
#events;
#delay;

constructor(events, delay = 500) {
this.#events = events;
this.#delay = delay;
}

async *[Symbol.asyncIterator]() {
for (const event of this.#events) {
await new Promise(r => setTimeout(r, this.#delay));
yield {
...event,
timestamp: new Date().toISOString()
};
}
}
}

const stream = new EventStream([
{ type: "click", target: "button" },
{ type: "input", target: "textfield" },
{ type: "submit", target: "form" }
], 300);

async function processEvents() {
for await (const event of stream) {
console.log(`[${event.timestamp}] ${event.type} on ${event.target}`);
}
}

processEvents();
// [2024-01-15T10:00:00.300Z] click on button
// [2024-01-15T10:00:00.600Z] input on textfield
// [2024-01-15T10:00:00.900Z] submit on form

Symbol.asyncIterator vs. Symbol.iterator

An object can implement both protocols. for await...of first looks for Symbol.asyncIterator. If it does not find one, it falls back to Symbol.iterator and wraps each value in a resolved promise:

const dualIterable = {
// Sync version
*[Symbol.iterator]() {
yield "sync-1";
yield "sync-2";
},

// Async version
async *[Symbol.asyncIterator]() {
await new Promise(r => setTimeout(r, 100));
yield "async-1";
await new Promise(r => setTimeout(r, 100));
yield "async-2";
}
};

// for...of uses Symbol.iterator
for (const val of dualIterable) {
console.log(val); // "sync-1", "sync-2"
}

// for await...of prefers Symbol.asyncIterator
async function run() {
for await (const val of dualIterable) {
console.log(val); // "async-1", "async-2"
}
}
Info:

Spread syntax ([...iterable]), Array.from(), and destructuring do not work with async iterables. They only support the synchronous Symbol.iterator protocol. To collect async iterable values into an array, use a for await...of loop with manual accumulation:

async function collectAsync(asyncIterable) {
const results = [];
for await (const item of asyncIterable) {
results.push(item);
}
return results;
}

// Or with Array.fromAsync (ES2024)
const results = await Array.fromAsync(asyncIterable);

Async Iterator Helpers (Coming Soon)

The TC39 Async Iterator Helpers proposal introduces methods like map, filter, take, and toArray directly on async iterators. Its synchronous counterpart, Iterator Helpers, has advanced further through the standards process, but async support is still limited:

// Future / polyfilled syntax
async function* numbers() {
for (let i = 1; i <= 10; i++) {
await new Promise(r => setTimeout(r, 50));
yield i;
}
}

// When available:
// const evens = numbers().filter(n => n % 2 === 0);
// const doubled = evens.map(n => n * 2);
// const result = await doubled.toArray();
// console.log(result); // [4, 8, 12, 16, 20]

Until these are widely available, you can build similar utilities yourself:

async function* asyncMap(asyncIterable, fn) {
for await (const item of asyncIterable) {
yield fn(item);
}
}

async function* asyncFilter(asyncIterable, predicate) {
for await (const item of asyncIterable) {
if (predicate(item)) {
yield item;
}
}
}

async function* asyncTake(asyncIterable, count) {
let taken = 0;
for await (const item of asyncIterable) {
yield item;
if (++taken >= count) return;
}
}

async function asyncToArray(asyncIterable) {
const result = [];
for await (const item of asyncIterable) {
result.push(item);
}
return result;
}

Real-World Use Cases: Paginated APIs, Streaming Data

Async generators shine in scenarios where data arrives over time or in chunks. Here are the patterns you will use most often.

Paginated API Consumption

Most REST APIs return data in pages. An async generator can abstract away all pagination logic, presenting a clean stream of items:

async function* fetchAllUsers(baseUrl, pageSize = 10) {
let page = 1;
let hasMore = true;

while (hasMore) {
const url = `${baseUrl}/users?page=${page}&limit=${pageSize}`;
const response = await fetch(url);

if (!response.ok) {
throw new Error(`API error: ${response.status} ${response.statusText}`);
}

const data = await response.json();

for (const user of data.results) {
yield user;
}

hasMore = data.nextPage !== null;
page++;
}
}

// The consumer sees a simple stream of users: no pagination details
async function findUser(name) {
for await (const user of fetchAllUsers("https://api.example.com")) {
if (user.name === name) {
console.log("Found:", user);
return user; // returning exits the loop and closes the generator: no wasted requests
}
}
console.log("User not found");
return null;
}

The generator fetches pages on demand. If the user is found on page 1, pages 2 through N are never requested. The consumer does not know or care about pagination.

Cursor-Based Pagination (GitHub API Style)

Many modern APIs use opaque cursors instead of page numbers. GitHub, for example, advertises the next page's URL in the Link response header:

async function* fetchGitHubRepos(org) {
let url = `https://api.github.com/orgs/${org}/repos?per_page=30`;

while (url) {
const response = await fetch(url, {
headers: { "Accept": "application/vnd.github.v3+json" }
});

if (!response.ok) {
throw new Error(`GitHub API error: ${response.status}`);
}

const repos = await response.json();

for (const repo of repos) {
yield {
name: repo.name,
stars: repo.stargazers_count,
language: repo.language
};
}

// Parse the Link header for the next page URL
const linkHeader = response.headers.get("Link");
const nextLink = linkHeader?.match(/<([^>]+)>;\s*rel="next"/);
url = nextLink ? nextLink[1] : null;
}
}

async function getTopRepos(org, minStars = 100) {
const popular = [];

for await (const repo of fetchGitHubRepos(org)) {
if (repo.stars >= minStars) {
popular.push(repo);
}
}

return popular.sort((a, b) => b.stars - a.stars);
}

Polling with Backoff

async function* pollEndpoint(url, options = {}) {
const {
interval = 2000,
maxInterval = 30000,
backoffMultiplier = 1.5,
maxAttempts = Infinity
} = options;

let currentInterval = interval;
let attempts = 0;

while (attempts < maxAttempts) {
try {
const response = await fetch(url);
const data = await response.json();

yield {
data,
attempt: ++attempts,
timestamp: new Date().toISOString()
};

// Reset interval on success
currentInterval = interval;
} catch (error) {
yield {
error: error.message,
attempt: ++attempts,
timestamp: new Date().toISOString()
};

// Increase interval on failure (exponential backoff)
currentInterval = Math.min(currentInterval * backoffMultiplier, maxInterval);
}

await new Promise(r => setTimeout(r, currentInterval));
}
}

// Monitor a health endpoint
async function monitorService() {
const poller = pollEndpoint("https://api.example.com/health", {
interval: 5000,
maxAttempts: 100
});

for await (const result of poller) {
if (result.error) {
console.warn(`[${result.timestamp}] Health check failed: ${result.error}`);
} else if (result.data.status !== "healthy") {
console.warn(`[${result.timestamp}] Service degraded:`, result.data);
} else {
console.log(`[${result.timestamp}] Service healthy`);
}
}
}

Streaming Large Files in Chunks

async function* readFileInChunks(file, chunkSize = 64 * 1024) {
const reader = file.stream().getReader();
let buffer = new Uint8Array(0);

try {
while (true) {
const { value, done } = await reader.read();

if (done) {
// Yield remaining buffer
if (buffer.length > 0) {
yield buffer;
}
break;
}

// Append new data to buffer
const combined = new Uint8Array(buffer.length + value.length);
combined.set(buffer);
combined.set(value, buffer.length);
buffer = combined;

// Yield complete chunks
while (buffer.length >= chunkSize) {
yield buffer.slice(0, chunkSize);
buffer = buffer.slice(chunkSize);
}
}
} finally {
reader.releaseLock();
}
}

// Process a file upload chunk by chunk
async function uploadFile(file) {
let uploaded = 0;

for await (const chunk of readFileInChunks(file)) {
await sendChunkToServer(chunk, uploaded);
uploaded += chunk.length;
console.log(`Uploaded ${uploaded} / ${file.size} bytes`);
}

console.log("Upload complete");
}

Server-Sent Events as an Async Iterable

async function* sseStream(url) {
const eventSource = new EventSource(url);

// Create a queue-based bridge between EventSource callbacks and the generator
const queue = [];
let resolve = null;
let done = false;

eventSource.onmessage = (event) => {
const data = JSON.parse(event.data);
if (resolve) {
const r = resolve;
resolve = null;
r(data);
} else {
queue.push(data);
}
};

eventSource.onerror = () => {
done = true;
eventSource.close();
if (resolve) {
const r = resolve;
resolve = null;
r(null);
}
};

try {
while (!done) {
if (queue.length > 0) {
yield queue.shift();
} else {
const data = await new Promise(r => { resolve = r; });
if (data === null) break;
yield data;
}
}
} finally {
eventSource.close();
}
}

// Consume live notifications
async function listenForNotifications() {
for await (const notification of sseStream("/api/notifications")) {
console.log("New notification:", notification);
updateUI(notification);
}
}

Processing Pipeline

Async generators compose beautifully into data processing pipelines where each stage transforms the stream:

// Stage 1: Fetch raw data pages
async function* fetchLogPages(baseUrl) {
let page = 1;
while (true) {
const response = await fetch(`${baseUrl}/logs?page=${page}`);
const data = await response.json();

if (data.logs.length === 0) break;

for (const log of data.logs) {
yield log;
}
page++;
}
}

// Stage 2: Filter by severity
async function* filterBySeverity(logs, minSeverity) {
const levels = { debug: 0, info: 1, warn: 2, error: 3, fatal: 4 };
const minLevel = levels[minSeverity] || 0;

for await (const log of logs) {
if ((levels[log.level] || 0) >= minLevel) {
yield log;
}
}
}

// Stage 3: Enrich with user data
async function* enrichWithUserData(logs) {
const userCache = new Map();

for await (const log of logs) {
if (log.userId && !userCache.has(log.userId)) {
const response = await fetch(`/api/users/${log.userId}`);
userCache.set(log.userId, await response.json());
}

yield {
...log,
user: userCache.get(log.userId) || null
};
}
}

// Stage 4: Batch for bulk insert
async function* batch(items, size) {
let buffer = [];

for await (const item of items) {
buffer.push(item);
if (buffer.length >= size) {
yield buffer;
buffer = [];
}
}

if (buffer.length > 0) {
yield buffer;
}
}

// Compose the pipeline
async function processLogs() {
const rawLogs = fetchLogPages("https://api.example.com");
const errorLogs = filterBySeverity(rawLogs, "error");
const enrichedLogs = enrichWithUserData(errorLogs);
const batches = batch(enrichedLogs, 50);

for await (const logBatch of batches) {
await saveToDatabase(logBatch);
console.log(`Saved batch of ${logBatch.length} error logs`);
}
}

Each stage in the pipeline is an async generator that consumes from the previous stage and yields to the next. Data flows through one item at a time. No stage buffers the entire dataset in memory.

Timeout and Cancellation

async function* withTimeout(asyncIterable, timeoutMs) {
const iterator = asyncIterable[Symbol.asyncIterator]();

try {
while (true) {
const result = await Promise.race([
iterator.next(),
new Promise((_, reject) =>
setTimeout(() => reject(new Error("Iteration timeout")), timeoutMs)
)
]);

if (result.done) break;
yield result.value;
}
} finally {
// Ensure the inner iterator is properly closed
if (iterator.return) {
await iterator.return();
}
}
}

// Usage: abort if any single item takes more than 5 seconds
async function processWithTimeout() {
const dataStream = fetchAllUsers("https://api.example.com");

try {
for await (const user of withTimeout(dataStream, 5000)) {
console.log("Processing:", user.name);
}
} catch (e) {
if (e.message === "Iteration timeout") {
console.log("Stream timed out - processing stopped");
} else {
throw e;
}
}
}

Comparing Async Generators with Other Approaches

To understand when async generators are the right choice, here is how the same problem (fetching all pages from an API) looks with different approaches:

// Approach 1: Collect everything into an array first
async function fetchAllAtOnce(url) {
const allItems = [];
let page = 1;
let hasMore = true;

while (hasMore) {
const response = await fetch(`${url}?page=${page}`);
const data = await response.json();
allItems.push(...data.results);
hasMore = data.nextPage !== null;
page++;
}

return allItems; // Must wait for ALL pages before processing starts
}
// Problem: high memory usage, long wait before first result

// Approach 2: Callback-based
async function fetchWithCallback(url, onItem) {
let page = 1;
let hasMore = true;

while (hasMore) {
const response = await fetch(`${url}?page=${page}`);
const data = await response.json();

for (const item of data.results) {
await onItem(item); // Hard to compose, hard to break early
}

hasMore = data.nextPage !== null;
page++;
}
}
// Problem: inversion of control, hard to compose or early-terminate

// Approach 3: Async generator (recommended)
async function* fetchAsStream(url) {
let page = 1;
let hasMore = true;

while (hasMore) {
const response = await fetch(`${url}?page=${page}`);
const data = await response.json();

for (const item of data.results) {
yield item; // Consumer controls the pace
}

hasMore = data.nextPage !== null;
page++;
}
}
// Advantages: lazy, composable, breakable, low memory
Approach        | Memory                   | Time to First Item         | Composable              | Early Termination
Collect all     | High (stores everything) | Slow (waits for all pages) | Limited                 | No
Callbacks       | Low                      | Fast                       | Hard                    | Hard
Async generator | Low                      | Fast                       | Easy (yield*, pipeline) | Easy (break)

Summary

Async generators bridge the gap between lazy iteration and asynchronous data sources. They let you write clean, sequential-looking code that processes data on demand, one piece at a time, while handling network latency, pagination, and streaming transparently.

Concept                      | Key Point
async function*              | Combines async (can await) with generator (can yield). Returns an async generator object.
next() returns a Promise     | Each next() call returns Promise<{ value, done }>, not { value, done } directly.
for await...of               | The standard way to consume async generators and async iterables. Awaits each value automatically.
Symbol.asyncIterator         | The protocol for async iterables. Must return an object with a next() method returning promises.
Lazy execution               | Values are produced only when requested. Unneeded pages, chunks, or records are never fetched.
Early termination            | break in for await...of calls return() on the generator, triggering finally cleanup.
Error handling               | try...catch works inside async generators for both sync errors and rejected promises.
Pipeline composition         | Async generators can consume other async generators, forming data processing pipelines.
for await...of with promises | Works with regular arrays of promises, but all promises start immediately (not lazy).
Array.fromAsync()            | ES2024 method to collect async iterable values into an array.

Key rules to remember:

  • Use async generators when data arrives over time or in pages and you want to process it incrementally
  • The argument passed to the first next() call is ignored, just like with regular generators
  • for await...of only works in async functions (or at the top level of ES modules)
  • break and return inside for await...of properly clean up the generator via return()
  • Async generators are single-use: once exhausted, they stay done
  • Prefer async generators over collecting everything into an array when the dataset is large or potentially unbounded
  • Compose pipelines by writing async generators that consume other async generators via for await...of