Use Cases for JavaScript Generators
JavaScript generators (function*) have been part of the language since ES2015, yet many frontend developers still reach for arrays or promise chains when generators would be a cleaner fit. The core value isn’t raw speed — it’s laziness, composability, and precise control over iteration. This article covers where generators genuinely earn their place in modern frontend code.
Key Takeaways
- Generators produce values on demand through lazy evaluation, avoiding unnecessary computation and intermediate arrays
- The Iterator Helpers API brings built-in map, filter, and take methods to generator-returned iterators, removing the need for custom utility functions
- yield* makes recursive traversal of trees and graphs both readable and lazy
- Async generators (async function*) pair with for await...of to handle paginated or batched data fetching with minimal state management
What Makes JavaScript Generators Different
A generator function returns an iterator. Calling the function doesn’t execute any code — it hands you an object with a .next() method. Each call to .next() runs the function body until the next yield, then pauses, preserving local state across calls.
function* range(start, end) {
for (let i = start; i < end; i++) yield i
}
for (const n of range(0, 5)) {
console.log(n) // 0, 1, 2, 3, 4
}
Because generators implement the iterator protocol, they work directly with for...of, spread syntax, and destructuring — no adapter needed.
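The pause-and-resume behavior is easiest to see by driving the iterator by hand. A minimal sketch, restating the range generator from above so the snippet runs standalone:

```javascript
function* range(start, end) {
  for (let i = start; i < end; i++) yield i
}

// Calling the generator runs no body code — it returns an iterator
const it = range(0, 3)
console.log(it.next()) // { value: 0, done: false }
console.log(it.next()) // { value: 1, done: false }
console.log(it.next()) // { value: 2, done: false }
console.log(it.next()) // { value: undefined, done: true }

// The same iterator protocol powers destructuring and spread
const [first, second] = range(10, 20)
console.log(first, second)    // 10 11
console.log([...range(0, 4)]) // [0, 1, 2, 3]
```

Note that destructuring only pulls the values it needs: the range(10, 20) iterator above is advanced twice and then abandoned.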
Lazy Iteration in JavaScript: Processing Data Without Materializing It
The primary reason to use a generator over an array is lazy iteration: values are produced only when requested. This matters when:
- The full dataset is large and you only need part of it
- Computing each value is expensive
- The sequence is conceptually infinite
function* naturals() {
let n = 0
while (true) yield n++
}
// Only computes values up to the break point
for (const n of naturals()) {
if (n > 100) break
}
No intermediate array is created. No values beyond the break point are computed.
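Laziness also composes. As a sketch, here are hand-rolled map and take helpers (illustrative names, not a library API) that each pull from their source only when asked, so chaining them never materializes an intermediate array:

```javascript
// Each helper is itself a generator that pulls lazily from its source
function* map(iterable, fn) {
  for (const x of iterable) yield fn(x)
}

function* take(iterable, n) {
  let i = 0
  for (const x of iterable) {
    if (i++ >= n) return
    yield x
  }
}

function* naturals() {
  let n = 0
  while (true) yield n++
}

// Only the first three squares are ever computed,
// even though the source sequence is infinite
console.log([...take(map(naturals(), n => n * n), 3)]) // [0, 1, 4]
```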
The Iterator Helpers API: Built-in Lazy Pipelines
Writing custom map, filter, and take utilities used to be necessary boilerplate. The Iterator Helpers API — now available in current versions of all major browsers — adds these methods directly to synchronous iterators:
const result = naturals()
.filter(n => n % 2 === 0)
.map(n => n * n)
.take(5)
.toArray() // [0, 4, 16, 36, 64]
Each step is lazy. .toArray() is what triggers evaluation. This makes generator-based pipelines significantly cleaner without third-party libraries. Note that these helpers apply to synchronous iterators — async iterator helpers are not yet standardized across environments.
Tree and Graph Traversal
Generators are a natural fit for traversing recursive structures. Depth-first traversal of a DOM-like tree becomes straightforward:
function* walkTree(node) {
yield node
for (const child of node.children ?? []) {
yield* walkTree(child)
}
}
for (const node of walkTree(rootNode)) {
if (node.type === 'input') processInput(node)
}
yield* delegates to a nested generator, keeping the recursion readable and the traversal lazy — you stop as soon as you find what you need.
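To make the traversal concrete, here is the same walker run against a small sample tree (the object shape with children and type fields is hypothetical, mirroring the snippet above):

```javascript
function* walkTree(node) {
  yield node
  for (const child of node.children ?? []) {
    yield* walkTree(child)
  }
}

// A small sample tree (hypothetical shape for illustration)
const rootNode = {
  type: 'form',
  children: [
    { type: 'fieldset', children: [{ type: 'input', name: 'email' }] },
    { type: 'input', name: 'password' },
  ],
}

// Depth-first search, stopping at the first match —
// the second input is never visited
let found = null
for (const node of walkTree(rootNode)) {
  if (node.type === 'input') { found = node; break }
}
console.log(found.name) // 'email'
```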
Async Generators in JavaScript: Paginated and Batched Data Fetching
async function* extends the pattern to asynchronous sequences. Combined with for await...of, it’s well-suited for paginated API responses:
async function* fetchPages(url) {
let nextUrl = url
while (nextUrl) {
const res = await fetch(nextUrl)
const data = await res.json()
yield data.items
nextUrl = data.nextPage ?? null
}
}
for await (const batch of fetchPages('/api/records')) {
processBatch(batch)
}
Each page is fetched only when the loop advances. There’s no need to collect all pages upfront or manage pagination state externally — the generator holds it.
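A sketch of the early-exit behavior, using an in-memory page map in place of fetch (the URLs and the items/nextPage response shape are hypothetical, matching the snippet above):

```javascript
// Mock page source standing in for a paginated API
const pages = {
  '/api/records?page=1': { items: [1, 2], nextPage: '/api/records?page=2' },
  '/api/records?page=2': { items: [3, 4], nextPage: '/api/records?page=3' },
  '/api/records?page=3': { items: [5], nextPage: null },
}

async function* fetchPages(url) {
  let nextUrl = url
  while (nextUrl) {
    const data = pages[nextUrl] // stands in for await fetch(...).json()
    yield data.items
    nextUrl = data.nextPage ?? null
  }
}

async function main() {
  const collected = []
  for await (const batch of fetchPages('/api/records?page=1')) {
    collected.push(...batch)
    if (collected.length >= 3) break // page 3 is never requested
  }
  console.log(collected) // [1, 2, 3, 4]
}

main()
```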
When Not to Use Generators
Generators add indirection. For a simple array transformation you’ll consume entirely, chained array methods are clearer. Use generators when the sequence is large, potentially infinite, or when you need to stop early without wasted computation.
Conclusion
JavaScript generators shine in three areas: lazy iteration over large or infinite sequences, composable data pipelines (especially with the Iterator Helpers API), and async data fetching where you need sequential, stateful control. They’re not a replacement for arrays or async/await — they’re the right tool when you need to produce values on demand rather than all at once.
FAQs
Can I use generators in React components?
Yes. Generators work well for producing sequences of data that React components consume. You can call a generator inside a useEffect or useMemo hook to lazily compute values. However, don't use a generator as the component function itself — React expects components to return JSX, not iterators.
What happens if a generator is never fully consumed?
The generator stays paused at its last yield point. It becomes eligible for garbage collection once no references to its iterator object remain. If you need cleanup logic to run when iteration stops early, wrap the yield in a try-finally block. The finally block executes when the iterator's return method is called — which happens automatically when you break out of a for...of loop. It does not run on garbage collection, so don't rely on collection for cleanup.
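A minimal sketch of that cleanup pattern:

```javascript
let cleaned = false

function* resource() {
  try {
    yield 1
    yield 2
    yield 3
  } finally {
    cleaned = true // runs even when iteration stops early
  }
}

for (const v of resource()) {
  if (v === 2) break // break calls the iterator's return(), running finally
}

console.log(cleaned) // true
```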
Are generators slower than arrays?
For small, fully consumed collections, generators carry slight overhead from the pause-and-resume mechanism. The performance difference is negligible in most applications. Generators become faster in practice when you process large datasets partially, because they avoid allocating intermediate arrays and skip computation for values you never request.
How do async generators differ from collecting results with Promises?
An async generator yields values incrementally as they become available, while a Promise-based approach waits until all data is collected before returning. This means async generators let you start processing the first batch of results immediately, reduce peak memory usage, and give you finer control over when each subsequent fetch occurs.
Complete picture for complete understanding
Capture every clue your frontend is leaving so you can instantly get to the root cause of any issue with OpenReplay — the open-source session replay tool for developers. Self-host it in minutes, and have complete control over your customer data.
Check our GitHub repo and join the thousands of developers in our community.