Type-safe errors in TypeScript with generators
Discover why most type-safe error handling libraries in TypeScript are built using generators.
Recently, a lot of libraries have been created to handle errors in TypeScript in a type-safe way. In TypeScript we can use a try/catch block to handle errors, but it isn't type-safe: we have no way to know, at build time, the specific type of the error we should handle. We can use the `instanceof` operator to check an error's type, but that isn't a great practice either, because it still isn't truly type-safe in the way the solutions of other languages, like Java or Rust, are.
To solve this problem, several such libraries have appeared.
Most of them have something in common: they are built with JavaScript generators. Some people think generators are just nicer syntax, but that's not the case. In fact, they aren't nicer syntax at all if you are already familiar with async/await and try/catch blocks. Generators are used because they unlock a very specific combination of control flow and type inference that is extremely hard (arguably impossible) to reproduce cleanly with plain functions, promises, or method chaining.
Let’s build the concept step by step.
First: What is an iterator?
Before talking about generators, we need to understand what an iterator is. An iterator is any object that follows a specific interface:
```ts
interface Iterator<T> {
  next(): {
    value: T;
    done: boolean;
  };
}
```
So an iterator is just something with a .next() method that returns { value, done }. And an Iterable is any object that has a [Symbol.iterator] method that returns an iterator.
```ts
interface Iterable<T> {
  [Symbol.iterator](): Iterator<T>;
}
```
What generators actually are
A generator is a function that can pause and resume execution. Generators are also a tool for creating iterators, with some extra ingredients that we will see later.
```ts
function* generatorDemo() {
  const x = yield 1;
  return x + 1;
}

const it = generatorDemo();
console.log(it.next());  // { value: 1, done: false }
console.log(it.next(5)); // { value: 6, done: true }
```
Key idea:
- `yield` pauses execution and returns a value
- `.next(value)` resumes execution and injects a value back in
So a generator is basically a controllable execution that you can step through manually. But the real magic is yield*.
```ts
function* inner() {
  yield 1;
  yield 2;
  return 3;
}

function* outer() {
  const result = yield* inner();
  console.log(result);
}

const it = outer();
const a = it.next(); // 1 (nothing is logged)
const b = it.next(); // 2 (nothing is logged)
const c = it.next(); // logs 3
console.log(a, b, c);
// Output:
// { value: 1, done: false }
// { value: 2, done: false }
// { value: undefined, done: true }
```
Some technical details that are important to understand:
- `function*` is syntactic sugar for a function that returns a generator. When it is called, the body is not executed immediately; the call returns a generator object that we can use to control the execution.
- `yield` is a keyword that pauses the execution and returns a value.
- `yield*` is a keyword that pauses the execution and delegates to another generator (or any iterable).
- `.next()` is a method that resumes the execution and returns the value of the next `yield`.
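One consequence of that first point is worth seeing directly: calling a generator function executes none of its body. A small sketch (the `started` flag is only there for illustration):

```ts
let started = false;

function* lazy() {
  started = true; // runs only once .next() is called
  yield 1;
}

const g = lazy();                  // returns a generator object; the body has NOT run
const startedBeforeNext = started; // still false

const first = g.next();            // now the body runs up to the first yield

console.log(startedBeforeNext); // false
console.log(started);           // true
console.log(first);             // { value: 1, done: false }
```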
Let’s take a look at the step-by-step execution of the example.
Step 1: Create the iterator. Nothing runs yet; the generator is paused at the start.

```ts
const it = outer();
```

Step 2: First `.next()`.

```ts
const a = it.next();
```

Execution begins inside `outer`:

```ts
const result = yield* inner();
```

This creates the `inner()` iterator and starts running it. Inside `inner()`:

```ts
yield 1;
```

So:

- `inner` yields `1`
- `yield*` forwards that yield outward
- `outer` pauses. Nothing is logged yet.

Result:

```ts
a = { value: 1, done: false }
```

Step 3: Second `.next()`.

```ts
const b = it.next();
```

We resume where we left off, still inside `yield* inner()`. `inner()` continues:

```ts
yield 2;
```

So:

- `inner` yields `2`
- `yield*` forwards it
- `outer` pauses again

Result:

```ts
b = { value: 2, done: false }
```

Still no `console.log`.

Step 4: Third `.next()`.

```ts
const c = it.next();
```

We resume again. Now `inner()` continues:

```ts
return 3;
```

Important:

- this does not yield
- it finishes the iterator
- it returns `{ value: 3, done: true }` to `yield*`

Now `yield*` evaluates to that return value:

```ts
const result = 3;
```

So `outer` continues:

```ts
console.log(result); // logs 3
```

Then `outer` finishes (no `return`, so `undefined`), and the final result is:

```ts
c = { value: undefined, done: true }
```

Final output:

```ts
a = { value: 1, done: false }
b = { value: 2, done: false }
c = { value: undefined, done: true }
```

During the third `.next()`, the number `3` is logged because it is the return value of the `inner()` generator.

This line:

```ts
const result = yield* inner();
```

means: "Run `inner()` completely. Forward all its yields. When it finishes, give me its return value."
Note: at this point you should be able to explain what an iterator is, how generators work, how they are used to create iterators, and the difference between `yield` and `yield*` and how the two work together. If you don't understand this yet, reread this introduction and its examples, because it is the foundation of the rest of the article.
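As a quick self-check, the following sketch combines both mechanisms: the argument of `.next()` is injected into the paused `inner`, and `yield*` hands `inner`'s return value back to `outer`.

```ts
function* inner(): Generator<string, number, number> {
  const x = yield "need a number"; // pause; x will be whatever .next(value) passes in
  return x * 2;
}

function* outer() {
  const r = yield* inner(); // r is inner's *return* value, not its yields
  yield r;
}

const g = outer();
const step1 = g.next();   // inner's yield, forwarded outward
const step2 = g.next(10); // injects 10 into inner; inner returns 20; outer yields it
const step3 = g.next();   // outer finishes

console.log(step1); // { value: "need a number", done: false }
console.log(step2); // { value: 20, done: false }
console.log(step3); // { value: undefined, done: true }
```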
Iterators that explicitly yield (control flow in action)
Now let’s build an iterator that always yields once:
```ts
const AlwaysYield = (value: string) => ({
  *[Symbol.iterator]() {
    console.log("about to yield");
    yield value;
    console.log("this runs only if resumed");
    return "done";
  },
});
```
Using it:
```ts
function* test() {
  const result = yield* AlwaysYield("pause here");
  console.log("result:", result);
}

const it = test();
it.next();
// logs: "about to yield"
// { value: "pause here", done: false }
```
Notice:
- execution paused
- `"result:"` was never logged
- the generator is now frozen
If we resume:
```ts
it.next();
// logs: "this runs only if resumed"
// logs: "result: done"
// { value: undefined, done: true }
```
Key takeaway:
If an iterator yields, it forces the outer generator to pause.
The trick: iterators don’t have to yield
Here’s the first “wait… what?” moment.
```ts
const Ok = <T>(value: T) => ({
  type: "Ok",
  value,
  *[Symbol.iterator]() {
    return this.value;
  },
});
```
```ts
function* example() {
  const a = yield* Ok(5);
  return a;
}

const it = example();
it.next(); // { value: 5, done: true }
```
There was no pause at all.
So if an iterator doesn't yield, `yield*` behaves like a normal function call.
The opposite: iterators that always yield
```ts
const Err = <E>(error: E) => ({
  type: "Err",
  error,
  *[Symbol.iterator]() {
    yield { type: "Err", error: this.error };
  },
});
```
```ts
function* example() {
  const a = yield* Err("boom");
  return a;
}

const it = example();
it.next();
// { value: { type: "Err", error: "boom" }, done: false }
```
Now the generator is paused. And nothing else runs.
Combining the two
We now have two behaviors:
| Type | Behavior |
|---|---|
| `Ok` | does NOT yield → execution continues |
| `Err` | yields once → execution pauses |
This is the entire foundation of generator-based error handling.
Now let’s write the simplest possible executor:
```ts
function run(gen: () => Generator<any, any, any>) {
  const it = gen();
  const state = it.next(); // only once
  return state.value;
}
```
That's it. If the generator never yields (everything is an `Ok` iterator), it runs to completion and `run` simply returns its value. If the generator yields (some `Err` was found, and its iterator contains a `yield` instruction), it pauses and is never resumed, so the returned value is the `Err` object.
Internally, the executor drives the generator as an iterator with a kind of early-exit mechanism for when an `Err` appears. It isn't a true early exit, because the generator is never stopped: it is simply left paused, and the `Err` object is returned.
Putting it together
```ts
const Ok = <T>(value: T) => ({
  type: "Ok",
  value,
  *[Symbol.iterator]() {
    return this.value;
  },
});

const Err = <E>(error: E) => ({
  type: "Err",
  error,
  *[Symbol.iterator]() {
    yield { type: "Err", error: this.error };
  },
});

function run(gen: () => Generator<any, any, any>) {
  const it = gen();
  const state = it.next();
  return state.value;
}
```
Success case (step-by-step)
```ts
const result = run(function* () {
  const a = yield* Ok(1);
  const b = yield* Ok(2);
  return yield* Ok(a + b);
});
```
Execution:
- Start the generator
- `yield* Ok(1)` → no yield → `a = 1`
- `yield* Ok(2)` → no yield → `b = 2`
- `return yield* Ok(3)` → no yield → the generator returns `3`

Output: `3`
Error case (step-by-step)
```ts
const result = run(function* () {
  const a = yield* Err("fail");
  const b = yield* Ok(2); // never runs
  return yield* Ok(a + b);
});
```
Execution:
- Start the generator
- `yield* Err("fail")` → `Err` yields → the generator pauses
- `run()` returns the yielded value immediately
Output:
```ts
{ type: "Err", error: "fail" }
```
Notice:
- `b` is never evaluated
- no `if` statements
- no exceptions
What this really means
This pattern works because we encode control flow into the iterator protocol.
- Success = “do nothing special”
- Failure = “pause execution”
And then we only run the generator once and treat the first pause as the result.
Now what I wanted to talk about: Why TypeScript loves this
Now we get to the real reason generators are used.
The yield type becomes a union
```ts
type Err<_T, E> = { type: "Err"; error: E };
type InferYieldErr<Y> = Y extends Err<never, infer E> ? E : never;
```
If your generator yields (at the type level):
```ts
Err<never, A> | Err<never, B>
```
TypeScript infers:
```ts
A | B
```
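Here is a tiny, self-contained sketch of that inference (the type names mirror the simplified shapes above and are assumptions for illustration, not a library API). The conditional type distributes over the union, so each `Err` variant contributes its error type:

```ts
type Err<_T, E> = { type: "Err"; error: E };

// Distributes over unions: Err<never, "A"> | Err<never, "B"> gives "A" | "B"
type InferYieldErr<Y> = Y extends Err<never, infer E> ? E : never;

type Extracted = InferYieldErr<Err<never, "A"> | Err<never, "B">>;

// Runtime smoke check: "A" and "B" are both assignable to Extracted.
const members: Extracted[] = ["A", "B"];
console.log(members); // ["A", "B"]
```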
Imagine we have the following types as a simpler implementation of the Result type included in some of the example libraries. Each variant is iterable so yield* can delegate: Ok completes immediately with the success value; Err yields once so the outer driver can observe the error and stop.
Libraries often expose [Symbol.iterator] as a generator method *[Symbol.iterator]() on Ok / Err: the body may yield or return, and TypeScript types it as Generator<Yield, Return, Next>. The leading * is JavaScript syntax for a generator function (including methods), not a separate TypeScript operator; the compiler only adds static checking of yields, completion values, and optional next arguments.
```ts
type Err<_T, E> = {
  type: "Err";
  error: E;
  [Symbol.iterator](): Generator<Err<never, E>, never, unknown>;
};

type Ok<T, E = never> = {
  type: "Ok";
  value: T;
  [Symbol.iterator](): Generator<Err<never, E>, T, unknown>;
};

type Result<T, E> = Ok<T, E> | Err<T, E>;

/** Any Ok/Err pair — use `unknown` so inference flows through `R` in `Result.gen`. */
type AnyResult = Ok<unknown, unknown> | Err<unknown, unknown>;

function ok<T, E = never>(value: T): Ok<T, E> {
  return {
    type: "Ok",
    value,
    *[Symbol.iterator](): Generator<Err<never, E>, T, unknown> {
      return this.value;
    },
  };
}

function err<T = never, E = unknown>(error: E): Err<T, E> {
  const result: Err<T, E> = {
    type: "Err",
    error,
    *[Symbol.iterator](): Generator<Err<never, E>, never, unknown> {
      yield result as unknown as Err<never, E>;
      return undefined as never;
    },
  };
  return result;
}
```
```ts
/** Discriminated — keeps `ErrorA | ErrorB` in error unions (plain `msg: string` classes collapse). */
class ErrorA {
  readonly _tag = "ErrorA" as const;
}

class ErrorB {
  readonly _tag = "ErrorB" as const;
}

const getA = (): Result<number, ErrorA> => ok(1);
const getB = (): Result<number, ErrorB> => err(new ErrorB());
```
```ts
type InferYieldErr<Y> = Y extends Err<never, infer E> ? E : never;
type InferOk<R> = R extends Ok<infer T, unknown> ? T : never;
type InferErr<R> = R extends Err<unknown, infer E> ? E : never;

/**
 * Infer `Yield` and `R` from `Generator<Yield, R, unknown>` directly.
 * Avoids `G extends Generator<infer Y, …>` — that path can lose members of the yield union.
 */
function gen<Yield extends Err<never, unknown>, R extends AnyResult>(
  fn: () => Generator<Yield, R, unknown>,
): Result<InferOk<R>, InferYieldErr<Yield> | InferErr<R>> {
  const it = fn();
  const state = it.next();
  return state.value as Result<InferOk<R>, InferYieldErr<Yield> | InferErr<R>>;
}

const result = gen(function* () {
  const a = yield* getA();
  const b = yield* getB();
  return ok(a + b);
});

console.log(result);
```
TypeScript infers:
```ts
Result<number, ErrorA | ErrorB>
```
But why generators (and not something else)?
I think there are 3 main reasons why generators are used:
1. They separate control flow from logic
2. They allow early exit without `return`
3. They integrate with TypeScript's type system
Let’s explore each one of them.
1. They separate control flow from logic
You write:
```ts
const a = yield* getA();
```
Instead of:
```ts
const r = getA();
if (r.type === "Err") return r;
```
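Spelled out fully, the manual style might look like the following sketch (plain string tags and simplified Result shapes, assumptions for illustration rather than any library's API):

```ts
type Ok<T> = { type: "Ok"; value: T };
type Err<E> = { type: "Err"; error: E };
type Result<T, E> = Ok<T> | Err<E>;

// Hypothetical data sources for the example.
const getA = (): Result<number, "ErrorA"> => ({ type: "Ok", value: 1 });
const getB = (): Result<number, "ErrorB"> => ({ type: "Err", error: "ErrorB" });

// Every call needs its own early-return check, and the combined error
// union "ErrorA" | "ErrorB" has to be written out by hand.
function addBoth(): Result<number, "ErrorA" | "ErrorB"> {
  const ra = getA();
  if (ra.type === "Err") return ra;
  const rb = getB();
  if (rb.type === "Err") return rb;
  return { type: "Ok", value: ra.value + rb.value };
}

const outcome = addBoth();
console.log(outcome); // { type: "Err", error: "ErrorB" }
```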
2. They allow early exit without return
No exceptions. No branching. Just:
- `yield` → pause
- don't resume → exit
- cleaner code (no `if` statements on every call)
3. They integrate with TypeScript’s type system
This is the killer feature and the main reason TypeScript loves this pattern. Although I think TypeScript itself should provide a more native way to handle errors, this pattern achieves it using existing features of the language's type system. Mostly because:
- all `yield` types are collected into a union → we get errors as values, with the error and success cases in the same type
- TypeScript can extract the error types automatically
- we leverage the inference TypeScript provides for generator return types to get the result type
Generators are used for type-safe error handling in TypeScript not because they’re convenient but because they uniquely combine:
- lazy execution
- interruptible control flow
- composable delegation (`yield*`)
- type-level inference over unions
That combination is what enables:
clean, linear, type-safe error handling with zero runtime branching.
If you understand this, you don't just understand a library; you understand why multiple libraries independently converged on the same pattern: generators.
One small disclaimer: all the examples and explanations here are simpler than the real implementations in these libraries, which handle more cases, such as asynchronous operations, multiple yields, error mapping, result chaining, and so on. But the core concept is the same.
Hope you enjoyed this article. If you have any feedback, please don’t hesitate to contact me. See you next time!