# Streaming RPC

If a backend export is an `async function*`, the frontend gets a `StreamCall<Y, R>` handle: awaitable (resolves to the generator's return value) and async-iterable (yields each chunk). Cancellation propagates end-to-end via `iterator.return()`.
## Basic shape
**backend/main.ts**

```ts
export async function* processFiles(paths: string[]) {
  let ok = 0;
  for (const [i, path] of paths.entries()) {
    await doWork(path);
    ok++;
    yield { path, progress: (i + 1) / paths.length };
  }
  return { ok, failed: paths.length - ok };
}
```

**src/main.ts**
```ts
const stream = api.processFiles(["a.txt", "b.txt"]);
for await (const chunk of stream) {
  render(chunk.progress);
}
const summary = await stream; // { ok, failed }
// or early-stop: await stream.cancel();
```

## StreamCall<Y, R>
```ts
type StreamCall<Y, R> = AsyncIterable<Y> & PromiseLike<R> & {
  cancel(): Promise<void>;
};
```

- `[Symbol.asyncIterator]()`: yields `Y` chunks.
- `.then(…)` / `await`: resolves to the final `R` return value.
- `.cancel()`: sends a cancel signal; the iterator throws and the promise rejects.
## Marshalling
- Each `yield` sends one JSON line (full mode) or one `__tynd_yield__` native call (lite).
- The final `return` value is delivered as a single resolve on the promise side.
- Values must be JSON-serializable (same constraints as regular RPC).
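As a concrete illustration of the full (JSON-line) mode, the messages could be encoded like the sketch below. The field names (`type`, `id`, `value`) are assumptions for illustration, not the library's actual message schema:

```ts
// Illustrative wire shapes (assumed field names, not the real protocol):
type YieldMsg  = { type: "yield";  id: number; value: unknown };
type ReturnMsg = { type: "return"; id: number; value: unknown };

// Full mode: one JSON line per message. Values must survive a JSON
// round-trip, which is why only JSON-serializable values are allowed.
const encode = (msg: YieldMsg | ReturnMsg): string => JSON.stringify(msg) + "\n";
const decode = (line: string): YieldMsg | ReturnMsg => JSON.parse(line.trim());
```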
## Flow control
Three mechanisms keep streaming safe at arbitrary yield rates:
- **Per-stream credit**: the backend starts with 64 credits; every yield decrements; at 0 the generator awaits. The frontend replenishes with `{ type: "ack", id, n: 32 }` per 32 consumed chunks.
- **Yield batching**: Rust buffers yields and flushes every 10 ms or at 64 items, with one `evaluate_script` per flush per webview.
- **Cleanup on window close**: the Rust host auto-cancels every active stream that originated in a closed window.
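The credit mechanism can be sketched as a small gate on the backend side. The numbers (64 initial credits, acks of 32) come from the description above; the `CreditGate` class and its plumbing are illustrative, not the library's code:

```ts
// Hypothetical sketch of the per-stream credit scheme.
class CreditGate {
  private credits = 64; // initial budget per stream
  private waiters: Array<() => void> = [];

  // Backend side: called before each yield; parks the generator at 0 credits.
  async take(): Promise<void> {
    if (this.credits > 0) {
      this.credits--;
      return;
    }
    await new Promise<void>((resume) => this.waiters.push(resume));
  }

  // Called when an ack { type: "ack", id, n } arrives from the frontend.
  replenish(n: number): void {
    this.credits += n;
    while (this.credits > 0 && this.waiters.length > 0) {
      this.credits--;           // each woken yield consumes one credit
      this.waiters.shift()!();
    }
  }
}
```

The effect is that a generator yielding faster than the frontend consumes simply suspends at `take()` until the next ack arrives, so unbounded buffering never happens.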
See the Streaming RPC guide for details.
## Cancellation
```ts
const stream = api.processFiles(hugeBatch);
setTimeout(() => void stream.cancel(), 3000);
```

Cancellation also triggers automatically on:
- `break` out of `for await` → implicit `iterator.return()` → cancel signal.
- Originating window closes → Rust cancels every stream in `dispatch::call_labels()` for that label.
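The `break` path can be seen with a plain async generator: leaving a `for await` loop invokes the iterator's `return()`, which runs the generator's `finally` block, and that is the hook where the cancel signal would be sent. A self-contained sketch:

```ts
let cancelled = false;

// An endless producer; its finally block runs when the consumer breaks,
// because break triggers an implicit iterator.return().
async function* ticks(): AsyncGenerator<number> {
  try {
    for (let i = 0; ; i++) yield i;
  } finally {
    cancelled = true; // in the real system: send the cancel signal here
  }
}

async function consume(): Promise<number[]> {
  const seen: number[] = [];
  for await (const n of ticks()) {
    seen.push(n);
    if (n >= 2) break; // → implicit iterator.return() → finally block
  }
  return seen;
}
```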
## Error handling
```ts
try {
  for await (const n of api.flaky()) {
    console.log(n);
  }
} catch (err) {
  console.error(err.message);
}
```

Errors thrown in the generator propagate into both `for await` and `await stream`.
## Typical patterns
- **Progress + summary**: yield progress, return a summary.
- **Cancellable long task**: bind a cancel button to `stream.cancel()`.
- **LLM tokens**: yield decoded fragments from `res.body.getReader()`.
Examples: Streaming RPC guide.
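For the LLM-token pattern, a backend export could look like the following sketch. It takes the byte stream as a parameter (in practice that would be `res.body` from a `fetch` call) and yields decoded text fragments; the function name is illustrative:

```ts
// Sketch of the "LLM tokens" pattern: decode a byte stream into text
// fragments and yield each one as it arrives.
export async function* textChunks(
  body: ReadableStream<Uint8Array>,
): AsyncGenerator<string, void> {
  const reader = body.getReader();
  const decoder = new TextDecoder();
  try {
    for (;;) {
      const { done, value } = await reader.read();
      if (done) return;
      // stream: true keeps multi-byte characters intact across chunk edges
      yield decoder.decode(value, { stream: true });
    }
  } finally {
    reader.releaseLock(); // also runs on cancel (iterator.return())
  }
}
```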
## Limitations
- No transferable objects; values are JSON-cloned each time.
- Generator back-pressure is credit-based — not propagated to the producer’s I/O source (it’s a generator, not a stream).
- For multi-MB binary chunks, yield handles/IDs over JSON and fetch the bytes separately via binary IPC.
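The last point can be sketched as follows; `ChunkHandle`, `fetchBytes`, and the field names are hypothetical stand-ins for whatever ID scheme and binary IPC call the application uses:

```ts
// Hypothetical handle-over-JSON pattern for large binary chunks: the
// stream yields small JSON-friendly descriptors, and the consumer fetches
// the actual bytes out of band via a separate binary IPC call.
type ChunkHandle = { id: string; byteLength: number };

async function consumeBinary(
  stream: AsyncIterable<ChunkHandle>,
  fetchBytes: (id: string) => Promise<Uint8Array>, // assumed binary IPC
): Promise<number> {
  let total = 0;
  for await (const handle of stream) {
    const bytes = await fetchBytes(handle.id); // bytes never touch JSON
    total += bytes.byteLength;
  }
  return total;
}
```

This keeps each yielded value tiny and JSON-cloneable while the heavy payloads travel over a channel built for them.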