Exploration · Hono backend / streaming

Three ways to do SSE streaming in Hono

Compare three approaches to SSE streaming in a Hono backend for a sandboxed agent integration — focus on backpressure, cancellation, and fitting the existing Hono middleware chain.

01

Hand-rolled ReadableStream

Build the SSE stream by hand and return it from the handler with the right headers.

app.get('/chat/stream', (c) => {
  const enc = new TextEncoder();
  let interval: ReturnType<typeof setInterval>;
  const stream = new ReadableStream({
    start(controller) {
      const send = (data: string) =>
        controller.enqueue(enc.encode(`data: ${data}\n\n`));

      interval = setInterval(() => send(JSON.stringify({ tok: 'hi' })), 50);
      // Client disconnect: stop the timer and end the stream.
      c.req.raw.signal.addEventListener('abort', () => {
        clearInterval(interval);
        controller.close();
      }, { once: true });
    },
    // Consumer-side cancellation also needs cleanup, or the timer leaks.
    cancel() {
      clearInterval(interval);
    },
  });
  return new Response(stream, {
    headers: {
      'content-type': 'text/event-stream',
      'cache-control': 'no-cache',
    },
  });
});
Pro:
- Zero dependencies
- Total control over backpressure
- Native to Web Streams; runs anywhere Hono runs

Con:
- Manual SSE framing; easy to forget the double newline
- Cancellation is wired by hand on every route
- Easy to leak intervals/timers if abort is missed
Bundle: +0 kb
Backpressure: manual
Cancel: via req.raw.signal
Edge runtime: yes
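The framing pitfall above (forgetting the blank line between frames) is easy to fence off with a tiny pure helper; a minimal sketch (the `sseFrame` name is ours, not a library API):

```typescript
// Minimal SSE frame builder (hypothetical helper, not a library API).
// A frame needs one `data:` line per payload line plus a trailing blank
// line; omitting the double newline silently breaks client-side parsing.
function sseFrame(data: string, event?: string): string {
  const lines = data.split('\n').map((l) => `data: ${l}`);
  const head = event ? [`event: ${event}`] : [];
  return [...head, ...lines].join('\n') + '\n\n';
}
```

In the handler above, the enqueue line becomes `controller.enqueue(enc.encode(sseFrame(json, 'token')))`, and multi-line payloads are framed correctly for free.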
02

Hono's streamSSE helper

Hono ships a helper that handles SSE framing and abort signal wiring for you.

import { streamSSE } from 'hono/streaming';

app.get('/chat/stream', (c) => {
  return streamSSE(c, async (stream) => {
    stream.onAbort(() => console.log('client gone'));
    while (!stream.aborted) {
      await stream.writeSSE({
        data: JSON.stringify({ tok: 'hi' }),
        event: 'token',
      });
      await stream.sleep(50);
    }
  });
});
Pro:
- SSE framing handled; no manual data: prefix or double newline
- onAbort + stream.aborted are first-class
- Abort-checked loop exits cleanly, so timers don't leak

Con:
- One more abstraction layer in stack traces
- Tied to Hono; locks you in slightly
- Helper is fairly young; a few sharp edges remain
Bundle: +0 kb (in Hono)
Backpressure: via await write
Cancel: built-in
Edge runtime: yes
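One way to blunt the lock-in: keep the agent loop independent of Hono by yielding plain `{ event, data }` payloads from any async iterable, and only touch `writeSSE` at the route. A minimal sketch (`toSSE` and the `token`/`done` event names are our own conventions, not part of Hono's API):

```typescript
// Hypothetical adapter: turn any async iterable of tokens into the
// payload objects that stream.writeSSE() accepts. The agent loop never
// sees Hono, so it stays unit-testable and portable across frameworks.
type SSEMessage = { data: string; event?: string };

async function* toSSE(tokens: AsyncIterable<string>): AsyncGenerator<SSEMessage> {
  for await (const tok of tokens) {
    yield { event: 'token', data: JSON.stringify({ tok }) };
  }
  // Terminal marker so clients can tell a normal end from a dropped connection.
  yield { event: 'done', data: '[DONE]' };
}
```

In the handler this reduces to `for await (const m of toSSE(agentTokens())) await stream.writeSSE(m);`, with the abort check staying in the route.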
03

SDK pass-through

Stream directly from the Anthropic SDK's messages.stream() response into Hono's response — no intermediate buffer.

import Anthropic from '@anthropic-ai/sdk';
import { streamSSE } from 'hono/streaming';

const client = new Anthropic();

app.post('/chat/stream', (c) => {
  return streamSSE(c, async (stream) => {
    // messages.stream() returns a MessageStream synchronously; no await needed.
    const upstream = client.messages.stream({ /* … */ });
    // Abort the upstream API call the moment the client disconnects,
    // even if we are mid-await on the next event.
    stream.onAbort(() => upstream.controller.abort());
    for await (const ev of upstream) {
      if (stream.aborted) return;
      await stream.writeSSE({ event: ev.type, data: JSON.stringify(ev) });
    }
  });
});
Pro:
- Backpressure propagates end-to-end (no buffer)
- Client cancellation correctly aborts the upstream API call
- Minimal code; single source of truth for events

Con:
- Couples your SSE schema to the SDK's event shape
- Re-emitting structured events to the browser is your job
- Harder to inject your own events (cost, audit, custom)
Bundle: SDK already imported
Backpressure: end-to-end
Cancel: propagates upstream
Edge runtime: yes (with fetch transport)
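Because this route is a POST, the browser's EventSource API (GET-only) cannot consume it; the client needs fetch() plus manual frame parsing. A minimal parser sketch (`parseSSE` is a hypothetical helper that assumes complete frames per chunk and skips cross-chunk buffering):

```typescript
// Hypothetical client-side parser for the frames the route above emits.
// Handles only the `event:` and `data:` fields; a production parser must
// also buffer partial frames that straddle chunk boundaries.
type SSEEvent = { event: string; data: string };

function parseSSE(chunk: string): SSEEvent[] {
  return chunk
    .split('\n\n')
    .filter((block) => block.trim() !== '')
    .map((block) => {
      let event = 'message'; // SSE default when no event: field is present
      const data: string[] = [];
      for (const line of block.split('\n')) {
        if (line.startsWith('event: ')) event = line.slice(7);
        else if (line.startsWith('data: ')) data.push(line.slice(6));
      }
      return { event, data: data.join('\n') };
    });
}
```

Pair it with `fetch('/chat/stream', { method: 'POST', signal })` and a `TextDecoder` over `res.body.getReader()`; that same `signal` is what propagates cancellation back through the whole chain above.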