Introducing Queues for Rivet Actors

Today we’re releasing Queues for Rivet Actors: per-actor durable queues with a programmable run handler, also known as the actor mailbox pattern.

  • Durable and ordered: messages persist through sleep, crashes, and deploys, processed one at a time
  • Handles traffic spikes: absorbs bursts of messages without dropping any
  • Request/response: callers can await a typed response from queued work
  • Programmable run handler: run is a long-lived async function you control, not a callback. Selectively consume named queues, race messages against each other, cancel work mid-flight.
  • Pairs with workflows: use queues as input to durable, replayable workflows
  • Built into the actor: queues, state, SQLite, events, and workflows, all in one place. No external broker to provision.

Show Me The Code

Define typed queues, process them in a run handler, and send messages from a client.
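The client code below talks to a counter actor. As a rough sketch, the actor side could look like this; the `actor`, `queue`, and `setup` imports and `ctx.queue.iter()` appear later in this post, but the exact shape of the run context and the `message.complete(...)` call for replying to waiting callers are assumptions:

```typescript
import { actor, queue, setup } from "rivetkit";

const counter = actor({
  state: { value: 0 },
  queues: {
    // Typed message body for the "increment" queue
    increment: queue<{ amount: number }>(),
  },
  run: async (ctx) => {
    // Long-lived loop: messages are durable and processed one at a time
    for await (const message of ctx.queue.iter()) {
      ctx.state.value += message.body.amount;
      // Reply to callers that passed { wait: true } (assumed API)
      await message.complete({ value: ctx.state.value });
    }
  },
});

export const registry = setup({ use: { counter } });
```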

Sending messages from a client:

import { createClient } from "rivetkit/client";
import type { registry } from "./actors";

const client = createClient<typeof registry>();
const handle = client.counter.getOrCreate(["main"]);

// Fire-and-forget
await handle.send("increment", { amount: 1 });

// Wait for a typed response
const result = await handle.send(
  "increment",
  { amount: 5 },
  { wait: true, timeout: 5_000 },
);

if (result.status === "completed") {
  console.log(result.response); // { value: 6 }
} else if (result.status === "timedOut") {
  console.log("timed out");
}

The Run Handler

The run handler is the heart of an actor. It’s a long-lived async function that owns the actor’s main processing. Instead of registering callbacks, you write it yourself: iterate queues, sleep between ticks, race signals against each other. You control exactly how and when messages are consumed.
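Stripped of the runtime, the mailbox pattern itself fits in a few lines of plain TypeScript. This is an illustration of the pattern, not RivetKit internals: a mailbox buffers messages, and a hand-written run loop consumes them one at a time, in order.

```typescript
// Minimal, self-contained sketch of the mailbox pattern (plain TypeScript,
// not the RivetKit API). Messages are buffered until the run loop asks for
// them, so producers and the consumer are fully decoupled.
class Mailbox<T> {
  private buffer: T[] = [];
  private waiters: ((value: T) => void)[] = [];

  send(message: T): void {
    const waiter = this.waiters.shift();
    if (waiter) waiter(message); // a consumer is already waiting
    else this.buffer.push(message); // otherwise buffer for later
  }

  next(): Promise<T> {
    const buffered = this.buffer.shift();
    if (buffered !== undefined) return Promise.resolve(buffered);
    return new Promise((resolve) => this.waiters.push(resolve));
  }
}

// The run handler owns its consumption loop; here it drains three messages.
async function run(mailbox: Mailbox<number>, onMessage: (n: number) => void) {
  for (let i = 0; i < 3; i++) {
    onMessage(await mailbox.next());
  }
}

const mailbox = new Mailbox<number>();
const seen: number[] = [];
const done = run(mailbox, (n) => seen.push(n));
mailbox.send(1);
mailbox.send(2);
mailbox.send(3);
await done;
console.log(seen); // seen is now [1, 2, 3]
```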

Queues for Agents

Queues are a natural fit for AI agents. Use a prompt queue for incoming messages, a stop queue for cancellation, and SQLite for persistent chat history. The run handler processes messages durably, so the agent survives crashes and picks up where it left off.
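A sketch of that shape, reusing the `actor`/`queue` API shown elsewhere in this post. The per-queue `next()` accessor and the `Promise.race` between queues are illustrative assumptions, not the documented RivetKit API:

```typescript
import { actor, queue } from "rivetkit";

// Illustrative sketch only: per-queue `ctx.queues.<name>.next()` is an
// assumed accessor used to show selective consumption.
const agent = actor({
  state: { history: [] as string[] },
  queues: {
    prompt: queue<{ text: string }>(),
    stop: queue<{ reason?: string }>(),
  },
  run: async (ctx) => {
    while (true) {
      // Race an incoming prompt against a stop request
      const next = await Promise.race([
        ctx.queues.prompt.next().then((m) => ({ kind: "prompt" as const, m })),
        ctx.queues.stop.next().then(() => ({ kind: "stop" as const })),
      ]);
      if (next.kind === "stop") break;
      ctx.state.history.push(next.m.body.text);
      // ... call the model, persist chat history to SQLite, reply
    }
  },
});
```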

Request/Response

Queues support three delivery modes, depending on what the caller needs:

  • Fire-and-forget: send and move on
  • Completable: send and wait for acknowledgment
  • Request/response: send and await a typed reply
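From the client, the three modes might look like the sketch below. The fire-and-forget and request/response calls mirror the earlier example; the completable form is shown with the same `wait` option, and the precise flag for acknowledgment-only delivery is an assumption:

```typescript
import { createClient } from "rivetkit/client";
import type { registry } from "./actors";

const client = createClient<typeof registry>();
const handle = client.counter.getOrCreate(["main"]);

// Fire-and-forget: resolves once the message is enqueued
await handle.send("increment", { amount: 1 });

// Completable: wait until the actor has finished processing the message
// (shown with the same `wait` option; the exact acknowledgment-only flag
// is an assumption)
const ack = await handle.send("increment", { amount: 1 }, { wait: true });

// Request/response: wait for a typed reply, bounded by a timeout
const reply = await handle.send(
  "increment",
  { amount: 5 },
  { wait: true, timeout: 5_000 },
);
```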

Pairs with Workflows

Feed queue messages into durable workflows. Each workflow step is checkpointed, so crashes pick up where they left off. Combine queues with sleep, join, race, rollback, and human-in-the-loop patterns.

import { actor, queue, setup } from "rivetkit";
import { workflow } from "rivetkit/workflow";

const worker = actor({
  state: { processed: 0 },
  queues: {
    orders: queue<{ orderId: string }>(),
  },
  // Workflow replays safely on crash or restart
  run: workflow(async (ctx) => {
    for await (const message of ctx.queue.iter()) {
      await ctx.step("charge", async () =>
        charge(message.body.orderId),
      );
      await ctx.step("fulfill", async () =>
        fulfill(message.body.orderId),
      );
      await ctx.step("notify", async () =>
        notify(message.body.orderId),
      );
    }
  }),
});

async function charge(orderId: string) { /* ... */ }
async function fulfill(orderId: string) { /* ... */ }
async function notify(orderId: string) { /* ... */ }

export const registry = setup({ use: { worker } });

Built into the Actor

Queues are part of the actor, not a separate service. The same actor has state, SQLite, events, and workflows, all built in. No external broker to provision, no connection strings, no infrastructure to manage.

Plus everything else that comes with Rivet Actors: scale to millions of instances, scale to zero, TypeScript-native, deploy on Cloudflare Workers, Vercel, Railway, or your own infra.

Get Started

Queues are available today in RivetKit.

npm install rivetkit
import { queue } from "rivetkit";

Nathan Flurry

Co-founder & CTO