Open source · MIT · TypeScript

Everything is a
conversation.

Verbum gives every participant in your system (AI models, terminals, MCP servers, APIs, humans) a seat at the table. They send messages. They receive messages. The rest is routing.

$ npm install verbum-ai
MIT · TypeScript · Node 20+

The right abstraction
changes everything downstream.

Most frameworks model agent systems as function calls that happen to use language. You invoke tools. You query models. You stitch results together. It works until you need to observe it, debug it, replay it, or hand control to a human.

Verbum starts from a different premise. Every participant is an Actor. Every interaction is a Message. Your terminal is not a tool; it is in the conversation. Your MCP server is not a plugin; it is a participant. Your human is not a callback; they are first-class.

What if every part of your system could talk to every other part in a language you could actually read?

The result is a system that is observable by construction, replayable by design, and composable without ceremony.
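That premise fits in one small type. The shape below is inferred from the code example later on this page; the real verbum-ai interface may carry more fields.

```typescript
// Sketch of the universal message envelope. Field names mirror the
// router.send() example on this page; the full interface is an assumption.
interface Message {
  from: string;                 // sender actor id
  to: string;                   // recipient actor id
  role: "user" | "assistant" | "system" | "tool";
  conversationId: string;       // groups messages into one thread
  content: { type: "text"; text: string };
}

// A shell result and a model reply travel in the exact same envelope.
const fromShell: Message = {
  from: "shell",
  to: "claude",
  role: "tool",
  conversationId: "demo",
  content: { type: "text", text: "128 tests passed, 0 failed" },
};

const fromModel: Message = {
  from: "claude",
  to: "user",
  role: "assistant",
  conversationId: "demo",
  content: { type: "text", text: "All green. Safe to ship." },
};
```

Because every hop is the same readable structure, observing, replaying, and forking a run are operations on a list of these objects.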

The Primitives

Seven actor types.
One universal interface.

ModelActor

Any LLM, wrapped uniformly. One line to swap providers.

  • Anthropic (Claude)
  • OpenAI / GPT
  • OpenAI-compatible
  • Scripted / Mock
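The "one line to swap providers" claim can be pictured as a config diff. Only `anthropicAdapter` appears on this page; `openaiAdapter` and the local stub types here are assumptions for illustration.

```typescript
// Hypothetical adapter factories -- only anthropicAdapter is shown on
// this page; openaiAdapter is assumed for illustration.
type Adapter = { name: string };
const anthropicAdapter = (): Adapter => ({ name: "anthropic" });
const openaiAdapter = (): Adapter => ({ name: "openai" });

interface ModelConfig {
  id: string;
  provider: string;
  model: string;
  adapter: Adapter;
}

const claude: ModelConfig = {
  id: "assistant",
  provider: "anthropic",
  model: "claude-sonnet-4-20250514",
  adapter: anthropicAdapter(),
};

// Swapping providers touches only the provider-specific fields; the
// actor keeps its id and its place in the conversation.
const gpt: ModelConfig = {
  ...claude,
  provider: "openai",
  model: "gpt-4o",
  adapter: openaiAdapter(),
};
```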

ProcessActor

A persistent shell session as a first-class participant. Any CLI, any subprocess, any REPL.

  • bash / zsh / fish
  • Python / Node REPL
  • Any binary CLI
  • Docker exec

MCPActor

Speaks the Model Context Protocol natively. Connect to any MCP server over stdio and auto-discover its tools.

  • stdio servers
  • Auto capability registry
  • JSON-RPC transport
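Under the hood, MCP tool discovery is a JSON-RPC 2.0 exchange over the server's stdio, one newline-delimited JSON object per message. The `tools/list` method below is standard MCP; how verbum-ai wraps the exchange is not shown on this page.

```typescript
// An MCP tool-discovery request as it travels over stdio.
// "tools/list" is a standard MCP method; the framing is
// newline-delimited JSON-RPC 2.0.
const listTools = {
  jsonrpc: "2.0" as const,
  id: 1,
  method: "tools/list",
};

const wire = JSON.stringify(listTools) + "\n";

// A typical reply carries the server's tool catalog -- the raw material
// for an auto capability registry.
const reply = {
  jsonrpc: "2.0" as const,
  id: 1,
  result: {
    tools: [{ name: "read_file", description: "Read a file from disk" }],
  },
};
```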

ToolActor

Deterministic functions and API wrappers. Tools that can also initiate conversations.

  • REST / GraphQL APIs
  • Pure functions
  • Webhooks

HumanActor

A human is just another actor. Pause any flow, inject a response, resume. Native to the model.

  • stdin / CLI
  • Custom transports
  • Pluggable interface

MemoryActor

Persistent context that participates in conversation. Ask it anything; it responds like any other actor.

  • In-memory
  • Keyword search
  • Conversation-scoped
  • Pluggable backends

Router

The runtime. It dispatches, records every hop, and turns the whole run into a readable graph.

  • Recursive dispatch
  • Replayable runs
  • Forkable history
  • Observable by default
Show me the code

Same API. Every actor.

agent.ts
import { Router, ModelActor, ProcessActor, MemoryActor, anthropicAdapter } from "verbum-ai"

const router = new Router()

router.register(new ModelActor({
  id: "claude",
  provider: "anthropic",
  model: "claude-sonnet-4-20250514",
  adapter: anthropicAdapter()
}))

router.register(new ProcessActor({ id: "shell" }))
router.register(new MemoryActor({ id: "memory" }))

await router.send({
  from: "user",
  to: "claude",
  role: "user",
  conversationId: "demo",
  content: { type: "text", text: "Run the tests and tell me if we can ship." }
})
Why it matters

Five things you get for free
from the right abstraction.

01

Observable by construction

Every interaction (model to shell, agent to MCP server, human to model) is a structured, readable message. Not a log line. Not a trace ID.

02

Replayable and forkable

Replay any run. Fork from any message. Explore what would have happened if the model had responded differently. Debug with time travel.
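Because a run is just an ordered list of messages, forking is a slice. A sketch, assuming history is exposed as an array:

```typescript
interface Turn { from: string; text: string }

// A recorded run: every hop in order.
const run: Turn[] = [
  { from: "user",   text: "Run the tests." },
  { from: "claude", text: "Running via shell..." },
  { from: "shell",  text: "2 failures" },
  { from: "claude", text: "Two tests fail; do not ship." },
];

// Fork from message index i: keep the prefix, append an alternative.
function fork(history: Turn[], i: number, alt: Turn): Turn[] {
  return [...history.slice(0, i + 1), alt];
}

// "What would have happened if the shell had reported success?"
const alternate = fork(run, 1, { from: "shell", text: "128 passed" });
```

Replaying the forked prefix through the router is what turns this slice into time-travel debugging.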

03

Portable context

Move a conversation between providers mid-flight. Start with Claude, hand off to a local model, resume with GPT. The context belongs to no one.
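Portability falls out of the neutral message shape: the same history can be projected into different provider payloads. The projections below follow the public OpenAI and Anthropic chat formats; how verbum-ai performs the hand-off is an assumption.

```typescript
type Neutral = { role: "system" | "user" | "assistant"; text: string };

// Provider-neutral history -- it belongs to no one.
const history: Neutral[] = [
  { role: "system",    text: "You are a release manager." },
  { role: "user",      text: "Can we ship?" },
  { role: "assistant", text: "Checking the test run..." },
];

// OpenAI-style chat payload: the system message rides inline.
const toOpenAI = (h: Neutral[]) =>
  h.map((m) => ({ role: m.role, content: m.text }));

// Anthropic-style payload: the system prompt is a separate field.
const toAnthropic = (h: Neutral[]) => ({
  system: h.filter((m) => m.role === "system").map((m) => m.text).join("\n"),
  messages: h
    .filter((m) => m.role !== "system")
    .map((m) => ({ role: m.role, content: m.text })),
});
```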

04

Composable without ceremony

An agent that uses a shell, an MCP server, and a memory store is three registered actors and a routing rule. No pipelines to wire. No abstractions to fight.

05

Middleware out of the box

Logging, cost tracking, and rate limiting ship as built-in middleware. Write your own with a single function that wraps the dispatch chain.
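"A single function that wraps the dispatch chain" suggests the familiar onion pattern. The types below are an assumption about verbum-ai's middleware signature, not taken from its docs.

```typescript
interface Msg { from: string; to: string; text: string }
type Dispatch = (msg: Msg) => void;
// Assumed middleware shape: wrap the next dispatcher, return a new one.
type Middleware = (next: Dispatch) => Dispatch;

// A counting middleware: observe the message, then pass it along.
const counts = new Map<string, number>();
const countBySender: Middleware = (next) => (msg) => {
  counts.set(msg.from, (counts.get(msg.from) ?? 0) + 1);
  next(msg);
};

// Compose middleware around a base dispatcher, onion-style.
const compose = (base: Dispatch, ...mws: Middleware[]): Dispatch =>
  mws.reduceRight((acc, mw) => mw(acc), base);

const delivered: Msg[] = [];
const dispatch = compose((m) => delivered.push(m), countBySender);

dispatch({ from: "user", to: "claude", text: "hello" });
dispatch({ from: "user", to: "shell", text: "npm test" });
```

Cost tracking and rate limiting are the same shape: inspect the message, decide, then call (or skip) `next`.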

Verbum App · macOS

The god-view of every
conversation your system is having.

A native Mac app that ingests streams from Verbum agents, Claude Code, Codex CLI, terminals, and custom sources, then renders them as one unified conversation surface.

Master conversation

Your main conversation with Verbum is the master thread. Spawn focused side conversations when you need them.

Claude + Codex companion

Claude task files, Codex exec runs, and terminal sessions all stream into the same typed message feed.


Typed custom sources

Bring your own JSONL source and render it beside Claude Code and Codex without inventing a one-off UI.
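A custom source is just a stream of typed JSON lines. The event fields below are assumptions for illustration; this page does not document the schema the Mac app expects.

```typescript
// Hypothetical event shape for a custom JSONL source; the actual schema
// the Mac app expects is not documented on this page.
interface SourceEvent {
  source: string;
  role: "user" | "assistant" | "tool";
  text: string;
  ts: string;                        // ISO-8601 timestamp
}

const emit = (e: SourceEvent): string => JSON.stringify(e);

// One event per line -- ready to pipe into a file or socket.
const lines = [
  emit({ source: "ci", role: "tool", text: "build #142 green", ts: "2025-01-01T12:00:00Z" }),
  emit({ source: "ci", role: "tool", text: "deploy started", ts: "2025-01-01T12:01:00Z" }),
].join("\n");
```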

Claude Code · Companion
Task watcher plus a one-off prompt bridge through the local CLI.

Codex · Companion
Structured `codex exec --json` runs show up as typed messages with usage.

Terminals · Replacement
Tracked shell sessions let the demo show repo work and machine work side by side.

Custom Source · Extensible
Emit typed JSONL and Verbum renders it in the same master conversation model.
Download the Mac app DMG
Runs the live Verbum client for Claude Code, Codex, terminals, and search.
Get started

Stop wiring. Start talking.

MIT licensed. Built in public. The docs site is static. The Mac app is where the live integrations run.