We're dogfooding OpenWalrus as our own community bot — join us and help build it on Discord →

OpenWalrus

Agents without the bloat.

8 MB binary: use what you need, skip what you don't.

curl -sSL https://openwalrus.xyz/install | sh

macOS and Linux. Apple Silicon, Intel, ARM.

Install. Attach. Done.


One command to extend.

Gateways, memory, search — install from the hub. Same protocol as built-in extensions.

$ walrus hub install openwalrus/telegram

Swap any piece. Break nothing.

The daemon — 8 MB
Agent lifecycle. Tool dispatch. Sessions.

Extensions — swap, remove, or write your own:
  Inference: any provider
  Memory: graph + vector
  Search: meta engine
  Telegram: gateway
  Discord: gateway
  Yours: 50 lines

Same protocol as official extensions. Zero restarts to swap.

What you get. What you don't.

Any model provider

OpenAI, Claude, DeepSeek, Ollama. Switch in one line of TOML.
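As an illustrative sketch of the "one line of TOML" claim — the key names below are assumptions for illustration, not taken from the OpenWalrus docs — switching providers could look like editing a single line in the daemon's config:

```toml
# Hypothetical config keys; consult the OpenWalrus docs for the real schema.
[inference]
provider = "ollama"   # swap to "openai", "claude", or "deepseek"
model = "llama3"
```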

Skills and MCP

Native support for both. Install from the hub in one command, or drop in your own.

Extension ecosystem

Memory, gateways, search — all community extensions. Same protocol as built-in extensions. Write yours in 50 lines.
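To make the 50-line claim concrete, here is a minimal sketch of an extension's shape in Rust. The actual OpenWalrus wire protocol is not documented on this page, so this assumes a hypothetical newline-delimited request/response exchange over stdio; it only illustrates how small such a process can be, not the real interface.

```rust
use std::io::{self, BufRead, Write};

// Hypothetical stdio extension skeleton: assumes the daemon sends one
// request per line and reads one response per line. The real OpenWalrus
// extension protocol may differ.
fn handle(request: &str) -> String {
    match request.trim() {
        "ping" => "pong".to_string(),     // liveness check
        msg => format!("echo:{msg}"),     // placeholder behavior
    }
}

fn main() {
    let stdin = io::stdin();
    let mut stdout = io::stdout();
    for line in stdin.lock().lines() {
        let line = line.expect("read from daemon");
        writeln!(stdout, "{}", handle(&line)).expect("write to daemon");
        stdout.flush().expect("flush");
    }
}
```

Because the extension is just a separate process talking over a pipe, the daemon can attach or detach it without restarting.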

No cloud dependency

No vendor lock-in

Frequently asked questions

What is OpenWalrus?

An open-source, composable agent runtime written in Rust. One 8 MB daemon — inference is built in; memory, search, and gateways are community extensions. Run one binary, swap any piece.