Every model you run.
Local and cloud.
In one menubar.
CrabDash is the personal LLM gateway that lives in your menubar. Route cloud and local models through one port, watch every request stream in real time, see what you spent — without sending a single byte through someone else’s server.

The word people reach for first is gateway, and it’s accurate but not vivid. CrabDash is an OpenAI-compatible proxy that lives in your menubar, routes every request through crabllm, and shows you — in real time — what just happened.
Plug your SDK into one port. The gateway routes to Anthropic, OpenAI, Gemini, xAI, Groq, or local MLX models on Apple Silicon — and shows you every request as it streams.
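A minimal sketch of what "one port" means in practice, assuming CrabDash exposes the standard OpenAI chat-completions format on localhost. The port (`8080`) and model name are illustrative placeholders, not CrabDash's documented defaults:

```python
import json

# Illustrative endpoint: CrabDash's actual default port may differ.
GATEWAY_URL = "http://127.0.0.1:8080/v1/chat/completions"

def build_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-compatible chat request. The gateway picks the
    provider -- Anthropic, OpenAI, Gemini, xAI, Groq, or local MLX --
    from the model name; the calling code never changes."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": True,  # stream tokens so the live request tail updates
    }

# Serialize exactly as you would for an HTTP POST to the gateway.
body = json.dumps(build_request("claude-sonnet-4", "Hello")).encode()
```

Point any OpenAI-compatible SDK's base URL at that address and every request flows through the gateway, regardless of which provider ultimately answers.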
Free does live. Pro does history.
Free forever. Pro when you’re ready.
The full gateway experience. No signup, no credit card.
- Gateway routing · all providers · one port
- Live request tail · menubar stats
- Quick Chat (⌘K)
- Local MLX models on Apple Silicon
- Auto-failover and retries
- 7-day usage analytics window
One-time license, Charles-style. Launch price lands with the public beta.
- Persistent request history · search · export
- Session recording & replay across models
- 30 / 90 / 365-day cost breakdowns
- Routing rules · per-app and per-domain (TUN)
- Budget caps with desktop notifications
- MCP gateway for tool-call routing
Answers, before you ask.
How is this different from LiteLLM or Bifrost?
CrabDash ships a native macOS app, not just a proxy binary. You get a menubar, live request tail, local MLX models, Quick Chat, and budget tracking out of the box. The routing engine (crabllm) is open source — CrabDash is the dashboard and glue that makes it feel like a Mac app.
What data leaves my machine?
Nothing. The gateway runs locally on 127.0.0.1, your API keys stay in the macOS Keychain, and request history is stored in a local SQLite database. CrabDash has no telemetry, no login, no backend. If a request goes to Anthropic or OpenAI, it's because you sent it there.
Does it actually run models locally?
Yes — on Apple Silicon via MLX. Pull a model from the menubar, route specific apps or domains to it, and it answers through the same gateway port as cloud models. Your code never needs to know which is which.
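To make "your code never needs to know which is which" concrete, here is a hedged sketch: two requests that differ only in the `model` string, one aimed at a cloud provider and one at a local MLX model. Both model names are illustrative, not CrabDash's actual catalog:

```python
def chat_payload(model: str, prompt: str) -> dict:
    """Same OpenAI-compatible schema whether the model is cloud or local."""
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}

cloud = chat_payload("gpt-4o", "Summarize this log")
local = chat_payload("mlx-community/Llama-3.2-3B-Instruct-4bit",
                     "Summarize this log")

# Apart from the model name, the payloads are identical -- and both would
# be POSTed to the same local gateway port.
assert {k: v for k, v in cloud.items() if k != "model"} == \
       {k: v for k, v in local.items() if k != "model"}
```

Swapping between local and cloud is a one-string change, which is what lets per-app routing rules redirect traffic without touching application code.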
When does Pro ship?
Public beta in Q3 2026. Launch price will be a one-time license, Charles Proxy / Surge style — buy once, no subscription. Join the waitlist to get pricing and a launch discount.
Is CrabDash open source?
The routing engine (crabllm) and the CrabTalk daemon are open source. The CrabDash macOS app is closed-source — that's what funds the open-source work beneath it.
Open the menubar. See everything.
One signed Universal binary. Public beta lands in Q3 2026. Star the repo to know the moment releases drop, or pair it with crabllm today and build your own dashboard on top.