# Gateway API
Use the local OpenAI-compatible and Anthropic-compatible endpoints from any app, SDK, or CLI tool.
CrabDash runs a CrabLLM gateway on `127.0.0.1:5635`. It exposes both an OpenAI-compatible and an Anthropic-compatible API — use whichever format your tools expect.
## Endpoints
| Format | Base URL |
|---|---|
| OpenAI-compatible | `http://127.0.0.1:5635/v1` |
| Anthropic-compatible | `http://127.0.0.1:5635/anthropic` |
No API key is required by default (the gateway is local-only). If you enable virtual keys in settings, pass them via the `Authorization` header as usual.
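When a virtual key is enabled, it travels like any bearer token. A minimal stdlib sketch of building such a request — the helper name and the `sk-crab-example` key are placeholders of ours, not part of CrabDash:

```python
import json
import urllib.request

GATEWAY = "http://127.0.0.1:5635"

def build_chat_request(virtual_key=None):
    """Build a chat-completions request; attach a virtual key only if one is set."""
    body = json.dumps({
        "model": "gpt-4o",
        "messages": [{"role": "user", "content": "Hello"}],
    }).encode()
    headers = {"Content-Type": "application/json"}
    if virtual_key:
        headers["Authorization"] = f"Bearer {virtual_key}"
    return urllib.request.Request(
        f"{GATEWAY}/v1/chat/completions", data=body, headers=headers
    )

# "sk-crab-example" is a placeholder, not a real key.
req = build_chat_request("sk-crab-example")
```

Sending `req` with `urllib.request.urlopen(req)` only succeeds while CrabDash is running.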
## Supported routes
| Route | Description |
|---|---|
| `GET /v1/models` | List available models across all configured providers |
| `POST /v1/chat/completions` | OpenAI-format chat completions (streaming and non-streaming) |
| `POST /anthropic/v1/messages` | Anthropic-format messages (streaming and non-streaming) |
| `GET /docs` | OpenAPI documentation for the full gateway API |
For the complete API reference, open http://127.0.0.1:5635/docs in your browser while CrabDash is running, or see the CrabLLM API docs.
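Streaming responses on these routes arrive as server-sent events. As a rough sketch of what a client does with the `data:` lines — the chunk shapes below assume the OpenAI format, and `parse_sse_chunks` is an illustrative helper, not a CrabDash API:

```python
import json

def parse_sse_chunks(lines):
    """Yield parsed JSON payloads from SSE lines, stopping at the [DONE] sentinel."""
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines and comments
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break
        yield json.loads(payload)

# Example chunks in the OpenAI streaming shape (assumed, for illustration):
sample = [
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo"}}]}',
    "data: [DONE]",
]
text = "".join(c["choices"][0]["delta"]["content"] for c in parse_sse_chunks(sample))
```

In practice the SDK examples below handle this parsing for you; the sketch is only for tools that read the raw stream.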
## Usage with SDKs
### Python (OpenAI SDK)

```python
from openai import OpenAI

client = OpenAI(
    base_url="http://127.0.0.1:5635/v1",
    api_key="unused",
)

response = client.chat.completions.create(
    model="claude-sonnet-4-20250514",
    messages=[{"role": "user", "content": "Hello"}],
)
```

### Python (Anthropic SDK)
```python
import anthropic

client = anthropic.Anthropic(
    base_url="http://127.0.0.1:5635/anthropic",
    api_key="unused",
)

message = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello"}],
)
```

### Node.js (OpenAI SDK)
```javascript
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "http://127.0.0.1:5635/v1",
  apiKey: "unused",
});

const response = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello" }],
});
```

### Node.js (Anthropic SDK)
```javascript
import Anthropic from "@anthropic-ai/sdk";

const client = new Anthropic({
  baseURL: "http://127.0.0.1:5635/anthropic",
  apiKey: "unused",
});

const message = await client.messages.create({
  model: "claude-sonnet-4-20250514",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Hello" }],
});
```

### curl (OpenAI format)
```bash
curl http://127.0.0.1:5635/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gemini-2.0-flash",
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": true
  }'
```

### curl (Anthropic format)
```bash
curl http://127.0.0.1:5635/anthropic/v1/messages \
  -H "Content-Type: application/json" \
  -H "x-api-key: unused" \
  -d '{
    "model": "claude-sonnet-4-20250514",
    "max_tokens": 1024,
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```

## Pointing apps at CrabDash
Any app that lets you set a custom base URL works with CrabDash. Use whichever format the app expects:
- OpenAI-format apps: set the base URL to `http://127.0.0.1:5635/v1`
- Anthropic-format apps: set the base URL to `http://127.0.0.1:5635/anthropic`
Common examples:
- Continue (VS Code): set `apiBase` to `http://127.0.0.1:5635/v1` in your model config
- Cursor: Settings → Models → OpenAI API Base → `http://127.0.0.1:5635/v1`
- aider: `aider --openai-api-base http://127.0.0.1:5635/v1`
- Claude Code: `ANTHROPIC_BASE_URL=http://127.0.0.1:5635/anthropic claude`
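If an app can't connect, first confirm the gateway is actually listening. A small stdlib probe — the helper is ours, not part of CrabDash:

```python
import urllib.error
import urllib.request

def gateway_up(base="http://127.0.0.1:5635"):
    """Return True if the gateway answers GET /v1/models, False otherwise."""
    try:
        with urllib.request.urlopen(base + "/v1/models", timeout=2) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```

`gateway_up()` returns `True` only while CrabDash is running; a `False` result usually means the app should not be pointed at the gateway yet.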
## Model routing
When multiple providers serve the same model, CrabDash routes using weighted random selection with automatic failover. Configure weights and routing rules in Settings → Routing.
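As an illustration only — this is not CrabDash's actual code, and the provider names and `call` signature are assumptions — weighted random selection with failover amounts to:

```python
import random

def provider_order(providers, weights, rng=random):
    """Return providers in weighted-random order: a weighted pick for the
    primary, then repeat over the remainder to get the failover order."""
    order = []
    pool = list(zip(providers, weights))
    while pool:
        r = rng.uniform(0, sum(w for _, w in pool))
        for i, (provider, weight) in enumerate(pool):
            r -= weight
            if r <= 0:
                order.append(provider)
                pool.pop(i)
                break
    return order

def call_with_failover(providers, weights, call):
    """Try providers in weighted-random order until one succeeds."""
    last_err = None
    for provider in provider_order(providers, weights):
        try:
            return call(provider)
        except Exception as err:  # broad on purpose: any provider failure triggers failover
            last_err = err
    raise last_err
```

A higher weight makes a provider more likely to be tried first, but every configured provider remains reachable through failover.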
For full routing configuration details, see CrabLLM Configuration.