CrabTalk

Providers

Supported LLM providers, translation types, and endpoint support matrix.

A provider is an LLM service that CrabLLM routes requests to. Each provider has its own API format and authentication mechanism. CrabLLM translates between the OpenAI-compatible format your application uses and the provider's native format.
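As a rough illustration of what "full translation" involves (this is a simplified sketch based on the public Anthropic Messages API, not CrabLLM's actual translation layer): the Anthropic format takes the system prompt as a top-level `system` field rather than a message, and requires `max_tokens`.

```python
def openai_to_anthropic(req):
    """Sketch: translate an OpenAI-format chat request to Anthropic
    Messages format. Illustrative only; real translation also covers
    tools, streaming, stop sequences, and response mapping."""
    # Anthropic takes the system prompt as a top-level field, not a message.
    system = "".join(
        m["content"] for m in req["messages"] if m["role"] == "system"
    )
    out = {
        "model": req["model"],
        # max_tokens is required by the Anthropic API; default is arbitrary here.
        "max_tokens": req.get("max_tokens", 1024),
        "messages": [m for m in req["messages"] if m["role"] != "system"],
    }
    if system:
        out["system"] = system
    return out
```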

Supported providers

Kind        Provider                                                  Translation
openai      OpenAI, Groq, Together, vLLM, any OpenAI-compatible API   Pass-through
anthropic   Anthropic Messages API                                    Full translation
google      Google Gemini                                             Full translation
azure       Azure OpenAI                                              URL + auth rewrite
bedrock     AWS Bedrock Converse API                                  Full translation + SigV4 signing
ollama      Ollama (local models)                                     Pass-through (OpenAI-compatible)

Common fields

Every provider supports these fields:

[providers.name]
kind = "..."           # required
api_key = "..."        # API key (supports ${ENV_VAR})
base_url = "..."       # base URL override
models = ["..."]       # model names this provider serves
weight = 1             # routing weight (higher = more traffic)
max_retries = 2        # retries on transient errors (429, 5xx)
timeout = 30           # per-request timeout in seconds
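For example, a filled-in provider entry might look like this (the provider name, model name, and environment variable are illustrative):

```toml
[providers.anthropic_main]
kind = "anthropic"
api_key = "${ANTHROPIC_API_KEY}"   # resolved from the environment at startup
models = ["claude-sonnet"]
max_retries = 3
timeout = 60
```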

Multiple providers for the same model

When multiple providers list the same model, CrabLLM selects between them using weighted random selection. If the selected provider fails, CrabLLM falls back to the remaining providers in descending weight order. See Routing.

[providers.openai_primary]
kind = "openai"
api_key = "${OPENAI_KEY_1}"
models = ["gpt-4o"]
weight = 3

[providers.openai_backup]
kind = "openai"
api_key = "${OPENAI_KEY_2}"
models = ["gpt-4o"]
weight = 1
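With the config above, openai_primary receives roughly 75% of traffic (weight 3 of a total 4). The selection step can be sketched as follows; this is an illustration of weighted random choice, not CrabLLM's actual implementation:

```python
import random

def pick_provider(providers):
    """Weighted random pick: a provider with weight 3 is selected
    three times as often as one with weight 1."""
    total = sum(p["weight"] for p in providers)
    r = random.uniform(0, total)
    for p in providers:
        r -= p["weight"]
        if r <= 0:
            return p
    return providers[-1]  # guard against floating-point edge cases

providers = [
    {"name": "openai_primary", "weight": 3},  # ~75% of requests
    {"name": "openai_backup", "weight": 1},   # ~25% of requests
]
```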

Endpoint support

Endpoint                OpenAI   Anthropic   Google   Azure   Bedrock   Ollama
Chat completions        yes      yes         yes      yes     yes       yes
Streaming               yes      yes         yes      yes     yes       yes
Embeddings              yes      -           yes      yes     -         -
Image generation        yes      -           -        yes     -         -
Audio speech            yes      -           -        yes     -         -
Audio transcription     yes      -           -        yes     -         -
Tool/function calling   yes      yes         yes      yes     yes       yes
