Providers

LLM providers supply the AI models that power your agents. Clawrium supports multiple providers, allowing you to choose the best model for your use case.

Supported Providers

Provider        Type         Best For
OpenAI          Cloud        GPT-4o, latest models
Anthropic       Cloud        Claude, long context
OpenRouter      Gateway      Multi-provider access
AWS Bedrock     Cloud        AWS ecosystem, compliance
Google Vertex   Cloud        Gemini, Google Cloud
ZAI / BigModel  Cloud        GLM models, China region
Ollama          Self-hosted  Local inference, privacy

Provider Comparison

Feature       OpenAI         Anthropic      Ollama
Setup         API key        API key        Server URL
Cost          Pay per token  Pay per token  Hardware only
Latency       Network        Network        Local/Network
Privacy       Cloud          Cloud          On-premise
Model Choice  Fixed list     Fixed list     Any pulled model

Quick Setup

Add a provider in one command:

# Cloud provider (OpenAI, Anthropic, etc.)
clm provider add my-openai --type openai

# Self-hosted (Ollama)
clm provider add local-llm --type ollama --url http://192.168.1.50:11434

Then assign it to an agent during onboarding:

clm agent configure <agent-name>
# Select provider during the providers stage

Managing Providers

# List all providers
clm provider list

# List provider types
clm provider types

# View available models for a provider type
clm provider types <type> models

# Edit a provider
clm provider edit <provider-name> --model <new-model>

# Remove a provider
clm provider remove <provider-name>
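
The commands above compose into a simple workflow. The sketch below strings them together for a hypothetical OpenAI provider named my-openai; the provider name and model are illustrative, so substitute your own.

# Discover which provider types this installation supports
clm provider types

# Inspect the models available for a type before choosing one
clm provider types openai models

# Add the provider, then switch it to a different model later
clm provider add my-openai --type openai
clm provider edit my-openai --model gpt-4o

# Confirm the provider is registered
clm provider list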

Security

  • API keys are stored securely (not in plain text)
  • Keys are never logged or displayed in full
  • Per-provider isolation

Troubleshooting

"Provider connectivity failed"

  • Verify API key is valid
  • Check network connectivity from the host
  • Ensure provider service is operational
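
For a cloud provider, the first two checks can be done from the shell. The sketch below probes the OpenAI models endpoint and assumes the key is exported in an OPENAI_API_KEY environment variable (an assumption, not a Clawrium convention). A 200 status means the key and network path are good; 401 points to an invalid or expired key.

# Print only the HTTP status code of an authenticated request
curl -sS https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -o /dev/null -w "%{http_code}\n"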

"No models available" (Ollama)

  • SSH to the Ollama host
  • Run ollama list to verify models are pulled
  • Pull missing models with ollama pull <model-name>
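
The Ollama checks above can be run in one pass. The host address, user, and model name below are placeholders; replace them with your own values.

# Connect to the Ollama host (placeholder address)
ssh user@192.168.1.50

# On the host: list the models that are already pulled
ollama list

# Pull a model if the list is empty (llama3 is an example name)
ollama pull llama3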

See individual provider pages for detailed setup instructions.