# Providers
LLM providers supply the AI models that power your agents. Clawrium supports multiple providers, allowing you to choose the best model for your use case.
## Supported Providers
| Provider | Type | Best For |
|---|---|---|
| OpenAI | Cloud | GPT-4o, latest models |
| Anthropic | Cloud | Claude, long context |
| OpenRouter | Gateway | Multi-provider access |
| AWS Bedrock | Cloud | AWS ecosystem, compliance |
| Google Vertex | Cloud | Gemini, Google Cloud |
| ZAI / BigModel | Cloud | GLM models, China region |
| Ollama | Self-hosted | Local inference, privacy |
## Provider Comparison
| Feature | OpenAI | Anthropic | Ollama |
|---|---|---|---|
| Setup | API key | API key | Server URL |
| Cost | Pay per token | Pay per token | Hardware only |
| Latency | Network | Network | Local/Network |
| Privacy | Cloud | Cloud | On-premise |
| Model Choice | Fixed list | Fixed list | Unlimited |
## Quick Setup
Add a provider in one command:
```bash
# Cloud provider (OpenAI, Anthropic, etc.)
clm provider add my-openai --type openai

# Self-hosted (Ollama)
clm provider add local-llm --type ollama --url http://192.168.1.50:11434
```
Then assign it to an agent during onboarding:
```bash
clm agent configure <agent-name>
# Select the provider during the providers stage
```
## Managing Providers
```bash
# List all providers
clm provider list

# List provider types
clm provider types

# View available models for a provider type
clm provider types <type> models

# Edit a provider
clm provider edit <provider-name> --model <new-model>

# Remove a provider
clm provider remove <provider-name>
```
## Security
- API keys are stored securely (not in plain text)
- Keys are never logged or displayed in full
- Per-provider isolation
## Troubleshooting
### "Provider connectivity failed"
- Verify API key is valid
- Check network connectivity from the host
- Ensure provider service is operational
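To separate network failures from auth failures, it can help to check plain TCP reachability first: if the socket connects but the provider still rejects requests, the API key is the more likely culprit. A minimal probe sketch (assuming Python is available on the host; the URL in the example comment is the Ollama address from Quick Setup):

```python
"""Quick TCP reachability probe for a provider endpoint."""
import socket
from urllib.parse import urlparse


def endpoint_of(url: str) -> tuple[str, int]:
    """Extract (host, port) from a provider URL, defaulting the port by scheme."""
    parsed = urlparse(url)
    port = parsed.port or (443 if parsed.scheme == "https" else 80)
    return parsed.hostname, port


def is_reachable(url: str, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to the endpoint succeeds within timeout."""
    host, port = endpoint_of(url)
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


# Example: is_reachable("http://192.168.1.50:11434") returns True only
# when the Ollama server from Quick Setup is up and accepting connections.
```

If the probe succeeds but `clm provider add` still reports a connectivity failure, re-check the API key and the provider's status page before suspecting the network.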
### "No models available" (Ollama)
- SSH to the Ollama host
- Run `ollama list` to verify models are pulled
- Pull any missing models: `ollama pull <model-name>`
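When SSH access is inconvenient, the same check can be made remotely: Ollama exposes `GET /api/tags`, which returns the models pulled on that server. A minimal sketch (the host URL in the example comment is an assumption, matching the Quick Setup example):

```python
"""List models pulled on a remote Ollama server via its HTTP API."""
import json
from urllib.request import urlopen


def parse_tags(payload: dict) -> list[str]:
    """Extract model names from an /api/tags response body."""
    return [m["name"] for m in payload.get("models", [])]


def list_remote_models(base_url: str, timeout: float = 5.0) -> list[str]:
    """Fetch {base_url}/api/tags and return the names of pulled models."""
    with urlopen(f"{base_url}/api/tags", timeout=timeout) as resp:
        return parse_tags(json.load(resp))


# Example: list_remote_models("http://192.168.1.50:11434")
# An empty list means no models are pulled yet; run `ollama pull` on the host.
```

An empty result here corresponds exactly to the "No models available" error above.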
See individual provider pages for detailed setup instructions.