OpenClaw Model Providers: From Anthropic to Local Ollama
28 providers, hundreds of models. How to configure primary models, fallbacks, and local-first setups for privacy and cost control.
Tags: providers, models, ollama, anthropic, openai
Category: guide
Frequently Asked Questions
Which model provider should I start with?
Anthropic (Claude) or OpenAI (GPT-4) are the best starting points — both offer excellent tool-use support, which is critical for OpenClaw skills. Anthropic's Claude models tend to follow agent instructions more precisely, while OpenAI offers the widest ecosystem.
Can I use a local model with Ollama instead of paying for API access?
Yes, OpenClaw supports Ollama as a provider. You'll need sufficient hardware (8GB+ RAM for small models, 16GB+ for capable ones). Local models are free to run but currently lag behind cloud models in tool use reliability. Great for privacy-sensitive setups.
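A local setup might look like the following. This is a sketch only: the key names are illustrative, not the exact openclaw.yaml schema, though `http://localhost:11434` is Ollama's default local endpoint.

```yaml
# Sketch — key names are illustrative, not the exact openclaw.yaml schema.
# Pull a model locally first, e.g.: ollama pull llama3.1
provider:
  name: ollama
  base_url: http://localhost:11434   # Ollama's default local endpoint
  model: llama3.1                    # pick a model that fits your RAM
```

Smaller quantized models fit in 8GB of RAM; larger, more capable ones generally need 16GB or more.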
How do I set up model fallbacks?
Configure a primary model in your openclaw.yaml and add fallback providers. If the primary provider returns an error or times out, OpenClaw automatically routes to the fallback. This ensures your agent stays responsive even during provider outages.
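As a sketch, a primary-plus-fallbacks setup could be expressed like this — again, the key names here are illustrative assumptions, not the exact openclaw.yaml schema:

```yaml
# Sketch — key names are illustrative, not the exact openclaw.yaml schema.
model:
  primary: anthropic/claude-sonnet   # handles all requests by default
  fallbacks:
    - openai/gpt-4o                  # tried when the primary errors or times out
    - ollama/llama3.1                # local last resort if both clouds are down
```

Order matters: fallbacks are tried top to bottom, so put your most capable backup first and a local model last.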
What's OpenRouter and why would I use it?
OpenRouter is a unified gateway that gives you access to dozens of model providers through a single API key. It's useful for experimenting with different models without managing multiple accounts, and it handles routing, rate limits, and billing centrally.
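OpenRouter exposes an OpenAI-compatible chat completions endpoint, so any OpenAI-style client can talk to it. The sketch below builds such a request with only the standard library; the model id and the `OPENROUTER_API_KEY` environment variable are illustrative assumptions.

```python
import json
import os
import urllib.request

# OpenRouter's OpenAI-compatible chat completions endpoint.
OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(model: str, prompt: str, api_key: str) -> urllib.request.Request:
    """Build (but do not send) a chat completion request for OpenRouter."""
    payload = {
        "model": model,  # e.g. "anthropic/claude-3.5-sonnet" (illustrative id)
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        OPENROUTER_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",  # one key for every provider
            "Content-Type": "application/json",
        },
    )

req = build_request(
    "anthropic/claude-3.5-sonnet",
    "Hello!",
    os.environ.get("OPENROUTER_API_KEY", "sk-test"),
)
# Only send once a real key is configured:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

The same one key and endpoint work whichever provider's model you name in the `model` field, which is the whole appeal.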
How much does typical agent usage cost per month?
For personal use with Claude or GPT-4, expect $10-40/month depending on how chatty you are and how many automated tasks run. Using smaller models for routine tasks and reserving top-tier models for complex work can cut costs significantly.
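To see where that range comes from, here is a back-of-the-envelope estimate. The per-token prices are assumptions for a mid-tier cloud model, not current quotes — check your provider's pricing page:

```python
# Illustrative monthly cost estimate. Prices below are assumed
# figures for a mid-tier cloud model, not current provider quotes.
PRICE_IN_PER_MTOK = 3.00    # USD per million input tokens (assumed)
PRICE_OUT_PER_MTOK = 15.00  # USD per million output tokens (assumed)

def monthly_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate monthly spend in USD from token volumes."""
    return (input_tokens * PRICE_IN_PER_MTOK
            + output_tokens * PRICE_OUT_PER_MTOK) / 1_000_000

# A moderately active personal agent: ~2M input, ~0.5M output tokens/month.
print(round(monthly_cost(2_000_000, 500_000), 2))  # → 13.5
```

Routing routine tasks to a cheaper model shrinks the input-token term, which usually dominates for agents that re-read long context on every run.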