Claude Code with Local Ollama Models Setup
Detailed guide on connecting Claude Code to free local Ollama models using Anthropic API compatibility, including specific model recommendations.
I stopped paying for Claude Code. $200/month for an API subscription to write code. Then Ollama dropped Anthropic API compatibility. Now Claude Code connects to free, local models on my machine.

Here's the exact setup (took me 10 minutes):

1. Install Ollama → curl -fsSL https://ollama.com/install.sh | sh
2. Pull a model → ollama pull qwen2.5-coder
3. Point Claude Code at localhost → ANTHROPIC_BASE_URL=http://localhost:11434 claude

That's it. Claude Code thinks it's talking to Anthropic. It's talking to your laptop.

Best models I've tested:

• Qwen 2.5 Coder - best all-around for code generation
• DeepSeek-Coder - strongest at debugging and refactoring
• Llama 3 - solid general reasoning

Local models aren't Sonnet or Opus. Complex multi-file refactors still stumble. Long context windows get messy. But for everyday coding - scaffolding, tests, quick edits, boilerplate - they handle it fine.

Your code never leaves your machine. Your bill goes from $200/month to $0. Your API key stays in your pocket.

(Save this for later.)
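If the redirect works for you, it can be made permanent with a couple of shell exports instead of prefixing every invocation. A minimal sketch, assuming bash or zsh; the dummy ANTHROPIC_AUTH_TOKEN value is an assumption on my part - a local Ollama server doesn't check credentials, but Claude Code may still expect some token to be set:

```shell
# Add to ~/.bashrc or ~/.zshrc to make the redirect persistent.
# ANTHROPIC_BASE_URL points Claude Code at the local Ollama server.
export ANTHROPIC_BASE_URL=http://localhost:11434

# Placeholder credential (assumption): Ollama ignores it locally,
# but setting something avoids a missing-key complaint if Claude Code checks.
export ANTHROPIC_AUTH_TOKEN=ollama-local
```

After sourcing your shell profile, running plain `claude` uses the local model with no per-command prefix.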
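Before launching Claude Code, it can help to confirm the local endpoint answers at all. The sketch below only builds and prints an Anthropic-style Messages request body; the commented-out curl line, and the /v1/messages path it assumes Ollama's compatibility layer exposes, are there to try against your own running server:

```shell
# Minimal Anthropic Messages API request body (model/max_tokens/messages
# are the standard fields of that API).
BODY='{"model":"qwen2.5-coder","max_tokens":64,"messages":[{"role":"user","content":"Say hello"}]}'
echo "$BODY"

# To exercise a running Ollama server (path is an assumption about
# Ollama's Anthropic-compatibility layer), uncomment:
# curl -s http://localhost:11434/v1/messages \
#   -H "content-type: application/json" \
#   -d "$BODY"
```

If the curl returns a JSON response with generated text rather than a 404, Claude Code should work against the same base URL.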
Tags: claude-code, ollama, local-models, api-compatibility