If you’ve tried running a local model through Ollama with Claude Code and been greeted by this message: “There’s an issue with the selected model (qwen3-coder:30b). It may not exist or you may not have access to it. Run /model to pick a different model.” …even though the model is clearly installed and runs fine […]

Read More →
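Before digging into Claude Code itself, it’s worth confirming the Ollama side is healthy. A minimal sketch of the checks, assuming Ollama is installed and its server runs on the default port 11434 (the model tag qwen3-coder:30b matches the error message above):

```shell
# Confirm the model is actually installed locally
ollama list

# Confirm the Ollama server is up and can enumerate models over its API
curl -s http://localhost:11434/api/tags

# Confirm the model loads and responds on its own, outside Claude Code
ollama run qwen3-coder:30b "Say hello"
```

If all three succeed, the model itself is fine and the problem lies in how Claude Code is configured to reach Ollama, which is what the post goes on to cover.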

DeepSeek, a powerful open-source LLM, can be run locally on your desktop or laptop using Ollama; I’m using an M1 MacBook Pro with 32 GB of RAM. Ollama simplifies running large language models, handling dependencies and providing a consistent interface. This guide will walk you through installing DeepSeek via Ollama, making it accessible with just a […]

Read More →
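The core of the workflow the post describes fits in a few commands. A minimal sketch for macOS, assuming Homebrew is available (the installer from ollama.com works too) and using the deepseek-r1:7b tag as an example — pick a tag sized for your machine’s RAM:

```shell
# Install Ollama (macOS via Homebrew; Linux/Windows installers exist as well)
brew install ollama

# Start the Ollama server if it isn't already running as a background service
ollama serve &

# Download a DeepSeek model; the 7b tag is an assumption — larger tags need more RAM
ollama pull deepseek-r1:7b

# Chat with the model interactively from the terminal
ollama run deepseek-r1:7b
```

On a 32 GB machine the mid-size tags run comfortably; the pull step is the slow part, since model weights are several gigabytes.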