Privacy is becoming a luxury in the AI world. If you’re tired of sending your data to the cloud every time you ask a question, running a model locally is the answer. Today, we’re looking at Qwen 3.5 9B—a powerhouse model from Alibaba—and how to get it running on your own machine using Ollama. Whether you’re a […]

Read More →

If you’ve ever tried to train a machine learning model, or just wondered why your computer fans start screaming when you open too many Chrome tabs, you’ve probably run into the alphabet soup of processors: CPU, GPU, and TPU. They all “process” things, but they do so in fundamentally different ways. Choosing the […]

Read More →

Cursor IDE integrates several AI models to enhance the coding experience. Here’s an overview of the available models and their capabilities. Among the OpenAI models available in Cursor are GPT-3.5 Turbo, a cost-effective model for general coding assistance with good performance across most programming tasks, and GPT-4, more powerful than GPT-3.5, with better reasoning capabilities and code […]

Read More →

The AI Toolkit for Visual Studio Code is a game-changer for developers and AI engineers looking to simplify the creation of AI applications. This powerful extension streamlines the entire process, from development and testing to deployment, by integrating seamlessly with generative AI models both locally and in the cloud. Supporting a wide range of commercially […]

Read More →

DeepSeek, a powerful open-source LLM, can easily be run locally on your desktop or laptop using Ollama. I’m using an M1 MacBook Pro with 32GB of RAM. Ollama simplifies the process of running large language models, handling dependencies and providing a consistent interface. This guide will walk you through installing DeepSeek via Ollama, making it accessible with just a […]

Read More →