How to run free AI models on your studio computer
Stop paying for AI, use your production hardware instead.
If you are not yet a subscriber to The Fanbase Builder, join 1,000+ artists, creators, and music industry executives who receive it for free.
Let’s dive into today’s topic:
How to run free AI models on your studio computer
Most musicians with powerful studio computers are perfectly equipped to run AI locally.
Why it matters
Earlier, I recommended that artists navigate AI beyond the hype and build ‘AI intuition’. But experimenting with AI isn’t cheap. For example, basic access to Claude Code or Cowork costs about €18 a month, which can be a significant investment for emerging artists.
However, most musicians have a significant advantage over other AI enthusiasts. If you produce music, your laptop or desktop likely has a high-end CPU, ample RAM, and perhaps a dedicated GPU. You are perfectly equipped to run powerful open-source AI models locally on your laptop, desktop, or even smartphone, entirely for free.
How it works
Local models aren’t on the same level as top-tier subscription models (Claude, Gemini, ChatGPT, etc.), but new models from Alibaba (Qwen3.6) and Google (Gemma 4), both released this April, are catching up at an astonishing rate. For everyday tasks like brainstorming content or drafting emails, they are more than capable.
The best part: Absolute privacy, with exactly zero recurring costs.
It’s quite simple to install your first local AI model:
Download an interface: Tools like Ollama act as your host software.
Download a model within that interface.
Start prompting: Once loaded, you can interact with the AI entirely offline. No data ever leaves your computer.
Here’s what I did:
I opened Terminal on my Mac and installed Ollama by pasting:

curl -fsSL https://ollama.com/install.sh | sh

Next, I installed Qwen-3.5:

ollama run qwen3.5

After downloading and installing, I could immediately chat with the AI.
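Beyond the interactive chat, Ollama also exposes a local HTTP API on port 11434, which is handy if you want to script prompts rather than type them. A minimal sketch, assuming the Ollama server is running and you have pulled a model tagged qwen3.5 (swap in whichever model name you actually installed):

```shell
# Send one prompt to the local Ollama server and print only the reply text.
# Nothing leaves your machine: this talks to localhost, not the cloud.
curl -s http://localhost:11434/api/generate \
  -d '{
        "model": "qwen3.5",
        "prompt": "Give me three content ideas for an independent artist on tour.",
        "stream": false
      }' \
  | python3 -c 'import json, sys; print(json.load(sys.stdin)["response"])'
```

Setting "stream" to false returns one complete JSON object instead of a token-by-token stream, which keeps the extraction step simple.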
Some ideas to get started:
Because local AI is 100% private, you can safely upload a contract or a split sheet agreement and ask the AI to explain in plain English what rights you’re giving up.
Like Claude Cowork, you can ask a local model to help you organise your chaotic downloads folder.
Use it as a sounding board for your music and lyrics without worrying about whether they will be used to train the AI.
Of course, you can still ask a local AI to help you write your emails and social media captions.
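For the contract idea above, you don’t even have to paste the text by hand: the Ollama CLI accepts piped input, so the document never leaves your machine. A sketch, assuming a model tagged qwen3.5 and a file called split-sheet.txt (both placeholders for your own model and file):

```shell
# Pipe a local document into the model along with an instruction.
# The file is read locally and combined with the quoted prompt.
cat split-sheet.txt | ollama run qwen3.5 \
  "Explain in plain English what rights I am giving up in this agreement."
```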
Yes, but…
I’m running the previous Qwen version, Qwen-3.5:9b, and it’s painfully slow on my last-generation Intel-based MacBook Pro. To run the latest decent models, you still need good, up-to-date hardware. And that’s before mentioning the 6.6GB download.
My laptop is better suited to a lighter model, like the ultra-lightweight Qwen-3.5:0.8b (a 1GB download), which can even run on a smartphone. It’s very capable for its size, but noticeably less powerful than the 9b version.
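If you’re unsure which model sizes your machine can handle, Ollama lets you list what’s installed, pull lighter variants, and remove ones you no longer need. A sketch, assuming the Qwen tags mentioned above exist in the model registry (exact tag names may differ, so check before pulling):

```shell
# See which models are installed locally and how much disk each one uses.
ollama list

# Pull the ultra-light variant mentioned above (roughly a 1GB download).
ollama pull qwen3.5:0.8b

# Remove a heavier model you no longer need to reclaim disk space.
ollama rm qwen3.5:9b
```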
Take action now
If you have a decent laptop or desktop for running your music production software, give a local AI model a try. Approach it as a fun, geeky project to further experiment with AI and build AI intuition, rather than trying to embed it in your daily workflows immediately.
Your thoughts
Further reading
Qwen3.6-35B-A3B on my laptop drew me a better pelican than Claude Opus 4.7 (Simon Willison)
Google’s Gemma 4 finally made me care about running local LLMs (XDA)
Navigating AI beyond the hype in 2026 (The Fanbase Builder)