/u/Future-AI-Dude

Thoughts on Ollama

Saw a post mentioning gpt-oss:20b and looked into what it would take to run it locally. The post pointed to Ollama, so I downloaded and installed it, then pulled gpt-oss:20b. It seems to work OK. I don't have a blazing-fast desktop (Ryzen 7, 32G…
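For anyone curious about the same flow, here's roughly what it looks like on the command line once Ollama is installed (the model tag `gpt-oss:20b` is the one from the post; exact download size and speed will depend on your hardware and connection):

```shell
# Pull the model weights from the Ollama registry (one-time download)
ollama pull gpt-oss:20b

# Start an interactive chat session with the model
ollama run gpt-oss:20b

# Or list what's installed locally to confirm the pull worked
ollama list
```

On a machine without a big GPU, Ollama will fall back to CPU (and whatever partial GPU offload it can manage), which is usable but noticeably slower for a 20B-parameter model.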