Thoughts on Ollama

Saw a post mentioning gpt-oss:20b and looked into what it would take to run it locally. The post pointed to Ollama, so I downloaded and installed it, and it pulled gpt-oss:20b.

It seems to work OK. I don't have a blazing fast desktop (Ryzen 7, 32GB RAM, old GTX 1080 GPU), but it's running, albeit a little slowly.

Anyone else have opinions about it? I kind of (well, actually REALLY) like the idea of running it locally. Another question: is it "truly" running locally?
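One way to convince yourself it's local: Ollama runs a daemon that serves an HTTP API on localhost (port 11434 by default), and the model weights live on your disk. A rough sketch of checks you could try (assuming a default install; paths vary by OS):

```shell
# List models Ollama has pulled to local disk
ollama list

# Query the local API directly -- this is the same endpoint the CLI talks to
curl http://localhost:11434/api/tags

# The real test: disconnect from the internet, then run a prompt.
# If it still answers, inference is happening on your machine.
ollama run gpt-oss:20b "Say hello in one sentence."
```

The slowness you're seeing is expected: a 20B-parameter model won't fit in a GTX 1080's 8GB of VRAM, so Ollama offloads most layers to CPU and system RAM, which is much slower than full-GPU inference.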

submitted by /u/Future-AI-Dude