/u/InsideResolve4517

I built a VS Code extension, "Knowivate Autopilot (beta)", which can create and edit files, add context, include project structure, and more. I'm still working on it, and it uses a local LLM.

If you are a programmer and have Ollama and a local LLM installed, keep reading; otherwise skip this. I am continuously working on a completely offline VS Code extension, and my goal is to add agent-mode capabilities using local LLMs. So I started buildin…

Best Ollama (CPU-only) model for AMD Ryzen 5 5600G + 48 GiB RAM?

Hi everyone, I've got a local dev box with:

- OS: Linux 5.15.0-130-generic
- CPU: AMD Ryzen 5 5600G (12 threads)
- RAM: 48 GiB total
- Disk: 1 TB NVMe SSD + 1 old HDD
- GPU: AMD Radeon (no NVIDIA/CUDA)

I have Ollama installed, and currently I have 2 local LLMs install…
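As a rough way to narrow down which model sizes fit in 48 GiB of RAM for CPU-only inference, a common rule of thumb is: weight memory ≈ parameter count × bits per weight ÷ 8, plus some overhead for the KV cache and runtime. The sketch below uses assumed bits-per-weight figures (e.g. ~4.8 for a Q4_K_M-style quant) and a fixed overhead; real usage varies with context length and quantization scheme.

```python
def approx_ram_gb(params_billions: float, bits_per_weight: float,
                  overhead_gb: float = 1.5) -> float:
    """Rough RAM estimate for a quantized local model (sketch):
    weights in GB plus a fixed overhead for KV cache / runtime.
    Both bits_per_weight and overhead_gb are assumptions, not
    measurements."""
    return params_billions * bits_per_weight / 8 + overhead_gb

# Assumed ~4.8 bits/weight for a Q4_K_M-style quant (approximation).
for params in (7, 14, 32):
    print(f"{params}B @ ~4.8 bpw: ~{approx_ram_gb(params, 4.8):.1f} GB")
```

By this estimate a 7B quant needs roughly 6 GB and even a 32B quant stays around 21 GB, so RAM is unlikely to be the bottleneck on this box; on a CPU-only Ryzen 5 5600G, tokens-per-second throughput is the more limiting factor for larger models.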