/u/AdditionalWeb107

I think small LLMs are underrated and overlooked. They offer exceptional speed without compromising performance.

In the race for ever-larger models, it's easy to forget just how powerful small LLMs can be—blazingly fast, resource-efficient, and surprisingly capable. I am biased, because my team builds these small open source LLMs – but the potential to creat…