/u/MasterDisillusioned

Most consumers will never have the resources to run large language models locally

I've seen some people insist that we eventually won't need to pay for subscriptions to services like ChatGPT because phones will be powerful enough to run them locally. I disagree. To run LLMs locally, you need 100+GB of RAM, and even t…
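
To put the RAM claim in context, here's a rough back-of-the-envelope sketch in Python. The parameter counts and quantization levels are illustrative assumptions, not figures from the post or specs of any ChatGPT-class model; the point is just that weight memory scales as parameter count times bytes per parameter, before you even count KV cache and runtime overhead.

    # Rough estimate of the RAM needed just to hold an LLM's weights,
    # ignoring KV cache, activations, and runtime overhead.
    # All model sizes below are hypothetical examples.

    PRECISION_BYTES = {
        "fp16": 2.0,   # half precision, common for inference
        "int8": 1.0,   # 8-bit quantization
        "int4": 0.5,   # 4-bit quantization, typical for local runners
    }

    def weights_ram_gb(params_billions: float, precision: str) -> float:
        """Approximate GB of memory to load the raw weights."""
        return params_billions * 1e9 * PRECISION_BYTES[precision] / 1024**3

    for params in (7, 70, 400):  # small / mid / frontier-scale (assumed)
        for prec in ("fp16", "int8", "int4"):
            print(f"{params:>4}B @ {prec}: ~{weights_ram_gb(params, prec):,.0f} GB")

On these assumptions, a 7B model at 4-bit fits in a few GB (hence phones can run small models), but a 70B model at fp16 is already around 130 GB, and a 400B-scale model needs well over 100 GB even aggressively quantized, which is roughly where the 100+GB figure comes from.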