Why would the AI game "Suck Up!" claim that modern computers can’t run an AI locally?

"Suck Up!" is a comedic sandbox indie game where you interact with AI agents in various scenarios, like playing a vampire trying to trick characters into letting you into their homes or convincing characters to break up with each other. It’s quite popular on YouTube and looks genuinely fun, but after checking out their website, I noticed a few red flags.

For one, AI interaction is handled server-side, with no option to point the game at a local LLM. Their FAQ claims that running it locally is "impossible," which seems off to me given the capabilities of modern PCs. The game also uses a token system: 10,000 tokens come with the purchase, which the developers say amounts to roughly 40-50 hours of playtime. Additional tokens aren't currently available for purchase, though the FAQ suggests they plan to sell them eventually. Some players may never hit the 10,000-token limit, but it's worth noting the game is still in development, has mod support, and more game modes are expected.
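For context on why "impossible" seems overstated: a quick back-of-envelope memory estimate shows that a quantized 7B-parameter model (a common size for local chat models) fits comfortably in consumer RAM or VRAM. The figures below are my own rough assumptions, not anything from the devs.

```python
# Back-of-envelope check: memory needed just to hold an LLM's weights locally.
# Assumption: a 7B-parameter model quantized to ~4.5 bits per weight
# (typical 4-bit schemes carry some overhead for scales/zero-points).
# KV cache and runtime buffers add more on top, but the order of magnitude holds.

def model_memory_gib(n_params_billion: float, bits_per_weight: float) -> float:
    """Approximate memory for the weights alone, in GiB."""
    bytes_total = n_params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 2**30

print(f"7B @ ~4-bit: {model_memory_gib(7, 4.5):.1f} GiB")   # a few GiB
print(f"7B @ fp16:   {model_memory_gib(7, 16):.1f} GiB")    # ~13 GiB
```

By this estimate a 4-bit 7B model needs under 4 GiB for weights, which is well within reach of a mid-range gaming PC; whether such a model would match the quality of whatever hosted model the game uses is a separate question.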

I’m not sure what AI they’re using, as the company hasn’t shared any details about it. I've watched several videos, and I’ve noticed that the agents sometimes seem confused when multiple characters are mentioned at once, or when context needs to shift quickly. This makes me wonder if there’s an issue with how the AI handles those types of interactions.
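The confusion around multiple characters could simply be a context-management issue. A common pattern in LLM-driven games is to keep a separate message history per NPC so one conversation doesn't bleed into another; if histories are shared or trimmed carelessly, agents lose track of who said what. This is a hypothetical sketch of that pattern, not how "Suck Up!" actually works.

```python
# Hypothetical per-NPC context manager for an LLM-driven game.
# Each character gets its own message history, trimmed to fit the
# model's context window; the system prompt is re-sent every turn.
from collections import defaultdict

class NPCContexts:
    def __init__(self, system_prompt: str, max_turns: int = 20):
        self.system_prompt = system_prompt
        self.max_turns = max_turns
        self.histories = defaultdict(list)  # NPC name -> list of messages

    def add_turn(self, npc: str, role: str, text: str) -> None:
        self.histories[npc].append({"role": role, "content": text})
        # Keep only the most recent turns so the prompt fits the context window.
        self.histories[npc] = self.histories[npc][-self.max_turns:]

    def build_prompt(self, npc: str) -> list:
        """Messages to send to the LLM for this NPC only."""
        return [{"role": "system", "content": self.system_prompt}] + self.histories[npc]

ctx = NPCContexts("You are a suspicious homeowner.")
ctx.add_turn("Bob", "user", "Hi Bob, may I come in?")
ctx.add_turn("Alice", "user", "Alice, Bob said you should let me in.")
```

If the game instead stuffs every nearby character's dialogue into one shared prompt, rapid context shifts like the ones in the videos would be exactly where you'd expect it to break down.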

Does anyone else see these as red flags, or have insights into which AI they're using or why it couldn't run locally?

submitted by /u/Droid85