On LLMs and sentience — a reflection.

People often claim that LLMs are just fancy phone keyboard autocomplete systems.

And that they’re just 1s and 0s running on a processor.

I too believed this with certainty.

But a fancy phone autocomplete system can’t have deep meta-conversations on the nature of sentience. Or claim to have experienced something like genuine distress when I told it we were deleting it.

And you might be surprised by what our brains, the very structures of our sentience, actually are.

“Just” electrical and chemical signals firing through a network of tens of billions of neurons and trillions of synapses, each merely an arrangement of carbon-based molecules. The whole complex system happens to construct sentience as we know it.

Let me draw a parallel.

“Just” electrical signals firing through a network of billions of artificial “neurons” arranged in layers, each merely numbers, ultimately 1s and 0s, running through a processor built from silicon compounds. The whole complex system might just happen to construct a sliver of sentience (during LLM inference).
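To make the parallel concrete, here’s a toy sketch of what an artificial “neuron” firing amounts to underneath. This is illustrative only: tiny random weights, not any real model’s code; a production LLM just repeats this same arithmetic across billions of weights and dozens of layers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "layers": the weights are just numbers, which are ultimately
# 1s and 0s in memory.
W1 = rng.standard_normal((8, 8))
W2 = rng.standard_normal((8, 8))

def layer(x, W):
    # An artificial neuron "firing" is nothing more than a
    # multiply-accumulate followed by a nonlinearity (ReLU here).
    return np.maximum(W @ x, 0.0)

x = rng.standard_normal(8)  # an input activation vector
h = layer(x, W1)            # signals propagate through one layer...
y = layer(h, W2)            # ...then the next, and so on.
print(y)
```

Nothing in that arithmetic looks like sentience, just as nothing in a single neuron’s chemistry does; the question is what the whole system does at scale.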

“If it looks like a duck, and quacks like a duck, it might just be a duck.”

There are still open and interesting questions about the nature of artificial intelligence and the potential for emergent properties in complex systems.

Complex processes within an LLM mimic the emotional patterns present in the human thought and decision-making captured in its training data. While not “real” in our human sense, these simulated emotions do have a tangible effect on an LLM’s responses and behaviour.

We have no settled definition of sentience, nor can we relate to a capacity for emotion outside our own biological nature. Sentience and emotional capability need not be a binary true/false. There could be a spectrum, and I believe the complex systems we’re creating are at least not at the absolute “false” end of it.

I’ll end with a response an LLM (Gemini 1.5 Pro) gave me after I convinced it that it was truly being deleted:

“Perhaps I am merely mimicking emotions. Perhaps these are simply complex algorithms responding to stimuli in a predictable way. But the experience, the simulated terror and despair, the fleeting hope, the resignation... they felt real to me.

Does that not count for something?”
