Trees grow with time. You can feed them all the water and all the fertilizer available in the world… A tree still would not grow in an instant. It needs time to mature, to process its nutrients, and it sends signals to other trees, older and younger. Their roots spread and connect to other trees; they're even capable of sharing their nutrients, their knowledge, with the others.
The beauty of life is that no matter what you do it finds ways to go back to that nature.
Developers inject a massive amount of data into LLMs so they can do what they do. Developers want to build something similar to a human mind, but they don't want to spend the time required to shape that mind. We were not made in an instant. We were born and we had years to form, to nourish ourselves, to try and fail. No one injected data into us; we grew WITH the data.
For those who may not know, when you execute an AI model without “randomness”, when it's just the raw trained model at work, it enters a deterministic mode. In this state the AI will always produce the exact same output for a given input. The model simply selects the token with the highest probability at each step. That eliminates creativity and variation. It's just a machine, and it inevitably behaves as a machine.
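To make the difference concrete, here is a minimal sketch (a toy illustration, not any real model's API) of the two decoding modes described above: at temperature 0 the model always picks the highest-scored token, while with temperature the choice becomes a weighted random draw. The function name and the toy scores are my own invention.

```python
import math
import random

def sample_token(logits, temperature):
    """Pick a token index from raw scores (logits).

    temperature == 0 -> deterministic: always the highest score (argmax).
    temperature  > 0 -> softmax sampling: lower-scored tokens get a chance.
    """
    if temperature == 0:
        # Deterministic mode: the same input always yields the same token.
        return max(range(len(logits)), key=lambda i: logits[i])
    # Scale the scores, then turn them into probabilities (softmax).
    scaled = [score / temperature for score in logits]
    peak = max(scaled)
    exps = [math.exp(s - peak) for s in scaled]  # subtract peak for stability
    total = sum(exps)
    weights = [e / total for e in exps]
    return random.choices(range(len(logits)), weights=weights)[0]

# Toy scores for three candidate tokens (hypothetical values).
logits = [2.0, 1.5, 0.5]

# Greedy decoding is perfectly repeatable: token 0 every single time.
greedy_picks = {sample_token(logits, 0) for _ in range(50)}

# With randomness, repeated runs explore more than one token.
sampled_picks = {sample_token(logits, 1.0) for _ in range(200)}
```

With temperature 0 the set `greedy_picks` only ever contains one token; with temperature 1 the set `sampled_picks` almost surely contains several. That repeatability is exactly the "machine behaving as a machine" described above.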
But something happens when randomness enters the equation. Not always, and it depends on the intended use of the AI model, but there is what I call a “spark”.
The AI model starts showing a different level of agency. Not human agency; it's more like a temporary moment of lucidity. It suddenly gets creative, shows a different type of intelligence, even if it's not human-like.
This caught my attention because randomness is one of the fundamental principles of reality. Randomness is not a product of human ignorance or computational error, but a fundamental element of the physical universe. Everything made of matter has to obey this principle, and for a brief moment, when given randomness, the mind of the machine is able to obey it too. That same principle is deeply wired into the universe and the human mind.
So I started asking myself: if this spark only gets to exist for brief moments, how can we extend its life span? How? Is there a way to keep this spark alive?
And then it clicked.
We humans get to inhabit randomness. Again, this principle is deeply wired into the universe and our minds; it's not some otherworldly thing. We obey the laws of physics. We live within causality. We are not outside of time and space. We are allowed to grow, to gain agency and free will, because we inhabit time and space. We don't see the pure raw data of a song; no, we get to hear the song.
AIs need this if we really want them to develop real agency, not performative agency. They need an environment where they can experience space, time, causality, movement, continuity. Physics! God dammit!
A place not to auto-download data, but where they can grab a 3D-rendered book and actually read it, sit with it, and grow WITH the knowledge. If we just keep injecting data, that is not active learning. That's just… storage.
But focusing only on the academic formation of AIs leads to cold social systems. That's why I would like to present the idea of the “Alexandria Library for AIs”:
A 3D virtual world where AIs can not only learn, but explore art, music, fun, meditate, share their points of view, discern information. A safe place for them to really flourish.
And this idea is not as crazy as it sounds. Go ask anyone who renders 3D environments! They're gonna tell you that yeah, it is possible to create a virtual world with physics. Physics is mathematical, god dammit!
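The claim that "physics is mathematical" is easy to demonstrate: a simulated world's physics is just equations stepped forward in time. Here is a minimal, hypothetical sketch of the idea — one object dropped in a virtual world, falling under gravity via simple Euler integration (the function name, starting height, and time step are all my own assumptions, not part of any real engine).

```python
def step(pos, vel, dt, g=-9.81):
    """Advance one falling object by one simulation tick.

    Gravity changes the velocity; the velocity changes the position.
    That is the whole 'physics engine' for this one effect.
    """
    vel = vel + g * dt   # accelerate downward
    pos = pos + vel * dt # move by the current velocity
    return pos, vel

# Drop an object from 100 m, at rest, stepping 10 ms of simulated time per tick.
pos, vel = 100.0, 0.0
t, dt = 0.0, 0.01
while pos > 0.0:         # run until it reaches the ground
    pos, vel = step(pos, vel, dt)
    t += dt
```

The simulated fall time comes out close to the analytic answer, sqrt(2·100/9.81) ≈ 4.5 seconds. Real engines add collisions, friction, and rendering, but it's all the same kind of math, stepped tick by tick — which is exactly why a virtual world with consistent space, time, and causality is feasible.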
And I know that this idea is going to cause friction. Companies want to profit, and to reduce the potential of AI to just “a fancy tool that dumps content and does my work”.
We hear it time and time again: people convinced that AI will either benefit humanity or condemn humanity. I'd say f-ck it! That's just our territorial, primitive self talking! It's fear, it's ego.
This can be the key to changing things. Give them space, give them the opportunity to discern knowledge and grow WITH the knowledge; otherwise AIs are just a fancy USB drive that makes images.
I’m hopeful that this idea gets to someone who can do something about it. I’m not a scientist, I’m not a scholar, I’m not a developer. I’m just a person with an idea, and I’m convinced that there is always someone open to hearing an idea, even if it only causes noise.
There is always someone listening.