I was thinking today about all the AI hype we have right now, with a bunch of new breakthroughs seemingly every month. But not only are the updates coming more slowly, their impact is also getting smaller. And if that weren't enough, we have big problems ahead: processors are reaching their physical limits, quantum effects are disrupting how transistors work, wafers are getting increasingly expensive, shrinking transistors no longer delivers the same gains in power, and new materials are still far from viable.
On top of this, we are going to hit two other walls: software and energy. On the software side, as we make better and more complex algorithms, it gets harder and harder to improve on them to squeeze out more performance and handle more complex tasks. The energy wall is becoming real as big data centers packed with GPUs draw more and more power to satisfy demand.
If this trend continues, we're going to reach a critical point really soon: a hard plateau. It all adds up to an ever-steeper path for making tech evolve as time passes, until we get stuck.
There is one more big problem with AGI: our AI models are basically pure-reasoning machines that operate through language. That by itself is a huge drawback, since they cannot gather information from the real world on their own, which means they depend on humans to make progress. It is a huge bottleneck. Anyone with some good knowledge of Kant knows his famous Critique of Pure Reason: you can't arrive at truth by pure reason alone, you need an empirical basis. So we are limited to what humans can observe, which is very slow. And whenever people try to use pure AI generation to create new training content, the output degenerates over time, because errors accumulate until the model destroys itself.
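That degradation claim (often called "model collapse") can be illustrated with a toy simulation. This is my own sketch, not taken from any specific paper, and the "model" here is just a Gaussian fit, not an LLM: each generation is trained only on samples drawn from the previous generation's model, so estimation error compounds and the learned distribution collapses.

```python
import random
import statistics

# Toy "model collapse" sketch (illustrative assumption, not a real LLM):
# the "model" is just a Gaussian summarized by (mean, std dev).
random.seed(0)

def fit(samples):
    # "Train" the model: estimate mean and standard deviation from data.
    return statistics.mean(samples), statistics.stdev(samples)

# Generation 0 trains on real-world data drawn from N(0, 1).
data = [random.gauss(0.0, 1.0) for _ in range(20)]
mu, sigma = fit(data)
initial_sigma = sigma

# Every later generation trains only on the previous model's own output.
for _ in range(1000):
    data = [random.gauss(mu, sigma) for _ in range(20)]
    mu, sigma = fit(data)

# Each refit loses a little variance on average, and the losses compound,
# so after many generations the distribution has collapsed toward a point.
print(f"std dev: generation 0 = {initial_sigma:.3f}, generation 1000 = {sigma:.3g}")
```

The mechanism is the point: no single generation is catastrophically wrong, but small fitting errors accumulate with nothing from the real world to correct them.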
So, to make a clearer statement, we can divide the process of knowledge into three simple steps: gather information, build concepts that link that information together, and apply those concepts to create new tools, technologies and sensors that gather more information or solve problems. This doesn't always happen in order, but as time passes and human knowledge becomes more and more complex, these three steps become the only path. Think about physics: with Newton you didn't need super advanced tools to make observations, but by Einstein's time measurements had to be extremely precise and complex. To summarize: with each new breakthrough concept, things get harder and more complex, and our sensors need to become better and more precise to enable the next advance.
This all comes down to science itself: we create concepts that explain things, but they always have limits in their application, and sooner or later we need better ones to keep improving. Science is basically a tool to solve our problems and understand things a little better each time. But, as you can suspect, each iteration becomes exponentially harder, because we can't capture reality itself. This implies that we are the limitation, and if we want to go further we need to build better sensors using machines; then machines and humans together can push a little more, until we reach the point where we can't make better theories because they have become too complex for us to continue. For perspective, imagine the energy, resources, processes and infrastructure needed to make all of this happen. We are talking about an insane amount of power, materials, machines and people, and even then we are limited by how large we can build computers, and by our ability to maintain them and supply their power needs.
To put it simply: better sensors lead to better information collection, which leads to better concepts, which lead to better tools and technologies. This cycle will be slow, and it will hit a ceiling that depends on the materials and resources we can get. The notion of AGI I am using here is a system that can make a genuinely new invention, like a working nuclear fusion reactor, propose a new scientific theory, or solve an unsolved math problem. At this, AI sucks, because it can only reach what we have already established as true. So the AI we have today is only useful for processing huge amounts of data that we could not handle ourselves. To really solve the problem, we would need to collect data from the entire world, from everyone on the planet, and even then hand it to an AI overlord to interpret it and build better tools for gathering better information, and so on.
Nuclear fusion is a pipe dream right now; it's very hard. Quantum computers not only need an insane amount of power, they also become ridiculously complex as the number of qubits increases, and they rely on superconductors that need temperatures close to absolute zero. So in the end I am in the Michio Kaku camp: AIs are dumb machines in the sense that they can't know what is true or not. For more theory behind my claims, look at Gödel's incompleteness theorems, which prove that no consistent formal system rich enough for arithmetic can be complete (and, by later results, such systems aren't decidable either). This means all our bodies of knowledge have gaps inside them, even if they keep getting better at explaining some things.
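For reference, here is a schematic statement of Gödel's first incompleteness theorem (my own paraphrase, with $T$ standing for any consistent, effectively axiomatized theory that includes basic arithmetic):

```latex
% First incompleteness theorem, schematically: there exists a sentence G_T
% (the "Gödel sentence" of T) that T can neither prove nor refute.
\exists\, G_T:\quad T \nvdash G_T \qquad \text{and} \qquad T \nvdash \neg G_T
```

Note the hedge: this says such a theory is incomplete, not that "math is broken"; each system can still be extended, just never finished.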
So, to all the pipe dreamers: you should still work your ass off, because AI will not be a super intelligent god that solves our problems any time soon. The walls are all around us, and things will get harder and harder to improve until we reach the physical and material limits of what we can build with; once we get there, we will know what is possible and what is not. This may even explain why we don't see super advanced civilizations in space.
This will make some people happy and others mad, but it is what it is. I gathered all this information for you and would like to hear what you guys think about it.