As much as I’m rooting for AGI to come along and (potentially) help fix a lot of humanity’s problems, I’d like to look more closely at the other side of the story. To be clear, I make no claim to be especially knowledgeable about anything I’m saying here. I genuinely just want to learn more.
The commonly held assumption in this space seems to be that number theory, computability theory, information theory, and the rest of computer science all point in one direction: with enough compute and the right algorithm, we can simulate an intelligence that surpasses general human intelligence. I know this is still up for debate and nothing is set in stone.
From what I know of philosophy (which is itself debated, of course), science rests on empiricism, so if something turns out to be unmeasurable, we won’t be able to control or deal with it scientifically. Materialism and physicalism also come to mind, along with the obvious alternatives to those views, and the hard problem of consciousness. Is there something special about flesh?
So somewhere out there should be interesting discussion and analysis of what’s going on with AI, and therefore counterarguments to the claim that it can ever truly surpass our intelligence. I know there’s a lot of primary philosophy on this, but I’d prefer secondary sources, since the primary texts can get pretty hard to digest. Counterpoints that directly address modern AI are appreciated just as much as older philosophy; I want as much perspective as possible.
Also, good-faith arguments without hate for the other side are preferred. The whole recent art community/tech controversy is just exhausting. I want to be happy learning about the implications, not argumentative or embittered!
Thanks for reading my post. If this isn’t the right place to ask, I’d appreciate being pointed in a better direction.