Does anyone else feel like AGI is a hoax and AI will just end up being a convenient reference tool? I just don't see how people think AI is going to make scientific breakthroughs when all it does is predict the next word based on the vast amount of data it's trained on. It just doesn't seem fundamentally right to tell a bunch of 0s and 1s to think.