This is wrong, right?
https://preview.redd.it/7exegfnfz62c1.jpg?width=1400&format=pjpg&auto=webp&s=7fef3798517833345123ede1bcb3f93b5535785f

Once we create AGI, it will immediately be better than a human at every applicable task, which places it in the category of ASI.

Therefore there are only two states: an AI is either less capable than a human or more capable. There is no stable state where AI is simply equal to an average human's ability to solve tasks (and if it does hover at that level, it won't for very long).

This kills every robot-searching-for-humanity fantasy.

What am I overlooking here?

submitted by /u/Yenii_3025