Anybody who says that there is a 0% chance of AIs being sentient is overconfident. Nobody knows what causes consciousness. We have no way of detecting it, and we can barely agree on a definition. So we should be less than 100% certain about anything to do with consciousness and AI.

To be fair, I think this is true of most philosophical questions.

submitted by /u/katxwoods