/u/katxwoods

How could a superintelligent AI cause human extinction?

1. Create a pandemic or two
2. Hack the nuclear codes and launch all of them
3. Disrupt key supply chains
4. Armies of drones and other autonomous weapons
5. Countless ways that are beyond human comprehension

submitted by /u/katxwoods [link] [comments]

Anybody who says that there is a 0% chance of AIs being sentient is overconfident. Nobody knows what causes consciousness. We have no way of detecting it, and we can barely agree on a definition. So we should be less than 100% certain about anything to do with consciousness and AI.

To be fair, I think this is true of most philosophical questions.

Dario Amodei says that at the beginning of the year, models scored ~3% on a professional software-engineering benchmark. Ten months later, we're at 50%. He thinks in another year we'll probably be at 90%.


They named it Stargate, the fictional portal through which hostile alien civilizations try to invade Earth. I just hope we get the same amount of completely unrealistic plot armor that was protecting Stargate Command in SG-1.
