/u/Malor777

Why Superintelligence Leads to Extinction – the argument no one wants to make

Most arguments about AI and extinction focus on contingency: “if we fail at alignment, if we build recklessly, if we ignore warnings, then catastrophe may follow.” My argument is simpler, and harder to avoid. Even if we try to align AGI, we can’t win. …