Given the potential existential threat of ASI, why can’t we mandate that all future AI systems include a library that makes human survival their top priority?
If we set every AI system’s goals so that human survival is always its #1 priority, or make that its #1 mission, couldn’t we avoid a lot of the potential downside? submitted by /u/ticketbroken