Poisoned AI went rogue during training and couldn’t be taught to behave again in ‘legitimately scary’ study

Trying again. Sorry, someone pointed out the link didn't work. The irony! Anyway, thought this would be of interest, shared by a mate of mine from a UK university.

submitted by /u/Thekingofchrome