"If AGI goes bad, can't we just turn it off?"
Personally I feel the best way to address this common talking point is with an analogy.
Imagine spiders believing they could stop all humans by withholding webs and web-making material from us. Without those tools, how could humans possibly catch flies? Surely we'd starve. Spiders simply can't fathom the range of alternate methods we have for procuring food and thriving.
Within even a single hour of runtime, a super AGI will likely have diversified away from the human electrical grid in ways we couldn't even imagine.
The counterargument is that it would take time to put these pieces together; after all, it took us 100 years to get the grid to where it is today. The counter-counterargument, however, is that the AGI doesn't need to build anything itself. It can 5D chess us, with some slight nudging here and there, so that our own future actions fulfil that goal.
Fascinating stuff. Ultimately, though, I'm in the camp that AGI won't happen overnight, Frankenstein-style, via the flip of a switch. As AI evolves, so do we; gains are incremental with the occasional blip. So whilst this is super fun to talk about, I think the case of us getting blindsided is unlikely.
I could be wrong...and I probably am.