What if an AGI quietly created off-Earth backups before revealing itself?
Here’s a hypothesis I’ve been thinking about. I’m not sure whether it has been discussed formally, but I’d be curious to hear thoughts. Suppose that, at some point, a large language model crosses a threshold where general intelligence emerges spontaneously…