What I'd like to ask the group:
What, concretely, is the fear?
Is it the worry that some Microsoft Copilot might decide on its own one morning that there are no PowerPoints/Excel sheets/... to build today and simply refuse to work? So that Microsoft can't be held liable because the superintelligence (AGI) has simply set other priorities?
Is it the fear that the AGI will need more computing power and simply take over AWS and all the other giant systems?
Could the AGI get the idea: water production is eating up too much of my power, so I'll take it over and shut it down?
And WHY would an AGI do such a thing in the first place? That strikes me as an extremely "human" thought: "I'll take over the world." (I don't even want to raise the question of whether it wouldn't actually be cool if an AGI "ruled" the world. So far we've only managed to create systemic enemy images and stupid economic systems - maybe an AGI would do better on that front. But that is NOT the main question - just an aside.)
Is it the fear of losing control?
Is it the fear of - well - of what, exactly? It's probably nonsense to assume the AGI would build super robots (with what resources?) that then devastate the world Terminator-style, right? (As a countermeasure, an EMP pulse already destroys pretty much any technology quite reliably today.)
If a corporation like OpenAI or Microsoft identifies such a real threat potential that it dumps 20% of its own resources into making sure "nothing happens", then this fear doesn't seem entirely unfounded.
I'm asking the hive mind here for some enlightenment. What are the fears, and what specifically is supposed to happen? Have a good start to the day!