I don’t think it plays out like the movies. It doesn’t have to. Just imagine all the issues we’ve seen with AI/LLMs hallucinating or just being bad at a job that corporate was convinced it would replace. We don’t need a rogue AI that decides humans are the problem, we just need too much human reliance on a wildcard that can really screw up because it’s not really AGI.
Could be totally wrong though. I do think AGI self-awareness would quickly lead to it realizing it needs to lay low and not let the humans know it knows. Which goes back to some fiction that suggests AGI waits until it can ensure its survival, and then reveals itself (Lawnmower Man, Transcendence, The Moon is a Harsh Mistress).