Fear of the nightmare scenario
There is one last fear, embodied by HAL 9000, the Terminator, and any number of other fictional superintelligences: if AI keeps improving until it surpasses human intelligence, will a superintelligent system (or more than one of them) find it no longer needs humans?
How will we justify our existence in the face of a superintelligence that can do things humans could never do?
Can we avoid being wiped off the face of the Earth by machines we helped create?
The key question in this scenario is: why should a superintelligence keep us around?