I think that in the immediate aftermath of a superhuman machine intelligence revealing itself, most people would feel very threatened but take solace in the thought that we can always pull the plug.
John L. Casti

If you ask which of the scenarios I think is most dangerous, though, I will give a different answer. In that form of the question, I regard a nuclear attack, terrorist-generated or otherwise, as the most threatening combination of likelihood and long-term damage to modern life today.
John L. Casti

I don't think the emergence of a superhuman intelligence would be at all catastrophic but much more likely to be beneficial - just as long as we don't start trying to interfere with it!
John L. Casti

The process of globalization has now interconnected almost everything, ranging from financial markets to transport networks to communication systems, in a huge system that no one really understands.
John L. Casti

My own guess is that quite quickly the machine intelligence will start dreaming machine dreams and thinking machine thoughts, both of which would be totally incomprehensible to us. This would then lead to each species, we and the machines, moving off onto its own separate life trajectory.
John L. Casti