My own guess is that quite quickly the machine intelligence will start dreaming machine dreams and thinking machine thoughts, both of which would be totally incomprehensible to us. This would then lead to each species, we and the machines, moving off onto its own separate life trajectory.
- John L. Casti

If you ask which of the scenarios I think is most dangerous, though, I will give a different answer. In that form of the question, I regard a nuclear attack, terrorist-generated or otherwise, as the most threatening combination of likelihood and long-term damage to modern life today.
- John L. Casti

I believe that we can forecast the "changing landscape of context," and thus gain insight into when we are entering the danger zone of an X-event.
- John L. Casti

I don't think the emergence of a superhuman intelligence would be at all catastrophic; it is much more likely to be beneficial - just as long as we don't start trying to interfere with it!
- John L. Casti

The world is awash in more debt than there is money enough in the world to liquidate it.
- John L. Casti