The first ultraintelligent machine is the last invention that man need ever make, provided that the machine is docile enough to tell us how to keep it under control.
Nick Bostrom
The greatest existential risks over the coming decades or century arise from certain anticipated technological breakthroughs that we might make, in particular machine superintelligence, nanotechnology, and synthetic biology. Each of these has enormous potential for improving the human condition by helping cure disease, poverty, etc. But one could also imagine them being misused, used to create powerful weapon systems, or even some kind of accidental destructive scenario, where we suddenly are in possession of some technology that's far more powerful than we are able to control or use wisely.
Nick Bostrom
It’s unlikely that any of those natural hazards will do us in within the next 100 years if we’ve already survived 100,000. By contrast, we are introducing, through human activity, entirely new types of dangers by developing powerful new technologies. We have no record of surviving those.
Nick Bostrom
The cognitive functioning of a human brain depends on a delicate orchestration of many factors, especially during the critical stages of embryo development, and it is much more likely that this self-organizing structure, to be enhanced, needs to be carefully balanced, tuned, and cultivated rather than simply flooded with some extraneous potion.
Nick Bostrom