[...] intelligent people only have a certain amount of time (measured in subjective time spent thinking about religion) to become atheists. After a certain point, if you're smart, have spent time thinking about and defending your religion, and still haven't escaped the grip of Dark Side Epistemology, the inside of your mind ends up as an Escher painting.
Eliezer Yudkowsky

If you want to build a recursively self-improving AI, have it go through a billion sequential self-modifications, become vastly smarter than you, and not die, you've got to work to a pretty precise standard.
Eliezer Yudkowsky

Maybe you just can't protect people from certain specialized types of folly with any sane amount of regulation, and the correct response is to give up on the high social costs of inadequately protecting people from themselves under certain circumstances.
Eliezer Yudkowsky

Between hindsight bias, fake causality, positive bias, anchoring/priming, et cetera et cetera, and above all the dreaded confirmation bias, once an idea gets into your head, it's probably going to stay there.
Eliezer Yudkowsky

If cryonics were a scam it would have far better marketing and be far more popular.
Eliezer Yudkowsky

Part of the rationalist ethos is binding yourself emotionally to an absolutely lawful reductionistic universe - a universe containing no ontologically basic mental things such as souls or magic - and pouring all your hope and all your care into that merely real universe and its possibilities, without disappointment.
Eliezer Yudkowsky