"I believe that it is right and proper for me, as a human being, to have an interest in the future, and what human civilization becomes in the future. One of those interests is the human pursuit of truth, which has strengthened slowly over the generations (for there was not always science). I wish to strengthen that pursuit further, in this generation. That is a wish of mine, for the Future. For we are all of us players upon that vast gameboard, whether we accept the responsibility or not.
"And that makes your rationality my business.
"Is this a dangerous idea? Yes, and not just pleasantly edgy 'dangerous.' People have been burned to death because some priest decided that they didn't think the way they should. Deciding to burn people to death because they 'don't think properly'—that's a revolting kind of reasoning, isn't it? You wouldn't want people to think that way—why, it's disgusting. People who think like that, well, we'll have to do something about them...
"I agree! Here's my proposal: Let's argue against bad ideas but not set their bearers on fire."
Human intelligence is a superweapon: an amazing capacity that has single-handedly put humans in a dominant position on Earth. When human intelligence defeats itself and goes off the rails, the fallout therefore tends to be a uniquely big deal. In "How to Actually Change Your Mind," decision theorist Eliezer Yudkowsky asks how we can better identify and sort out our biases, integrate new evidence, and achieve lucidity in our daily lives. Because it really seems as though we should be able to do better–
–and a three-pound all-purpose superweapon is a terrible thing to waste.