Kindle Notes & Highlights
When human lives are at stake, we have a duty to maximize, not satisfice; and this duty has the same strength as the original duty to save lives. Whoever knowingly chooses to save one life, when they could have saved two—to say nothing of a thousand lives, or a world—they have damned themselves as thoroughly as any murderer.
Hobbes said, “I don’t know what’s worse, the fact that everyone’s got a price, or the fact that their price is so low.” So very low the price, so very eager they are to be bought. They don’t look twice and then a third time for alternatives, before deciding that they have no option left but to transgress—though they may look very grave and solemn when they say it.
Historically speaking, science won because it displayed greater raw strength in the form of technology, not because science sounded more reasonable. To this very day, magic and scripture still sound more reasonable to untrained ears than science. That is why there is continuous social tension between the belief systems. If science not only worked better than magic, but also sounded more intuitively reasonable, it would have won entirely by now.
The first virtue is curiosity.
The second virtue is relinquishment.
The third virtue is lightness.
The fourth virtue is evenness.
The fifth virtue is argument.
The sixth virtue is empiricism.
The seventh virtue is simplicity.
The eighth virtue is humility.
The ninth virtue is perfectionism.
The tenth virtue is precision.
The eleventh virtue is scholarship.
Every step of your reasoning must cut through to the correct answer in the same movement. More than anything, you must think of carrying your map through to reflecting the territory.
In light of evidence that training in statistics—and some other fields, such as psychology—improves reasoning skills outside the classroom, statistical literacy is directly relevant to the project of overcoming bias. (Classes in formal logic and informal fallacies have not proven similarly useful.)12,13
Yudkowsky and other writers on Less Wrong have helped seed the effective altruism movement, a vibrant and audacious effort to identify the most high-impact humanitarian charities and causes.
These writings also sparked the establishment of the Center for Applied Rationality, a nonprofit organization that attempts to translate results from the science of rationality into useable techniques for self-improvement.
Thus, modern rabbis are not allowed to overrule ancient rabbis. Crawly things are ordinarily unkosher, but it is permissible to eat a worm found in an apple—the ancient rabbis believed the worm was spontaneously generated inside the apple, and therefore was part of the apple. A modern rabbi cannot say, “Yeah, well, the ancient rabbis knew diddly-squat about biology. Overruled!” A modern rabbi cannot possibly know a halachic principle the ancient rabbis did not, because how could the ancient rabbis have passed down the answer from Mount Sinai to him? Knowledge derives from authority, and
...more
“Torah loses knowledge in every generation. Science gains knowledge with every generation. No matter where they started out, sooner or later science must surpass Torah.”
Hunter-gatherer tribes are usually highly egalitarian (at least if you’re male)—the all-powerful tribal chieftain is found mostly in agricultural societies, rarely in the ancestral environment. Among most hunter-gatherer tribes, a hunter who brings in a spectacular kill will carefully downplay the accomplishment to avoid envy.
The first elementary technique of epistemology—it’s not deep, but it’s cheap—is to distinguish the quotation from the referent. Talking about snow is not the same as talking about “snow.” When I use the word “snow,” without quotes, I mean to talk about snow; and when I use the word “ ‘snow,’ ” with quotes, I mean to talk about the word “snow.” You have to enter a special mode, the quotation mode, to talk about your beliefs. By default, we just talk about reality.
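The use/mention distinction has a direct analogue in programming: a variable's value versus the variable's name as a string. As a minimal sketch (the variable names and values here are my own illustration, not from the text):

```python
# Quotation vs. referent: the thing a word points to, versus the word itself.
snow = "frozen water crystals"   # using the word: this line is about snow (the referent)
word = "snow"                    # mentioning the word: this line is about "snow" (the quotation)

print(snow)        # the referent's description: frozen water crystals
print(word)        # the word itself: snow
print(len(word))   # 4 -- a property of the word "snow", not of snow
```

Asking for the length of `word` is a quote-level question; asking whether snow is cold is a reality-level question. Mixing the two levels is exactly the error the technique guards against.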
And I realized that the word “impossible” had two usages: (1) mathematical proof of impossibility conditional on specified axioms; (2) “I can’t see any way to do that.”
I’m not ordinarily a fan of the theory that opposing biases can cancel each other out, but sometimes it happens by luck. If I’d seen that whole mountain at the start—if I’d realized at the start that the problem was not to build a seed capable of improving itself, but to produce a provably correct Friendly AI—then I probably would have burst into flames.
Richard Hamming used to go around asking his fellow scientists two questions: “What are the important problems in your field?” and “Why aren’t you working on them?”
To do things that are very difficult or “impossible”: first you have to not run away. That takes seconds. Then you have to work. That takes hours. Then you have to stick at it. That takes years. Of these, I had to learn to do the first reliably instead of sporadically; the second is still a constant struggle for me; and the third comes naturally.
There is a level beyond the virtue of tsuyoku naritai (“I want to become stronger”). Isshoukenmei was originally the loyalty that a samurai offered in return for his position, containing characters for “life” and “land.” The term evolved to mean “make a desperate effort”: Try your hardest, your utmost, as if your life were at stake. It was part of the gestalt of bushido, which was not reserved only for fighting. I’ve run across variant forms issho kenmei and isshou kenmei; one source indicates that the former indicates an all-out effort on some single point, whereas the latter indicates a
...more
Every now and then, someone asks why the people who call themselves “rationalists” don’t always seem to do all that much better in life, and from my own history the answer seems straightforward: It takes a tremendous amount of rationality before you stop making stupid damn mistakes.
As I’ve mentioned a couple of times before: Robert Aumann, the Nobel laureate who first proved that Bayesians with the same priors cannot agree to disagree, is a believing Orthodox Jew. Surely he understands the math of probability theory, but that is not enough to save him. What more does it take? Studying heuristics and biases? Social psychology? Evolutionary psychology? Yes, but also it takes isshoukenmei, a desperate effort to be rational—to rise above the level of Robert Aumann.
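Aumann's actual theorem is stronger and subtler than this: it requires only common knowledge of each other's posteriors, not shared evidence. As a much weaker illustration of the underlying machinery, here is a sketch (with hypothetical numbers) of the simpler fact that two Bayesians who share a prior and condition on the same evidence must arrive at the same posterior:

```python
# Bayes' rule: P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|~H)(1 - P(H))]
def posterior(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    numerator = p_e_given_h * prior
    return numerator / (numerator + p_e_given_not_h * (1 - prior))

# Alice and Bob share the same prior (0.3) and condition on the same evidence.
alice = posterior(0.3, 0.8, 0.2)
bob = posterior(0.3, 0.8, 0.2)
assert alice == bob  # identical priors + identical evidence -> identical posteriors
```

The point of the passage, of course, is that knowing this math is not sufficient; the disagreement between Aumann's beliefs and his theorem is resolved by effort, not by the formula.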
This is hardly an original observation on my part: but entrepreneurship, risk-taking, leaving the herd, are still advantages the West has over the East. And since Japanese scientists are not yet preeminent over American ones, this would seem to count for at least as much as desperate efforts.
Even so, I think that we could do with more appreciation of the virtue “make an extraordinary effort.” I’ve lost count of how many people have said to me something like: “It’s futile to work on Friendly AI, because the first AIs will be built by powerful corporations and they will only care about maximizing profits.” “It’s futile to work on Friendly AI, the first AIs will be built by the military as weapons.” And I’m standing there thinking: Does it even occur to them that this might be a time to try for something other than the default outcome? They and I have different basic assumptions
...more
And the root of this problem, I do suspect, is that we haven’t really gotten together and systematized our skills. We’ve had to create all of this for ourselves, ad-hoc, and there’s a limit to how much one mind can do, even if it can manage to draw upon work done in outside fields.
Daniel Burfoot brilliantly suggests that this is why intelligence seems to be such a big factor in rationality—that when you’re improvising everything ad-hoc with very little training or systematic practice, intelligence ends up being the most important factor in what’s left.
Why don’t we have an Art of Rationality? Third, because current “rationalists” have trouble working in groups: of this I shall speak more. Second, because it is hard to verify success in training, or which of two schools is the stronger. But first, because people lack the sense that rationality is something that should be systematized and trained and tested like a martial art, that should have as much knowledge behind it as nuclear engineering, whose superstars should practice as hard as chess grandmasters, whose successful practitioners should be surrounded by an evident aura of awesome.
And you do not warn them to scrutinize arguments they agree with just as hard as they scrutinize incongruent arguments for flaws. So they have acquired a great repertoire of flaws of which to accuse only arguments and arguers whom they don’t like.
Yes, a group that can’t tolerate disagreement is not rational. But if you tolerate only disagreement—if you tolerate disagreement but not agreement—then you also are not rational. You’re only willing to hear some honest thoughts, but not others. You are a dangerous half-a-rationalist.
The other major component that I think sabotages group efforts in the technophile community is being ashamed of strong feelings. We still have the Spock archetype of rationality stuck in our heads, rationality as dispassion. Or perhaps a related mistake, rationality as cynicism—trying to signal your superior world-weary sophistication by showing that you care less than others. Being careful to ostentatiously, publicly look down on those so naive as to show they care strongly about anything. Wouldn’t it make you feel uncomfortable if the speaker at the podium said that they cared so strongly
...more
If the issue isn’t worth your personally fixing by however much effort it takes, and it doesn’t arise from outright bad faith, it’s not worth refusing to contribute your efforts to a cause you deem worthwhile.
akrasia
We could try for a group norm of being openly allowed—nay, applauded—for caring strongly about something. And a group norm of being expected to do something useful with your life—contribute your part to cleaning up this world. Religion doesn’t really emphasize the getting-things-done aspect as much.