Kindle Notes & Highlights
Read between November 23, 2017 and January 8, 2018
laboratory and take off when you leave. It applies to daily life, though this part is subtler and more difficult. But if you can’t say “Oops” and give up when it looks like something isn’t working, you have no choice but to keep shooting yourself in the foot.
Imagine reaching into an urn that contains seventy white balls and thirty red ones, and plucking out ten mystery balls. Perhaps three of the ten balls will be red, and you’ll correctly guess how many red balls total were in the urn. Or perhaps you’ll happen to grab four red balls, or some other number. Then you’ll probably get the total number wrong. This random error is the cost of incomplete knowledge, and as errors go, it’s not so bad. Your estimates won’t be incorrect on average, and the more you learn, the smaller your error will tend to be. On the other hand, suppose that the white balls
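To see the distinction concretely, here is a quick simulation of that urn; the sample sizes and trial count are my own choices, purely for illustration.

```python
import random
import statistics

# A sketch of the urn example: 70 white balls, 30 red.
# Sample sizes and trial count are illustrative choices, not from the text.
URN = ["red"] * 30 + ["white"] * 70

def estimates(sample_size, trials=10_000):
    """Estimated total reds in the urn, one estimate per sampled handful."""
    results = []
    for _ in range(trials):
        sample = random.sample(URN, sample_size)  # draw without replacement
        results.append(100 * sample.count("red") / sample_size)
    return results

for n in (10, 30, 60):
    est = estimates(n)
    print(f"n={n:2d}: mean={statistics.mean(est):5.1f}, "
          f"spread={statistics.stdev(est):4.1f}")
```

The mean stays near the true 30 at every sample size (your estimates aren't wrong on average), while the spread shrinks as you draw more balls: random error, not bias.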
...more
There’s a completely different notion of “rationality” studied by mathematicians, psychologists, and social scientists. Roughly, it’s the idea of doing the best you can with what you’ve got.
The representativeness heuristic can also contribute to base rate neglect, where we ground our judgments in how intuitively “normal” a combination of attributes is, neglecting how common each attribute is in the population at large.8 Is it more likely that Steve is a shy librarian, or that he’s a shy salesperson? Most people answer this kind of question by thinking about whether “shy” matches their stereotypes of those professions. They fail to take into consideration how much more common salespeople are than librarians—seventy-five times as common, in the United States.9
It’s known that we can reduce base rate neglect by thinking of probabilities as frequencies of objects or events.
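A worked version of the Steve question in frequency form. The 75-to-1 ratio of salespeople to librarians is from the passage above; the shyness rates are invented placeholders, chosen only to show that even a strong stereotype loses to a strong base rate.

```python
# Frequency-format Bayes for the Steve question (a sketch).
# Base rate: 75 salespeople per librarian, per the passage.
# Shyness rates below are hypothetical, for illustration only.
librarians, salespeople = 1_000, 75_000
shy_librarians = int(librarians * 0.50)    # assume half of librarians are shy
shy_salespeople = int(salespeople * 0.15)  # assume 15% of salespeople are shy

p_librarian = shy_librarians / (shy_librarians + shy_salespeople)
print(f"P(librarian | shy) = {p_librarian:.3f}")  # ~0.043
```

Even granting librarians a much higher shyness rate, a shy Steve is still over twenty times more likely to be a salesperson, because salespeople are seventy-five times as common.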
What Do I Mean By “Rationality”? I mean:
Epistemic rationality: systematically improving the accuracy of your beliefs.
Instrumental rationality: systematically achieving your values.
If something goes wrong with your eyes, or your brain, then your mental model might say there’s a bookcase where no bookcase exists, and when you go over to get a book, you’ll be disappointed. This is what it’s like to have a false belief, a map of the world that doesn’t correspond to the territory. Epistemic rationality is about building accurate maps instead. This correspondence between belief and reality is commonly called “truth,” and I’m happy to call it that.
Similarly, “Rational agents make decisions that maximize the probabilistic expectation of a coherent utility function” is the kind of thought that depends on a concept of (instrumental) rationality, whereas “It’s rational to eat vegetables” can probably be replaced with “It’s useful to eat vegetables” or “It’s in your interest to eat vegetables.” We need a concept like “rational” in order to note general facts about those ways of thinking that systematically produce truth or value—and the systematic ways in which we fall short of those standards.
In cases like these, it is futile to try to settle the problem by coming up with some new definition of the word “rational” and saying, “Therefore my preferred answer, by definition, is what is meant by the word ‘rational.’” This simply raises the question of why anyone should pay attention to your definition. I’m not interested in probability theory because it is the holy word handed down from Laplace. I’m interested in Bayesian-style belief-updating (with Occam priors) because I expect that this style of thinking gets us systematically closer to, you know, accuracy, the map that reflects the
...more
You cannot change reality, or prove the thought, by manipulating which meanings go with which words.
P. C. Hodgell said: “That which can be destroyed by the truth should be.”
phlegmatic
Ever since I adopted the rule of “That which can be destroyed by the truth should be,” I’ve also come to realize “That which the truth nourishes should thrive.”
“The first virtue is curiosity.”
But what set humanity firmly on the path of Science was noticing that certain modes of thinking uncovered beliefs that let us manipulate the world.
Are there motives for seeking truth besides curiosity and pragmatism? The third reason that I can think of is morality: You believe that to seek the truth is noble and important and worthwhile. Though such an ideal also attaches an intrinsic value to truth, it’s a very different state of mind from curiosity. Being curious about what’s behind the curtain doesn’t feel the same as believing that you have a moral duty to look there. In the latter state of mind, you are a lot more likely to believe that someone else should look behind the curtain, too, or castigate them if they deliberately close
...more
deontological
Personally, I see our quest in terms of acquiring personal skills of rationality, in improving truth-finding technique. The challenge is to attain the positive goal of truth, not to avoid the negative goal of failure. Failure-space is wide: infinite errors in infinite variety. It is difficult to describe so huge a space: “What is true of one apple may not be true of another apple; thus more can be said about a single apple than about all the apples in the world.” Success-space is narrower, and therefore more can be said about it.
The availability heuristic is judging the frequency or probability of an event by the ease with which examples of the event come to mind.
A society well-protected against minor hazards takes no action against major risks, building on flood plains once the regular minor floods are eliminated.
Which is to say: Adding detail can make a scenario SOUND MORE PLAUSIBLE, even though the event necessarily BECOMES LESS PROBABLE.
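The rule being violated is just the conjunction inequality of probability theory; in standard notation (my addition, not from the text):

$$P(A \wedge B) \le P(A) \qquad \text{and} \qquad P(A \wedge B) \le P(B).$$

Every added detail is another conjunct, and each conjunct can only hold the probability even or push it lower.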
need to notice the word “and.” They would need to be wary of it—not just wary, but leap back from it. Even without knowing that researchers were afterward going to test them on the conjunction fallacy particularly. They would need to notice the conjunction of two entire details, and be shocked by the audacity of anyone asking them to endorse such an insanely complicated prediction.
It might also have helped the forecasters to think about possible reasons why the US and Soviet Union would suspend diplomatic relations. The scenario is not “The US and Soviet Union suddenly suspend diplomatic relations for no reason,” but “The US and Soviet Union suspend diplomatic relations for any reason.”
Similarly, consider the six-sided die with four green faces and two red faces. The subjects had to bet on the sequence (1) RGRRR, (2) GRGRRR, or (3) GRRRRR appearing anywhere in twenty rolls of the dice.3 Sixty-five percent of the subjects chose GRGRRR, which is strictly dominated by RGRRR, since any sequence containing GRGRRR also pays off for RGRRR.
The subjects lost heuristically by thinking: “Aha! Sequence 2 has the highest proportion of green to red! I should bet on Sequence 2!” To win heuristically, the subjects would need to think: “Aha! Sequence 1 is short! I should go with Sequence 1!” They would need to feel a stronger emotional impact from Occam’s Razor—feel every added detail as a burden, even a single extra roll of the dice.
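A quick Monte Carlo check of the dominance claim; the die and the three sequences are from the passage, while the trial count is an arbitrary choice of mine.

```python
import random

# Die with four Green faces and two Red, rolled twenty times per game.
def win_rate(sequence, trials=100_000):
    """Fraction of 20-roll games in which `sequence` appears somewhere."""
    hits = 0
    for _ in range(trials):
        rolls = "".join(random.choice("GGGGRR") for _ in range(20))
        if sequence in rolls:
            hits += 1
    return hits / trials

for seq in ("RGRRR", "GRGRRR", "GRRRRR"):
    print(f"{seq}: {win_rate(seq):.3f}")
```

RGRRR pays off in every game where GRGRRR does (GRGRRR literally contains RGRRR as a substring), plus in games where GRGRRR never shows up; the shorter sequence strictly dominates.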
And I said, “It is more probable that universes replicate for any reason, than that they replicate via black holes because advanced civilizations manufacture black holes because universes evolve to make them do it.” And he said, “Oh.”
You have to disentangle the details. You have to hold up every one independently, and ask, “How do we know this detail?” Someone sketches out a picture of humanity’s descent into nanotechnological warfare, where China refuses to abide by an international control agreement, followed by an arms race . . . Wait a minute—how do you know it will be China? Is that a crystal ball in your pocket or are you just happy to be a futurist? Where are all these details coming from? Where did that specific detail come from?
For it is written: If you can lighten your burden you must do so. There is no straw that lacks the power to break your back.
The outside view is when you deliberately avoid thinking about the special, unique features of this project, and just ask how long it took to finish broadly similar projects in the past.
So there is a fairly reliable way to fix the planning fallacy, if you’re doing something broadly similar to a reference class of previous projects. Just ask how long similar projects have taken in the past, without considering any of the special properties of this project. Better yet, ask an experienced outsider how long similar projects have taken. You’ll get back an answer that sounds hideously long, and clearly reflects no understanding of the special reasons why this particular task will take less time. This answer is true. Deal with it.
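As a sketch of what that looks like in practice; every duration below is a placeholder, not data from anywhere.

```python
import statistics

# Outside view: ignore this project's special features; summarize the
# reference class instead. Durations are hypothetical placeholders.
past_project_weeks = [9, 14, 11, 22, 13, 17, 10, 30]

inside_view_guess = 6  # the optimistic "this time is different" estimate
outside_view = statistics.median(past_project_weeks)

print(f"inside view:  {inside_view_guess} weeks")
print(f"outside view: {outside_view} weeks (median of similar projects)")
```

The outside-view number sounds hideously long, and that is the point: it reflects none of the special reasons this task will go faster, and it is the one that tends to come true.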
Be not too quick to blame those who misunderstand your perfectly clear sentences, spoken or written. Chances are, your words are more ambiguous than you think.
A clear argument has to lay out an inferential pathway, starting from what the audience already knows or accepts. If you don’t recurse far enough, you’re just talking to yourself.
The whole idea of Science is, simply, reflective reasoning about a more reliable process for making the contents of your mind mirror the contents of the world.
Making Beliefs Pay Rent (in Anticipated Experiences)
experience, or poorly connected. Alchemists believed that phlogiston caused fire—we could oversimplify their minds by drawing a little node labeled “Phlogiston,” and an arrow from this node to their sensory experience of a crackling campfire—but this belief yielded no advance predictions; the link from phlogiston to experience was always configured after the experience, rather than constraining the experience in advance.
We can build up whole networks of beliefs that are connected only to each other—call these “floating” beliefs. It is a uniquely human flaw among animal species, a perversion of Homo sapiens’s ability to build more general and flexible belief networks.
The rationalist virtue of empiricism consists of constantly asking which experiences our beliefs predict—or better yet, prohibit. Do you believe that phlogiston is the cause of fire? Then what do you expect to see happen, because of that? Do you believe that Wulky Wilkinsen is a post-utopian? Then what do you expect to see because of that? No, not “colonial alienation”; what experience will happen to you? Do you believe that if a tree falls in the forest, and no one hears it, it still makes a sound? Then what experience must therefore befall you?
It is even better to ask: what experience must not happen to you? Do you believe that élan vital explains the mysterious aliveness of living beings? Then what does this belief not allow to happen—what would definitely falsify this belief? A null answer means that your belief does ...
Above all, don’t ask what to believe—ask what to anticipate. Every question of belief should flow from a question of anticipation, and that question of anticipation should be the center of the inquiry.
If a belief turns deadbeat, evict it.
cerulean
Maybe the dragon-claimant fears the public ridicule that they imagine will result if they publicly confess they were wrong (although, in fact, a rationalist would congratulate them, and others are more likely to ridicule the claimant if they go on claiming there’s a dragon in their garage). Maybe the dragon-claimant flinches away from the prospect of admitting to themselves that there is no dragon, because it conflicts with their self-image as the glorious discoverer of the dragon, who saw in their garage what all others had failed to see.
But it is realistic to say the dragon-claimant anticipates as if there is no dragon in their garage, and makes excuses as if they believed in the belief.
When someone makes up excuses in advance, it would seem to require that belief and belief in belief have become unsynchronized.
I was once at a dinner party, trying to explain to a man what I did for a living, when he said: “I don’t believe Artificial Intelligence is possible because only God can make a soul.” At this point I must have been divinely inspired, because I instantly responded: “You mean if I can make an Artificial Intelligence, it proves your religion is false?” He said, “What?” I said, “Well, if your religion predicts that I can’t possibly make an Artificial Intelligence, then, if I make an Artificial Intelligence, it means your religion is false. Either your religion allows that it might be possible for
...more
“No, we can’t, actually. There’s a theorem of rationality called Aumann’s Agreement Theorem which shows that no two rationalists can agree to disagree. If two people disagree with each other, at least one of them must be doing something wrong.”
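For reference, a compact statement of the theorem; this is my paraphrase of the standard result, not text from the book. If two agents share a common prior $p$ and their posterior probabilities for an event $E$,

$$q_1 = p(E \mid \mathcal{I}_1), \qquad q_2 = p(E \mid \mathcal{I}_2),$$

are common knowledge between them, then $q_1 = q_2$. Honest Bayesians with a common prior cannot knowingly agree to disagree.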
The hottest place in Hell is reserved for those who in time of crisis remain neutral.
—Dante Alighieri, famous hell expert
John F. Kennedy, misquoter

It’s common to put on a show of neutrality or suspended judgment in order to signal that one is mature, wise, impartial, or just has a superior vantage point.
This I call “pretending to be Wise.” Of course there are many ways to try to signal wisdom. But trying to signal wisdom by refusing to make guesses—refusing to sum up evidence—refusing to pass judgment—refusing to take sides—staying above the fray and looking down with a lofty and condescending gaze—which is to say, signaling wisdom by saying and doing nothing—well, that I find particularly pretentious. Paulo Freire said, “Washing one’s hands of the conflict between the powerful and the powerless means to side with the powerful, not to be neutral.”1 A playground is a great place to be a
...more
If you can think of ways to pull the rope sideways, you are justified in expending your limited resources on relatively less common issues where marginal discussion offers relatively higher marginal payoffs. But then the responsibilities that you deprioritize are a matter of your limited resources. Not a matter of floating high above, serene and Wise.