Map and Territory (Rationality: From AI to Zombies Book 1)
3%
Perhaps three of the ten balls will be red, and you’ll correctly guess how many red balls total were in the urn. Or perhaps you’ll happen to grab four red balls, or some other number. Then you’ll probably get the total number wrong. This random error is the cost of incomplete knowledge, and as errors go, it’s not so bad. Your estimates won’t be incorrect on average, and the more you learn, the smaller your error will tend to be. On the other hand, suppose that the white balls are heavier, and sink to the bottom of the urn. Then your sample may be unrepresentative in a consistent direction. …
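To make the distinction concrete, here is a minimal simulation sketch in Python, with invented numbers (a 100-ball urn containing 40 red balls, samples of ten) that are not from the text. Random sampling error averages out over many draws; the “sunken white balls” sampling procedure does not.

```python
# Minimal sketch of random error vs. statistical bias in urn sampling.
# All numbers (urn size, composition, sample size) are made up.
import random

def make_urn(n_red=40, n_white=60):
    return ["red"] * n_red + ["white"] * n_white

def estimate_reds(urn, sample_size=10):
    """Unbiased estimate: draw a uniform random sample."""
    sample = random.sample(urn, sample_size)
    return sample.count("red") / sample_size * len(urn)

def estimate_reds_biased(urn, sample_size=10):
    """Biased estimate: the white balls have 'sunk', so we only ever
    draw from the top half of the urn, where red is overrepresented."""
    top = sorted(urn)[: len(urn) // 2]  # 'red' sorts before 'white'
    sample = [random.choice(top) for _ in range(sample_size)]
    return sample.count("red") / sample_size * len(urn)

urn = make_urn()
trials = 10_000
unbiased = sum(estimate_reds(urn) for _ in range(trials)) / trials
biased = sum(estimate_reds_biased(urn) for _ in range(trials)) / trials
print(f"true reds: 40, unbiased mean estimate: {unbiased:.1f}, "
      f"biased mean estimate: {biased:.1f}")
```

The unbiased mean converges to the true 40; the biased procedure converges to about 80, and no amount of extra sampling fixes it.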
3%
The idea of cognitive bias in psychology works in an analogous way. A cognitive bias is a systematic error in how we think, as opposed to a random error or one that’s merely caused by our ignorance. Whereas statistical bias skews a sample so that it less closely resembles a larger population, cognitive biases skew our thinking so that it less accurately tracks the truth (or less reliably serves our other goals).
4%
Most people answer “librarian.” Which is a mistake: shy salespeople are much more common than shy librarians, because salespeople in general are much more common than librarians—seventy-five times as common, in the United States.1 This is base rate neglect: grounding one’s judgments in how well sets of characteristics feel like they fit together, and neglecting how common each characteristic is in the population at large.2 Another example of a cognitive bias is the sunk cost fallacy—people’s tendency to feel committed to things they’ve spent resources on in the past, when they should be …
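To see how strongly the base rates matter, here is a small Bayes’-rule calculation in Python. The 75:1 ratio is from the text; the shyness rates (50% of librarians, 15% of salespeople) are assumptions invented purely for illustration.

```python
# Base rate neglect, illustrated with Bayes' rule.
# Base rates follow the 75:1 ratio in the text; the conditional
# shyness rates below are made-up numbers for illustration only.
p_librarian = 1 / 76
p_salesperson = 75 / 76
p_shy_given_lib = 0.50    # assumption
p_shy_given_sales = 0.15  # assumption

p_shy = p_shy_given_lib * p_librarian + p_shy_given_sales * p_salesperson
p_lib_given_shy = p_shy_given_lib * p_librarian / p_shy
print(f"P(librarian | shy) = {p_lib_given_shy:.3f}")  # ~0.043
```

Even with librarians over three times as likely to be shy, a given shy person is still about 23 times as likely to be a salesperson.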
5%
The philosopher Alfred Korzybski once wrote: “A map is not the territory it represents, but, if correct, it has a similar structure to the territory, which accounts for its usefulness.” And what can be said of maps here, as Korzybski noted, can also be said of beliefs, and assertions, and words. “The map is not the territory.”
6%
Once upon a time, three groups of subjects were asked how much they would pay to save 2,000 / 20,000 / 200,000 migrating birds from drowning in uncovered oil ponds. The groups respectively answered $80, $78, and $88.1 This is scope insensitivity or scope neglect: the number of birds saved—the scope of the altruistic action—had little effect on willingness to pay.
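One way to make the insensitivity vivid is to compute the implied price per bird from the three reported answers; this small calculation is mine, not the study’s:

```python
# Implied willingness-to-pay per bird from the reported group means.
for birds, wtp in [(2_000, 80), (20_000, 78), (200_000, 88)]:
    print(f"{birds:>7,} birds: ${wtp} total -> ${wtp / birds:.5f} per bird")
```

The implied per-bird value falls from $0.04 to $0.00044, roughly a hundredfold, while stated willingness to pay barely moves.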
6%
No human can visualize 2,000 birds at once, let alone 200,000. The usual finding is that exponential increases in scope create linear increases in willingness-to-pay—perhaps corresponding to the linear time for our eyes to glaze over the zeroes; this small amount of affect is added, not multiplied, with the prototype affect.
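A sketch of that “linear in the logarithm” pattern, fitting WTP ≈ a + b · log10(scope) to the three data points above. The fit is illustrative only (numpy is assumed available), not a claim about the original study’s analysis:

```python
# Toy fit of willingness-to-pay as a linear function of log10(scope).
import numpy as np

scope = np.array([2_000, 20_000, 200_000])
wtp = np.array([80, 78, 88])
b, a = np.polyfit(np.log10(scope), wtp, 1)  # slope first, then intercept
print(f"WTP ~ {a:.1f} + {b:.1f} * log10(scope)")
# On these three points, each tenfold increase in scope adds only ~$4.
```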
6%
The moral: If you want to be an effective altruist, you have to think it through with the part of your brain that processes those unexciting inky zeroes on paper, not just the part that gets real worked up about that poor struggling oil-soaked bird.
8%
The availability heuristic is judging the frequency or probability of an event by the ease with which examples of the event come to mind.
8%
In real life, you’re unlikely to ever meet Bill Gates. But thanks to selective reporting by the media, you may be tempted to compare your life success to his—and suffer hedonic penalties accordingly. The objective frequency of Bill Gates is 0.00000000015, but you hear about him much more often. Conversely, 19% of the planet lives on less than $1/day, and I doubt that one fifth of the blog posts you read are written by them.
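The 0.00000000015 figure is simply one Bill Gates divided by the world’s population, roughly 6.7 billion at the time of writing:

```python
# One Bill Gates out of ~6.7 billion people.
print(f"{1 / 6.7e9:.2g}")  # 1.5e-10, i.e. 0.00000000015
```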
8%
Burton et al. report that when dams and levees are built, they reduce the frequency of floods, and thus apparently create a false sense of security, leading to reduced precautions.2 While building dams decreases the frequency of floods, damage per flood is afterward so much greater that average yearly damage increases.
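A hedged sketch of the expected-value arithmetic, with entirely made-up numbers: the dam cuts flood frequency tenfold, but damage per flood rises twentyfold (say, because people now build on the floodplain), so expected yearly damage doubles.

```python
# Hypothetical illustration: fewer floods, higher expected yearly damage.
# All figures are invented for the example.
before = {"floods_per_year": 0.50, "damage_per_flood": 1_000_000}
after = {"floods_per_year": 0.05, "damage_per_flood": 20_000_000}

for label, d in [("before dam", before), ("after dam", after)]:
    expected = d["floods_per_year"] * d["damage_per_flood"]
    print(f"{label}: expected yearly damage ${expected:,.0f}")
# before dam: $500,000/year; after dam: $1,000,000/year
```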
9%
A society subject to regular minor hazards treats those minor hazards as an upper bound on the size of the risks, guarding against regular minor floods but not occasional major floods.
9%
“Cognitive biases” are those obstacles to truth which are produced, not by the cost of information, nor by limited computing power, but by the shape of our own mental machinery.
9%
…according to Surely You’re Joking, Mr. Feynman, which I read as a kid.) A bias is an obstacle to our goal of obtaining truth, and thus in our way.
10%
The conjunction fallacy is when humans assign a higher probability to a proposition of the form “A and B” than to one of the propositions “A” or “B” in isolation, even though it is a theorem that conjunctions are never likelier than their conjuncts.
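The theorem is a one-line consequence of the product rule; since P(B | A) can never exceed 1:

```latex
P(A \wedge B) = P(A)\,P(B \mid A) \le P(A) \cdot 1 = P(A),
\qquad \text{and symmetrically } P(A \wedge B) \le P(B).
```

So adding a detail B to a scenario A can only hold its probability fixed or lower it, never raise it.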
10%
Which is to say: Adding detail can make a scenario sound more plausible, even though the event necessarily becomes less probable.
12%
I mean two things:
1. Epistemic rationality: systematically improving the accuracy of your beliefs.
2. Instrumental rationality: systematically achieving your values.
15%
The outside view is when you deliberately avoid thinking about the special, unique features of this project, and just ask how long it took to finish broadly similar projects in the past.
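As a procedure, the outside view can be as simple as taking a summary statistic over a reference class of past projects. A minimal sketch, with a hypothetical reference class:

```python
# Outside view as a procedure: forecast from the durations of broadly
# similar past projects, ignoring this project's "special" features.
from statistics import median

past_durations_weeks = [9, 14, 11, 22, 13, 17]  # hypothetical reference class
inside_view_guess = 6  # the optimistic, detail-driven estimate
outside_view_guess = median(past_durations_weeks)

print(f"inside view: {inside_view_guess} weeks, "
      f"outside view: {outside_view_guess} weeks")  # 6 vs. 13.5
```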
20%
Thus begins the ancient parable: If a tree falls in a forest and no one hears it, does it make a sound? One says, “Yes it does, for it makes vibrations in the air.” Another says, “No it does not, for there is no auditory processing in any brain.”

If there’s a foundational skill in the martial art of rationality, a mental stance on which all other technique rests, it might be this one: the ability to spot, inside your own head, psychological signs that you have a mental map of something, and signs that you don’t.

Suppose that, after a tree falls, the …
21%
We can build up whole networks of beliefs that are connected only to each other—call these “floating” beliefs. It is a uniquely human flaw among animal species, a perversion of Homo sapiens’s ability to build more general and flexible belief networks.
28%
The hottest place in Hell is reserved for those who in time of crisis remain neutral.
29%
But part of it also has to do with signaling a superior vantage point. After all—what would the other adults think of a principal who actually seemed to be taking sides in a fight between mere children? Why, it would lower the principal’s status to a mere participant in the fray!
30%
In sum, there’s a difference between:
1. Passing neutral judgment;
2. Declining to invest marginal resources;
3. Pretending that either of the above is a mark of deep wisdom, maturity, and a superior vantage point; with the corresponding implication that the original sides occupy lower vantage points that are not importantly different from up there.
31%
Something like the Human Genome Project—that was an internationally sponsored research project. I asked: How would different interest groups resolve their conflicts in a structure like the Human Genome Project?
31%
We believe we are already within a democratic system. Some factors are still missing, like the expression of the people’s will.
31%
The substance of a democracy is the specific mechanism that resolves policy conflicts. If all groups had the same preferred policies, there would be no need for democracy—we would automatically cooperate. The resolution process can be a direct majority vote, or an elected legislature, or even a voter-sensitive behavior of an artificial intelligence, but it has to be something. What does it mean to call for a “democratic” solution if you don’t have a conflict-resolution mechanism in mind?
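In that spirit, a conflict-resolution mechanism is just a concrete procedure that maps preferences to a decision. A minimal sketch of the simplest option named above, a direct majority (plurality) vote; the policy names are hypothetical:

```python
# Direct majority (plurality) vote: one concrete conflict-resolution mechanism.
from collections import Counter

def majority_vote(preferences: list[str]) -> str:
    """Return the policy preferred by the most voters."""
    policy, _count = Counter(preferences).most_common(1)[0]
    return policy

votes = ["policy_a", "policy_b", "policy_a", "policy_c", "policy_a"]
print(majority_vote(votes))  # policy_a
```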