Kindle Notes & Highlights
all I will be showing you in this book is how to avoid crossing the street blindfolded.
They Want to Live in Mediocristan
the central difficulty of generalizing from available information, or of learning from the past, the known, and the seen.
You can see that it is extremely convenient for us to assume that we live in Mediocristan.
We focus on preselected segments of the seen and generalize from it to the unseen: the error of confirmation.
We fool ourselves with stories that cater to our Platonic thirst for distinct patterns: the narrative fallacy.
What we see is not necessarily all that is there. History hides Black Swans from us and gives us a mistaken idea about the odds of these events: this is the distortion of silent evidence.
We “tunnel”: that is, we focus on a few well-defined sources of uncertainty, on too specific a list of Black Swans (at the expense of the others that do not easily come to mind).
Clearly you cannot manufacture more information than the past can deliver;
event. People have an incentive to bet against it, or to game the system since they can be paid a bonus reflecting their yearly performance when in fact all they are doing is producing illusory profits that they will lose back one day.
namely shareholders, can be taken for a ride by the managers who show returns and cosmetic profitability but in fact might be taking hidden risks.
CONFIRMATION SHMONFIRMATION!
As much as it is ingrained in our habits and conventional wisdom, confirmation can be a dangerous error.
Look, the other day I had breakfast with him and he didn’t kill anybody. I am serious, I did not see him kill a single person. Wouldn’t that confirm his innocence?
You would have the same reaction if I told you that I took a nap the other day on the railroad track in New Rochelle, New York, and was not killed. Hey, look at me, I am alive, I would say, and that is evidence that lying on train tracks is risk-free.
someone who observed the turkey’s first thousand days (but not the shock of the thousand and first) would tell you, and rightly so, that there is no evidence of the possibility of large events, i.e., Black Swans.
Unless we concentrate very hard, we are likely to unwittingly simplify the problem because our minds routinely do so without our knowing it.
Many people confuse the statement “almost all terrorists are Moslems” with “almost all Moslems are terrorists.”
even if most criminals come from their ethnic subgroup, most of their ethnic subgroup are not criminals, but they still suffer from discrimination by people who should know better.
This problem is chronic: if you tell people that the key to success is not always skills, they think that you are telling them that it is never skills, always luck.
Consider that in a primitive environment there is no consequential difference between the statements "most killers are wild animals" and "most wild animals are killers." There is an error here, but it is almost inconsequential.
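The error these highlights circle around is the confusion of a conditional probability with its inverse: P(group | offender) can be high while P(offender | group) is negligible. A minimal sketch with invented round numbers (not figures from the book) makes the asymmetry concrete:

    # Invented, illustrative numbers -- not data from the book.
    group_size = 100_000          # people in the profiled subgroup
    offenders_total = 50          # offenders in the whole population
    offenders_in_group = 40       # offenders who belong to the subgroup

    # "Most offenders come from the subgroup": P(subgroup | offender)
    p_group_given_offender = offenders_in_group / offenders_total   # 0.8

    # "Most of the subgroup are offenders": P(offender | subgroup)
    p_offender_given_group = offenders_in_group / group_size        # 0.0004

    print(p_group_given_offender, p_offender_given_group)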
By domain-specific I mean that our reactions, our mode of thinking, our intuitions, depend on the context in which the matter is presented,
The classroom is a domain; real life is another. We react to a piece of information not on its logical merit, but on the basis of which framework surrounds it, and how it registers with our social-emotional system.
For another illustration of the way we can be ludicrously domain-specific in daily life, go to the luxury Reebok Sports Club in New York City, and look at the number of people who, after riding the escalator for a couple of floors, head directly to the StairMasters.
This domain specificity of our inferences and reactions works both ways: some problems we can understand in their applications but not in textbooks; others we are better at capturing in the textbook than in the practical application.
We tend to use different mental machinery—so-called modules—in different situations: our brain lacks a central all-purpose computer that starts with logical rules and applies them equally to all possible situations.
An acronym used in the medical literature is NED, which stands for No Evidence of Disease. There is no such thing as END, Evidence of No Disease.
understanding—a simple confusion of absence of evidence of the benefits of mothers’ milk with evidence of absence of the benefits (another case of Platonicity as “it did not make sense” to breast-feed when we could simply use bottles).
Likewise with tonsils: the removal of tonsils may lead to a higher incidence of throat cancer, but for decades doctors never suspected that this “useless” tissue might actually have a use that escaped their detection.
we have a natural tendency to look for instances that confirm our story and our vision of the world—these instances are always easy to find.
You take past instances that corroborate your theories and you treat them as evidence.
Even in testing a hypothesis, we tend to look for instances where the hypothesis proved true.
If I see someone kill, then I can be practically certain that he is a criminal. If I don’t see him kill, I cannot be certain that he is innocent. The same applies to cancer detection: the finding of a single malignant tumor proves that you have cancer, but the absence of such a finding cannot allow you to say with certainty that you are cancer-free.
We can get closer to the truth by negative instances, not by verification! It is misleading to build a general rule from observed facts.
Contrary to conventional wisdom, our body of knowledge does not increase from a series of confirmatory observations.
The subtlety of real life over the books is that, in your decision making, you need be interested only in one side of the story: if you seek certainty about whether the patient has cancer, not certainty about whether he is healthy, then you might be satisfied with negative inference, since it will supply you the certainty you seek.
“We” are the empirical decision makers who hold that uncertainty is our discipline, and that understanding how to act under conditions of incomplete information is the highest and most urgent human pursuit.
theory around this asymmetry, based on a technique called “falsification” (to falsify is to prove wrong)
Popper’s far more powerful and original idea is the “open” society, one that relies on skepticism as a modus operandi, refusing and resisting definitive truths.
But Popper’s biggest idea was his insight concerning the fundamental, severe, and incurable unpredictability of the world, and that I will leave for the chapter on prediction.*
But it remains the case that you know what is wrong with a lot more confidence than you know what is right. All pieces of information are not equal in importance.
Popper introduced the mechanism of conjectures and refutations, which works as follows: you formulate a (bold) conjecture and you start looking for the observation that would prove you wrong.
You can test a given rule either directly, by looking at instances where it works, or indirectly, by focusing on where it does not work. As we saw earlier, disconfirming instances are far more powerful in establishing truth.
Wason noticed that the subjects had a rule in mind, but gave him examples aimed at confirming it instead of trying to supply series that were inconsistent with their hypothesis.
Subjects were asked which questions to ask to find out whether a person was extroverted or not, purportedly for another type of experiment. It was established that subjects supplied mostly questions for which a “yes” answer would support the hypothesis.
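A small sketch of the asymmetry Wason observed, assuming the classic 2-4-6 setup as the task (the hidden rule is simply "any increasing sequence," broader than the subject's guess): confirming tests all pass and so can never expose the guess as too narrow; only a sequence the hypothesis forbids can do that.

    # Hypothetical reconstruction of a Wason-style rule-discovery task.
    def true_rule(a, b, c):
        # Assumed hidden rule: any strictly increasing triple.
        return a < b < c

    def subjects_hypothesis(a, b, c):
        # Subject's guess: "numbers increasing by two".
        return b - a == 2 and c - b == 2

    confirming_tests = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]
    disconfirming_tests = [(1, 2, 4), (3, 9, 100)]

    # Every confirming test passes, so the subject keeps "verifying" her guess...
    print(all(true_rule(*t) for t in confirming_tests))        # True

    # ...while triples her hypothesis forbids also pass, revealing it is too narrow.
    print([true_rule(*t) for t in disconfirming_tests])        # [True, True]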
Sadly, the notion of corroboration is rooted in our intellectual habits and discourse.
Saw Another Red Mini!
If you believe that witnessing an additional white swan will bring confirmation that there are no black swans, then you should also accept the statement, on purely logical grounds, that the sighting of a red Mini Cooper should confirm that there are no black swans.
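The logic behind the red Mini, spelled out: "all swans are white" is equivalent to its contrapositive, "whatever is not white is not a swan," so any non-white non-swan counts as a confirming instance of the second form, and therefore of the first. A toy check over an invented universe of objects:

    # Invented toy universe of (thing, color) pairs, purely for illustration.
    universe = [("swan", "white"), ("swan", "white"),
                ("mini_cooper", "red"), ("raven", "black")]

    # "All swans are white."
    claim = all(color == "white" for thing, color in universe if thing == "swan")

    # Contrapositive: "all non-white things are non-swans."
    contrapositive = all(thing != "swan" for thing, color in universe if color != "white")

    # The two statements are logically equivalent, so they agree on any universe;
    # the red Mini Cooper "confirms" both, which is exactly the absurdity flagged here.
    print(claim, contrapositive)   # True True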

