The Black Swan: The Impact of the Highly Improbable (Incerto, #2)
First, it is an outlier, as it lies outside the realm of regular expectations, because nothing in the past can convincingly point to its possibility. Second, it carries an extreme impact (unlike the bird). Third, in spite of its outlier status, human nature makes us concoct explanations for its occurrence after the fact, making it explainable and predictable.
life is the cumulative effect of a handful of significant shocks.
What you know cannot really hurt you.
The inability to predict outliers implies the inability to predict the course of history,
the reason free markets work is because they allow people to be lucky, thanks to aggressive trial and error, not by giving rewards or “incentives” for skill. The strategy is, then, to tinker as much as possible and try to collect as many Black Swan opportunities as you can.
Another related human impediment comes from excessive focus on what we do know: we tend to learn the precise, not the general.
The problem lies in the structure of our minds: we don’t learn rules, just facts, and only facts.
Ideas come and go, stories stay.
Let us call an antischolar—someone who focuses on the unread books, and makes an attempt not to treat his knowledge as a treasure, or even a possession, or even a self-esteem enhancement device—a skeptical empiricist.
the three facets of the same Black Swan problem: a) The error of confirmation, or how we are likely to undeservedly scorn the virgin part of the library (the tendency to look at what confirms our knowledge, not our ignorance), in Chapter 5; b) the narrative fallacy, or how we fool ourselves with stories and anecdotes (Chapter 6); c) how emotions get in the way of our inference (Chapter 7); and d) the problem of silent evidence, or the tricks history uses to hide Black Swans from us
Anatomy of a Black Swan—The triplet of opacity—Reading books backward—The rearview mirror—Everything becomes explainable—Always talk to the driver (with caution)—History doesn’t crawl; it jumps—“It was so unexpected”—Sleeping for twelve hours
The human mind suffers from three ailments as it comes into contact with history, what I call the triplet of opacity. They are: the illusion of understanding, or how everyone thinks he knows what is going on in a world that is more complicated (or random) than they realize; the retrospective distortion, or how we can assess matters only after the fact, as if they were in a rearview mirror (history seems clearer and more organized in history books than in empirical reality); and the overvaluation of factual information and the handicap of authoritative and learned people, particularly when they …
Indeed, scholarship without erudition can lead to disasters.
We focus on preselected segments of the seen and generalize from it to the unseen: the error of confirmation.
We fool ourselves with stories that cater to our Platonic thirst for distinct patterns: the narrative fallacy.
We behave as if the Black Swan does not exist: human nature is not progr...
This highlight has been truncated due to consecutive passage length restrictions.
What we see is not necessarily all that is there. History hides Black Swans from us and gives us a mistaken idea about the odds of these events: t...
We “tunnel”: that is, we focus on a few well-defined sources of uncertainty, on too specific a list of Black Swans (at the expense of the o...
another person can get very high scores on the SATs and still feel a chill of fear when someone from the wrong side of town steps into the elevator. This inability to automatically transfer knowledge and sophistication from one situation to another, or from theory to practice, is a quite disturbing attribute of human nature.
Knowledge, even when it is exact, does not often lead to appropriate actions because we tend to forget what we know, or forget how to process it properly if we do not pay attention, even when we are experts.
I am saying that a series of corroborative facts is not necessarily evidence. Seeing white swans does not confirm the nonexistence of black swans.
we can learn a lot from data—but not as much as we expect. Sometimes a lot of data can be meaningless; at other times one single piece of information can be very meaningful. It is true that a thousand days cannot prove you right, but one day can prove you to be wrong.
We like stories, we like to summarize, and we like to simplify, i.e., to reduce the dimension of matters. The first of the problems of human nature
The narrative fallacy addresses our limited ability to look at sequences of facts without weaving an explanation into them, or, equivalently, forcing a logical link, an arrow of relationship, upon them. Explanations bind facts together. They make them all the more easily remembered; they help them make more sense. Where this propensity can go wrong is when it increases our impression of understanding.
As Stalin, who knew something about the business of mortality, supposedly said, “One death is a tragedy; a million is a statistic.” Statistics stay silent in us.
Most of our mistakes in reasoning come from using System 1 when we are in fact thinking that we are using System 2. How? Since we react without thinking and introspection, the main property of System 1 is our lack of awareness of using it!
the cortical part, which we are supposed to use for thinking, and which distinguishes us from other animals, and the fast-reacting limbic brain, which is the center of emotions, and which we share with other mammals.
On a day-to-day basis, we are not introspective enough to realize that we understand what is going on a little less than warranted from a dispassionate observation of our experiences.
Believe me, it is tough to deal with the social consequences of the appearance of continuous failure. We are social animals; hell is other people.
Our emotional apparatus is designed for linear causality. For instance, if you study every day, you expect to learn something in proportion to your studies. If you feel that you are not going anywhere, your emotions will cause you to become demoralized. But modern reality rarely gives us the privilege of a satisfying, linear, positive progression: you may think about a problem for a year and learn nothing; then, unless you are disheartened by the emptiness of the results and give up, something will come to you in a flash.
Humans will believe anything you say provided you do not exhibit the smallest shadow of diffidence; like animals, they can detect the smallest crack in your confidence before you express it.
The trick is to be as smooth as possible in personal manners. It is much easier to signal self-confidence if you are exceedingly polite and friendly; you can control people without having to offend their sensitivity.
The problem with business people, Nero realized, is that if you act like a loser they will treat you as a loser—you set the yardstick yourself. There is no absolute measure of good or bad. It is not wh...
This bias causes the survivor to be an unqualified witness of the process. Unsettling? The fact that you survived is a condition that may weaken your interpretation of the properties of the survival, including the shallow notion of “cause.”
In real life you do not know the odds; you need to discover them, and the sources of uncertainty are not defined.
Those who spend too much time with their noses glued to maps will tend to mistake the map for the territory.
Probability is a liberal art; it is a child of skepticism, not a tool for people with calculators on their belts to satisfy their desire to produce fancy calculations and certainties.
Train your reasoning abilities to control your decisions; nudge System 1 (the heuristic or experiential system) out of the important ones. Train yourself to spot the difference between the sensational and the empirical.
“focus” makes you a sucker; it translates into prediction problems, as we will see in the next section. Prediction, not narration, is the real test of our understanding of the world.
Epistemic arrogance bears a double effect: we overestimate what we know, and underestimate uncertainty, by compressing the range of possible uncertain states (i.e., by reducing the space of the unknown).
The appearance of busyness reinforces the perception of causality, of the link between results and one’s role in them. This of course applies even more to the CEOs of large companies who need to trumpet a link between their “presence” and “leadership” and the results of the company.
The more information you give someone, the more hypotheses they will formulate along the way, and the worse off they will be. They see more random noise and mistake it for information.
Listening to the news on the radio every hour is far worse for you than reading a weekly magazine, because the longer interval allows information to be filtered a bit.
psychologist Paul Slovic asked bookmakers to select from eighty-eight variables in past horse races those that they found useful in computing the odds. These variables included all manner of statistical information about past performances. The bookmakers were given the ten most useful variables, then asked to predict the outcome of races. Then they were given ten more and asked to predict again. The increase in the information set did not lead to an increase in their accuracy; their confidence in their choices, on the other hand, went up markedly. Information proved to be toxic.
I’ve struggled much of my life with the common middlebrow belief that “more is better”—more is sometimes, but not always, better. This toxicity of knowledge will show in our investigation of the so-called expert.
it is a good idea to question the error rate of an expert’s procedure. Do not question his procedure, only his confidence.
Simply, things that move, and therefore require knowledge, do not usually have experts, while things that don’t move seem to have some experts. In other words, professions that deal with the future and base their studies on the nonrepeatable past have an expert problem (with the exception of the weather and businesses involving short-term physical processes, not socioeconomic ones).
You cannot ignore self-delusion. The problem with experts is that they do not know what they do not know. Lack of knowledge and delusion about the quality of your knowledge come together—the same process that makes you know less also makes you satisfied with your knowledge.
experts did not realize that they were not so good at their own business, in other words, how they spun their stories. There seemed to be a logic to such incompetence, mostly in the form of belief defense, or the protection of self-esteem. He therefore dug further into the mechanisms by which his subjects generated ex post explanations.
These “experts” were lopsided: on the occasions when they were right, they attributed it to their own depth of understanding and expertise; when wrong, it was either the situation that was to blame, since it was unusual, or, worse, they did not recognize that they were wrong and spun stories around it.