Kindle Notes & Highlights
Read between January 28 - April 29, 2018
Swan-blind species on the planet.
The idea is simply to let human mistakes and miscalculations remain confined,
What’s more, they need the Extremistan type of variability, certain extreme stressors. Otherwise they become fragile. That, I completely missed.
Marathon running is a modern abomination (particularly when done without emotional stimuli).
we hunted in response to hunger; we did not eat breakfast to hunt, hunting accentuated our energy deficits.
If you deprive an organism of stressors, you affect its epigenetics and gene expression—some genes are up-regulated (or down-regulated) by contact with the environment.
A person who does not face stressors will not survive should...
By the same argument we can lower 90 percent of Black Swan risks in economic life … by just eliminating speculative debt.
The Black Swan is about human error in some domains, swelled by a long tradition of scientism and a plethora of information that fuels confidence without increasing knowledge.
Not understanding that doing nothing can be much more preferable to doing something potentially harmful.
satisfy a genuine curiosity,
Black Swan for the turkey is not a Black Swan for the butcher.
Some people, otherwise intelligent, have a deficiency of that human ability to impute to others knowledge that is different from their own.
Indeed, the Latin poet Lucretius, who did not attend business school, wrote that we consider the biggest object of any kind that we have seen in our lives as the largest possible item:
The notion that two people can have two different views of the world, then express them as different probabilities remained foreign to the research.
different people can, while being rational, assign different probabilities to different future states of the world. This is called “subjective probability.”
Dutch book constraint: that is, you cannot express your probabilities inconsistently by engaging in a series of bets that lock in a certain loss, for example, by acting as if the probabilities of separable contingencies can add up to more than 100 percent.
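A reader's sketch of the arithmetic behind this highlight (the scenario and numbers are mine, not the book's): if your probabilities for mutually exclusive, exhaustive outcomes add up to more than 100 percent, an opponent can sell you a $1-payout ticket on every outcome at your own "fair" prices and you lose with certainty.

```python
# Toy Dutch book: quoted probabilities for three mutually exclusive,
# exhaustive outcomes that sum to 120%.
quoted = [0.5, 0.4, 0.3]

# At those prices the inconsistent bettor buys a $1-payout ticket on each outcome.
cost = sum(quoted)          # 1.20 paid up front
payout = 1.0                # exactly one outcome occurs, so exactly one ticket pays $1

print(f"certain loss: {cost - payout:.2f}")   # 0.20 lost no matter which outcome occurs
```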
There is another difference here, between “true” randomness (say the equivalent of God throwing a die) and randomness that results from what I call epistem...
Clearly, you cannot doubt everything and function; you cannot believe everything and survive.
Think of living in a three-dimensional space while under the illusion of being in two dimensions. It may work well if you are a worm, certainly not if you happen to be a bird.
With respect to Black Swans, you act to protect yourself from negative ones (or expose yourself to positive ones) even though you may have no evidence that they can take place, just as we check people for weapons before they board a plane even though we have no evidence that they are terrorists. This focus on off-the-shelf commoditized notions such as “evidence,” is a problem with people who claim to use “rigor” yet go bust on occasion.
The more remote the event, the less we can get empirical data (assuming generously that the future will resemble the past) and the more we need to rely on theory.
We thus need a prior model representation for that; the rarer the event, the higher the error in estimation from standard inductive methods (say, frequency sampling from counting past occurrences), hence the higher the dependence on an a priori representation that extrapolates into the space of low-probability events (which necessarily are not seen often).
“Undecidability: On the inconsistency of estimating probabilities from a sample without binding a priori assumptions on the class of acceptable probabilities.”
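A quick numerical illustration of why rarer events force more reliance on an a priori model (my own toy calculation, assuming a simple binomial setup; the sample size is hypothetical): at a fixed sample size, the relative error of a frequency-based estimate explodes as the probability shrinks.

```python
import math

n = 10_000                                     # hypothetical fixed sample size
for p in (0.1, 0.01, 0.001, 0.0001):
    std_err = math.sqrt(p * (1 - p) / n)       # standard error of the observed frequency
    print(f"p = {p:<7}  relative error ~ {std_err / p:.0%}")
# roughly 3%, 10%, 32%, 100%: the rarer the event, the less counting past occurrences can tell you
```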
we in real life do not care about simple, raw probability (whether an event happens or does not happen); we worry about consequences (the size of the event; how much total destruction of lives or wealth, or other losses, will come from it; how much benefit a beneficial event will bring). Given
Likewise, it is easy to construct a theory in your mind, or pick it up from Harvard, then go project it on the world. Then things are very simple.
Clearly, catastrophic events will be necessarily absent from the data, since the survivorship of the variable itself will depend on such effect.
In Extremistan, things work differently. The conditional expectation of an increase in a random variable does not converge to the threshold as the variable gets larger.
This tells us that there is “no” typical failure and “no” typical success. You may be able to predict the occurrence of a war, but you will not be able to gauge its effect! Conditional on a war killing more than 5 million people, it should kill around 10 million (or more).
You may correctly predict that a skilled person will get “rich,” but, conditional on his making it, his wealth can reach $1 million, $10 million, $1 billion, $10 billion—there is no typical number.
“A war” is meaningless: you need to estimate its damage—and no damage is typical.
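A small simulation of the “no typical size” point (my own sketch, not Taleb's code, using a Pareto tail with exponent 2 as a stand-in for Extremistan and a Gaussian for Mediocristan): conditional on exceeding a threshold K, the Gaussian outcome hugs K, while the Pareto outcome stays about twice K no matter how large K gets; that factor of two matches the “more than 5 million, expect around 10 million” ratio in the highlight above.

```python
import random
random.seed(0)

def pareto(alpha, xmin=1.0):
    u = 1.0 - random.random()                  # uniform in (0, 1]
    return xmin / u ** (1.0 / alpha)           # inverse-CDF sample with tail exponent alpha

N = 500_000
mediocristan = [random.gauss(0, 1) for _ in range(N)]
extremistan = [pareto(2.0) for _ in range(N)]

for k in (1.0, 2.0, 3.0):
    g = [x for x in mediocristan if x > k]
    f = [x for x in extremistan if x > k]
    print(f"K={k}: Gaussian E[X|X>K]/K ~ {sum(g) / len(g) / k:.2f}, "
          f"Pareto E[X|X>K]/K ~ {sum(f) / len(f) / k:.2f}")
# The Gaussian ratio drifts toward 1 as K grows; the Pareto ratio stays near 2 (= alpha/(alpha-1)).
```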
Our research shows that the way a risk is framed sharply influences people’s understanding of it.
It is much more sound to take risks you can measure than to measure the risks you are taking.
Very true or very false does not bring you additional benefits or damage.
Binary exposures do not depend on high-impact events
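A toy contrast between the two exposure types (my construction, not the book's): a binary bet pays the same fixed amount whether the outcome barely clears the strike or blows far past it, while an open-ended exposure's payoff grows with the size of the move, so only the latter is driven by high-impact tail events.

```python
def binary_payoff(x, strike=1.0):
    return 1.0 if x > strike else 0.0          # capped: "very true" adds nothing extra

def open_payoff(x, strike=1.0):
    return max(x - strike, 0.0)                # unbounded: the tail does the work

for outcome in (1.1, 2.0, 10.0, 100.0):        # hypothetical outcomes past the strike
    print(outcome, binary_payoff(outcome), open_payoff(outcome))
```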
The recommendation is to move from the Fourth Quadrant into the third one. It is not possible to change the distribution; it is possible to change the exposure, as will be discussed in the next section.
In other words, the Fourth Quadrant is where the difference between absence of evidence and evidence of absence becomes acute.
Yet people do not realize that success consists mainly in avoiding losses, not in trying to derive profits.
spent twelve years trying to explain that in many instances it was better—and wiser—to have no models than to have the mathematical acrobatics we had.
The very term iatrogenics, i.e., the study of the harm caused by the healer, is not widespread—I
You cannot do anything with knowledge unless you know where it stops, and the costs of using it. Post-Enlightenment
Have respect for time and nondemonstrative knowledge.
Avoid optimization; learn to love redundancy.
Avoid prediction of small-probability payoffs—though not necessarily of ordinary ones.

