Kindle Notes & Highlights
Read between December 12, 2017 and February 1, 2018
The payoff of a human venture is, in general, inversely proportional to what it is expected to be.
The highly expected not happening is also a Black Swan. Note that, by symmetry, the occurrence of a highly improbable event is the equivalent of the nonoccurrence of a highly probable one.
It is one thing to be cosmetically defiant of authority by wearing unconventional clothes—what social scientists and economists call “cheap signaling”—and another to prove willingness to translate belief into action.
You can afford to be compassionate, lax, and courteous if, once in a while, when it is least expected of you, but completely justified, you sue someone, or savage an enemy, just to show that you can walk the walk.
what I call the triplet of opacity. They are: the illusion of understanding, or how everyone thinks he knows what is going on in a world that is more complicated (or random) than they realize; the retrospective distortion, or how we can assess matters only after the fact, as if they were in a rearview mirror (history seems clearer and more organized in history books than in empirical reality); and the overvaluation of factual information and the handicap of authoritative and learned people.
Today’s alliance between Christian fundamentalists and the Israeli lobby would certainly seem puzzling to a nineteenth-century intellectual—Christians used to be anti-Semites and Moslems were the protectors of the Jews, whom they preferred to Christians. Libertarians used to be left-wing. What is interesting to me as a probabilist is that some random event makes one group that initially supports an issue ally itself with another group that supports another issue, thus causing the two items to fuse and unify … until the surprise of the separation.
Almost all social matters are from Extremistan. Another way to say it is that social quantities are informational, not physical: you cannot touch them. Money in a bank account is something important, but certainly not physical. As such it can take any value without necessitating the expenditure of energy. It is just a number!
You can still experience severe Black Swans in Mediocristan, though not easily. How? You may forget that something is random, think that it is deterministic, then have a surprise. Or you can tunnel and miss a source of uncertainty, whether mild or wild, owing to lack of imagination—most Black Swans result from this “tunneling” disease, which I will discuss in Chapter 9.*
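A quick simulation (my own sketch, not the book's; the distributions and parameters are assumptions chosen for illustration) makes the Mediocristan/Extremistan contrast concrete: in a thin-tailed sample no single observation moves the total, while in a fat-tailed sample one observation can dominate it.

```python
import random

random.seed(42)

def largest_share(draws):
    """Fraction of the total contributed by the single largest observation."""
    return max(draws) / sum(draws)

n = 10_000
# Mediocristan: a height-like quantity, thin-tailed (Gaussian, cm).
heights = [random.gauss(170, 10) for _ in range(n)]
# Extremistan: a wealth-like quantity, fat-tailed (Pareto, tail index ~1.1).
wealth = [random.paretovariate(1.1) for _ in range(n)]

print(f"Tallest person's share of total height: {largest_share(heights):.4%}")
print(f"Richest person's share of total wealth: {largest_share(wealth):.1%}")
```

Removing the tallest person barely changes the sum of heights; removing the richest can change the sum of wealth dramatically. That asymmetry is the whole distinction.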
For another illustration of the way we can be ludicrously domain-specific in daily life, go to the luxury Reebok Sports Club in New York City, and look at the number of people who, after riding the escalator for a couple of floors, head directly to the StairMasters.
As we saw earlier, disconfirming instances are far more powerful in establishing truth. Yet we tend not to be aware of this property.
People in professions with high randomness (such as in the markets) can suffer more than their share of the toxic effect of look-back stings: I should have sold my portfolio at the top;
Don’t try to willingly avoid thinking about it: this will almost surely backfire. A more appropriate solution is to make the event appear more unavoidable. Hey, it was bound to take place and it seems futile to agonize over it. How can you do so? Well, with a narrative. Patients who spend fifteen minutes every day writing an account of their daily troubles feel indeed better about what has befallen them.
If you work in a randomness-laden profession, as we see, you are likely to suffer burnout effects from that constant second-guessing of your past actions in terms of what played out subsequently. Keeping a diary is the least you can do in these circumstances.
There are fact-checkers, not intellect-checkers. Alas.
Besides narrative and causality, journalists and public intellectuals of the sound-bite variety do not make the world simpler. Instead, they almost invariably make it look far more complicated than it is.
Adding the “because” makes these matters far more plausible, and far more likely. Cancer from smoking seems more likely than cancer without a cause attached to it—an unspecified cause means no cause at all.
Just imagine that, as shown by Paul Slovic and his collaborators, people are more likely to pay for terrorism insurance than for plain insurance (which covers, among other things, terrorism).
Finally, after years of searching for empirical tests of our scorn of the abstract, I found researchers in Israel who ran the experiments I had been waiting for. Greg Barron and Ido Erev provide experimental evidence that agents underweight small probabilities when they engage in sequential experiments in which they derive the probabilities themselves, when they are not supplied with the odds. If you draw from an urn with a very small number of red balls and a high number of black ones, and if you do not have a clue about the relative proportions, you are likely to underestimate the number of red balls.
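The Barron–Erev effect can be illustrated with a small simulation (mine, with invented numbers): when agents learn a rare event's probability only from their own draws, most of them never see the event at all, so their experience-based estimate is exactly zero.

```python
import random

random.seed(0)

TRUE_P = 0.01   # assumed true share of red balls in the urn
DRAWS = 50      # draws each agent makes before forming a judgment
AGENTS = 10_000

saw_none = 0
for _ in range(AGENTS):
    reds = sum(random.random() < TRUE_P for _ in range(DRAWS))
    if reds == 0:
        saw_none += 1

# Roughly (1 - 0.01)**50, i.e. about 60% of agents, never observe
# a single red ball in their own experience.
print(f"Agents whose experience-based estimate is 0%: {saw_none / AGENTS:.0%}")
```

The agent who is handed the stated odds knows the probability is 1 percent; the agent who samples for himself usually concludes it is zero.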
We also tend to forget about the notion of Black Swans immediately after one occurs—since they are too abstract for us—focusing, rather, on the precise and vivid events that easily come to our minds. We do worry about Black Swans, just the wrong ones.
The way to avoid the ills of the narrative fallacy is to favor experimentation over storytelling, experience over history, and clinical knowledge over theories. Certainly the newspaper cannot perform an experiment.
The researcher Thomas Astebro has shown that returns on independent inventions (once you take the cemetery into account) are far lower than those on venture capital. Some blindness to the odds or an obsession with their own positive Black Swan is necessary for entrepreneurs to function.
This may indeed apply to all concentrated businesses: when you look at the empirical record, you not only see that venture capitalists do better than entrepreneurs, but publishers do better than writers, dealers do better than artists, and science does better than scientists (about 50 percent of scientific and scholarly papers, costing months, sometimes years, of effort, are never truly read). The person involved in such gambles is paid in a currency other than material success: hope.
As a matter of fact, your happiness depends far more on the number of instances of positive feelings, what psychologists call “positive affect,” than on their intensity when they hit. In other words, good news is good news first; how good matters rather little. So to have a pleasant life you should spread these small “affects” across time as evenly as possible. Plenty of mildly good news is preferable to one single lump of great news.
some business bets in which one wins big but infrequently, yet loses small but frequently, are worth making if others are suckers for them and if you have the personal and intellectual stamina. But you need such stamina. You also need to deal with people in your entourage heaping all manner of insult on you, much of it blatant.
Nero engaged in a strategy that he called “bleed.” You lose steadily, daily, for a long time, except when some event takes place for which you get paid disproportionately well.
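The shape of a "bleed" payoff can be sketched in a few lines (my simulation, with hypothetical numbers; the loss size, jackpot, and event probability are all assumptions):

```python
import random

random.seed(7)

DAILY_BLEED = -1.0   # small, steady daily loss (hypothetical)
JACKPOT = 500.0      # disproportionate payoff on the rare event (hypothetical)
P_EVENT = 0.005      # assumed daily probability of the rare event
DAYS = 2_000

pnl, losing_days = 0.0, 0
for _ in range(DAYS):
    if random.random() < P_EVENT:
        pnl += JACKPOT
    else:
        pnl += DAILY_BLEED
        losing_days += 1

print(f"Losing days: {losing_days / DAYS:.0%}")  # you bleed almost every day
print(f"Total P&L:   {pnl:+.0f}")                # yet the rare payoffs can dominate
```

The psychological point is in the first number: the strategy can have a positive expectation while leaving you in the red on the overwhelming majority of days, which is why Taleb stresses stamina and entourage.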
Remarkably, historians and other scholars in the humanities who need to understand silent evidence the most do not seem to have a name for it (and I looked hard). As for journalists, fuhgedaboudit! They are industrial producers of the distortion.
I have two further points to make on this subject. First, justification of overoptimism on grounds that “it brought us here” arises from a far more serious mistake about human nature: the belief that we are built to understand nature and our own nature and that our decisions are, and have been, the result of our own choices. I beg to disagree. So many instincts drive us.
His big insight is that bank employees who sell you a house that’s not theirs just don’t care as much as the owners; Tony knew very rapidly how to talk to them and maneuver. Later, he also learned to buy and sell gas stations with money borrowed from small neighborhood bankers.
He teaches us the art of doubting, how to position ourselves between doubting and believing. He writes: “One needs to exit doubt in order to produce science—but few people heed the importance of not exiting from it prematurely. … It is a fact that one usually exits doubt without realizing it.” He warns us further: “We are dogma-prone from our mother’s wombs.”
once we produce a theory, we are not likely to change our minds—so those who delay developing their theories are better off. When you develop your opinions on the basis of weak evidence, you will have difficulty interpreting subsequent information that contradicts these opinions, even if this new information is obviously more accurate. Two
Remember that we are swayed by the sensational. Listening to the news on the radio every hour is far worse for you than reading a weekly magazine, because the longer interval allows information to be filtered a bit.
No matter what anyone tells you, it is a good idea to question the error rate of an expert’s procedure. Do not question his procedure, only his confidence. (As someone who was burned by the medical establishment, I learned to be cautious, and I urge everyone to be: if you walk into a doctor’s office with a symptom, do not listen to his odds of its not being cancer.)
(As an example of a computer using a single metric, the ratio of liquid assets to debt fares better than the majority of credit analysts.)
into a delusional mind-set that makes you ignore additional information—it is best to avoid wavering during battle. On the other hand, unlike raids, large-scale wars are not something present in human heritage—we are new to them—so we tend to misestimate their duration and overestimate our relative power. Recall the underestimation of the duration of the Lebanese war.
What matters is not how often you are right, but how large your cumulative errors are. And these cumulative errors depend largely on the big surprises, the big opportunities. Not only do economic, financial, and political predictors miss them, but they are quite ashamed to say anything outlandish to their clients—and yet events, it turns out, are almost always outlandish. Furthermore, as we will see in the next section, economic forecasters tend to fall closer to one another than to the resulting outcome. Nobody wants to be off the wall.
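The hit-rate-versus-cumulative-error point reduces to simple arithmetic (the error figures below are invented purely for illustration):

```python
# Hypothetical absolute forecast errors over 100 predictions.
# Forecaster A is "right" (tiny error) 95% of the time but misses
# the rare big surprise by a huge margin.
errors_a = [1] * 95 + [500] * 5
# Forecaster B is wrong more often, but never badly.
errors_b = [10] * 100

print(sum(errors_a))  # 2595 -> high hit rate, large cumulative error
print(sum(errors_b))  # 1000 -> low hit rate, small cumulative error
```

By the hit-rate criterion A wins easily; by cumulative error, the criterion that determines what you actually lose, B is far better.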
Experts had no advantage over journalists. The only regularity Tetlock found was the negative effect of reputation on prediction: those who had a big reputation were worse predictors than those who had none.
Hedgehogs, because of the narrative fallacy, are easier for us to understand—their ideas work in sound bites. Their category is overrepresented among famous people; ergo famous people are on average worse at forecasting than the rest of the predictors.
They collectively show no convincing evidence that economists as a community have an ability to predict, and, if they have some ability, their predictions are at best just slightly better than random ones—not good enough to help with serious decisions.
Economics is the most insular of fields; it is the one that quotes least from outside itself!
In fact, the more routine the task, the better you learn to forecast. But there is always something nonroutine in our modern environment.
We cannot truly plan, because we do not understand the future—but this is not necessarily bad news. We could plan while bearing in mind such limitations. It just takes guts.
We have become worse planners than the Soviet Russians thanks to these potent computer programs given to those who are incapable of handling their knowledge. Like most commodity traders, Brian is a man of incisive and sometimes brutally painful realism.
The first fallacy: variability matters. The first error lies in taking a projection too seriously, without heeding its accuracy. Yet, for planning purposes, the accuracy of your forecast matters far more than the forecast itself. I will explain it as follows.
The worst case is far more consequential than the forecast itself. This is particularly true if the bad scenario is not acceptable. Yet the current phraseology makes no allowance for that. None.
To understand the future to the point of being able to predict it, you need to incorporate elements from this future itself.
To clarify, Platonic is top-down, formulaic, closed-minded, self-serving, and commoditized; a-Platonic is bottom-up, open-minded, skeptical, and empirical.
observation. As I’ve said, the problem with organized knowledge is that there is an occasional divergence of interests between academic guilds and knowledge itself. So I cannot for the life of me understand why today’s libertarians do not go after tenured faculty (except perhaps because many libertarians are academics). We saw that companies can go bust, while governments remain. But while governments remain, civil servants can be demoted and congressmen and senators can be eventually voted out of office. In academia a tenured faculty is permanent—the business of knowledge has permanent
There is a blind spot: when we think of tomorrow we do not frame it in terms of what we thought about yesterday on the day before yesterday. Because of this introspective defect we fail to learn about the difference between our past predictions and the subsequent outcomes. When we think of tomorrow, we just project it as another yesterday.
You are about to commit a prediction error that you have already made. Yet it would cost so little to introspect!
But self-deception is not a desirable feature outside of its natural domain. It prevents us from taking some unnecessary risks—but we saw in Chapter 6 how it does not as readily cover a spate of modern risks that we do not fear because they are not vivid, such as investment risks, environmental dangers, or long-term security.