Kindle Notes & Highlights
I will leave aside how one’s ideological commitments influence one’s perception and address the more general aspects of this blind spot toward one’s own predictions. You tell yourself that you were playing a different game.
Had you been in possession of such economic intelligence, you would certainly have been able to predict the demise of the Soviet regime. It is not your skills that are to blame.
You invoke the outlier. Something happened that was outside the system, outside the scope of your science. Given that it was not predictable, you are not to blame.
The “almost right” defense. Retrospectively, with the benefit of a revision of values and an informational framework, it is easy to feel that it was a close call.
We attribute our successes to our skills, and our failures to external events outside our control, namely to randomness.
The hedgehog and the fox.
As in Aesop’s fable, the hedgehog knows one thing, the fox knows many things—foxes are the adaptable types you need in daily life.
I know that history is going to be dominated by an improbable event, I just don’t know what that event will be.
I had an identical experience in my quant days—the foreign scientist with the throaty accent spending his nights on a computer doing complicated mathematics rarely fares better than a cabdriver using the simplest methods within his reach.
This unfitness of complicated methods seems to apply to all methods.
“Instead [statisticians] have concentrated their efforts in building more sophisticated models without regard to the ability of such models to more accurately predict real-life data,” Makridakis and Hibon write.
“OTHER THAN THAT,” IT WAS OKAY
Plans fail because of what we have called tunneling, the neglect of sources of uncertainty outside the plan itself.
Tunneling is ignoring things outside our "focus tube" (my term). Something just outside the tunnel can mess up everything. For instance, the car behind the car: when we see a car stopped at an intersection, we can miss the car behind it that pulls out in front of us.
He projected his own schedule, but he tunneled, as he did not forecast that some “external” events would emerge to slow him down.
We can project our own schedule, but not outside events. For instance, I'm motivated to jog each morning, but my knees are preventing me from jogging as much as I'd like.
I knew something would start hurting, but I didn't know what or how much effect it would have. I did overestimate how far I could run before something started hurting.
I also underestimated how difficult it would be to cut the lawn.
The unexpected has a one-sided effect with projects.
we are too focused on matters internal to the project to take into account external uncertainty, the “unknown unknown,” so to speak, the contents of the unread books.
We have become worse planners than the Soviet Russians thanks to these potent computer programs given to those who are incapable of handling their knowledge.
A classical mental mechanism, called anchoring, seems to be at work here. You lower your anxiety about uncertainty by producing a number, then you “anchor” on it, like an object to hold on to in the middle of a vacuum.
ANCHORING: You produce a number and then hold on to it no matter how much the situation changes. Even if the number is originally expressed as a range, critics will judge you by the center of the range. The best thing to do is tell the critics to make up their own number. Perhaps a Magic 8 Ball would help.
We cannot work without a point of reference.
The Character of Prediction Errors
It is not scalable, since the older we get, the less likely we are to live.
With scalable variables, the ones from Extremistan, you will witness the exact opposite effect. Let’s say a project is expected to terminate in 79 days, the same expectation in days as the newborn female has in years. On the 79th day, if the project is not finished, it will be expected to take another 25 days to complete.
As you see, the longer you wait, the longer you will be expected to wait.
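This contrast between the two regimes can be sketched with a small simulation. The distributions and parameters below are illustrative assumptions, not numbers from the book: a roughly Gaussian duration (Mediocristan) and a Pareto duration with tail index 1.5 (Extremistan), both scaled to an expected 79 days.

```python
import random

random.seed(42)

N = 200_000

def expected_remaining(samples, t):
    """Mean additional wait, given the event has not occurred by time t."""
    survivors = [x - t for x in samples if x > t]
    return sum(survivors) / len(survivors)

# Mediocristan: roughly Gaussian project durations, mean 79 days, sd 10.
gaussian = [max(0.0, random.gauss(79, 10)) for _ in range(N)]

# Extremistan: Pareto durations with tail index 1.5, rescaled so the
# mean is also ~79 days (the alpha and scale here are illustrative).
alpha = 1.5
scale = 79 * (alpha - 1) / alpha     # mean of scale*Pareto(alpha) is 79
pareto = [scale * random.paretovariate(alpha) for _ in range(N)]

for t in (0, 90):
    print(f"day {t}: Gaussian expects {expected_remaining(gaussian, t):6.1f} "
          f"more days, Pareto expects {expected_remaining(pareto, t):6.1f}")
```

With the Gaussian, the expected remaining wait shrinks once day 90 has passed; with the Pareto it grows to roughly t/(alpha − 1), which is the "the longer you wait, the longer you will be expected to wait" effect.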
The Arab-Israeli conflict is sixty years old, and counting—yet it was considered “a simple problem” sixty years ago. (Always remember that, in a modern environment, wars last longer and kill more people than is typically planned.)
scalable randomness is unusually counterintuitive.
SCALABLE: The longer you wait, the longer you can expect to wait. If you are at a restaurant and people who came after you are served before you, don't expect to be served sooner. You will be expected to wait without complaint. When you do complain, it will somehow be your fault. "The cook put your ticket at the end BECAUSE YOU COMPLAINED."
DON’T CROSS A RIVER IF IT IS (ON AVERAGE) FOUR FEET DEEP
It did not dawn on them that it was ludicrous to forecast a second time given that their forecast was off so early and so markedly, that this business of forecasting had to be somehow questioned.
The first fallacy: variability matters. The first error lies in taking a projection too seriously, without heeding its accuracy.
Don’t cross a river if it is four feet deep on average.
The policies we need to make decisions on should depend far more on the range of possible outcomes than on the expected final number.
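The river metaphor can be made concrete with a toy example. The depth profile below is made up for illustration: the mean is comfortably below a wader's height, but one spot is not, and it is the range, not the average, that decides whether the crossing is survivable.

```python
# Hypothetical depth readings (in feet) across a river crossing.
depths = [1, 2, 3, 4, 10, 3, 2, 3]
wader_height = 6  # feet

mean_depth = sum(depths) / len(depths)
max_depth = max(depths)

print(f"mean depth: {mean_depth} ft")   # "safe" by the average
print(f"max depth:  {max_depth} ft")    # the reading that drowns you

# A decision based on the expected number says cross;
# one based on the range of outcomes says don't.
safe_on_average = mean_depth < wader_height
safe_everywhere = max_depth < wader_height
```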
Go to a bank or security-analysis training program and see how they teach trainees to make assumptions; they do not teach you to build an error rate around those assumptions—but their error rate is so large that it is far more significant than the projection itself!
Our forecast errors have traditionally been enormous, and there may be no reasons for us to believe that we are suddenly in a more privileged position to see into the future compared to our blind predecessors.
The third fallacy, and perhaps the gravest, concerns a misunderstanding of the random character of the variables being forecast.
It is often said that “is wise he who can see things coming.” Perhaps the wise one is the one who knows that he cannot see things far away.
Get Another Job
“If you’re so smart, show me your own prediction.” In fact, the latter question, usually boastfully presented, aims to show the superiority of the practitioner and “doer” over the philosopher, and mostly comes from people who do not know that I was a trader.
Anyone who causes harm by forecasting should be treated as either a fool or a liar.
the Black Swan has three attributes: unpredictability, consequences, and retrospective explainability.
Chapter Eleven HOW TO LOOK FOR BIRD POOP
Some Black Swans will remain elusive, enough to kill our forecasts.
To a fellow deeply skeptical of the central planner, the notion was ludicrous; growth within the firm had been organic and unpredictable, bottom-up not top-down. It was well known that the firm’s most lucrative department was the product of a chance call from a customer asking for a specific but strange financial transaction.
Inadvertent Discoveries
The classical model of discovery is as follows: you search for what you know (say, a new way to reach India) and find something you didn’t know was there (America).
almost everything of the moment is the product of serendipity.
“The Three Princes of Serendip.” These princes “were always making discoveries by accident or sagacity, of things which they were not in quest of.”
Sir Francis Bacon commented that the most important advances are the least predictable ones, those “lying out of the path of the imagination.”

