Kindle Notes & Highlights
Started reading: July 19, 2025
Simpler, less sexy methods fare exceedingly better, but they do not take you to Stockholm.
Plans fail because of what we have called tunneling, the neglect of sources of uncertainty outside the plan itself.
we are too focused on matters internal to the project to take into account external uncertainty, the “unknown unknown,” so to speak, the contents of the unread books.
It is often said that “is wise he who can see things coming.” Perhaps the wise one is the one who knows that he cannot see things far away.
We forget about unpredictability when it is our turn to predict.
Louis Pasteur’s adage about creating luck by sheer exposure. “Luck favors the prepared,” Pasteur said, and, like all great discoverers, he knew something about accidental discoveries. The best way to get maximal exposure is to keep researching. Collect opportunities—on that, later.
In a dynamical system, where you are considering more than a ball on its own, where trajectories in a way depend on one another, the ability to project into the future is not just reduced, but is subjected to a fundamental limitation.
To borrow from Warren Buffett, don’t ask the barber if you need a haircut—and don’t ask an academic if what he does is relevant.
Someone with a low degree of epistemic arrogance is not too visible, like a shy person at a cocktail party. We are not predisposed to respect humble people, those who try to suspend judgment.
This does not necessarily mean that he lacks confidence, only that he holds his own knowledge to be suspect. I will call such a person an epistemocrat;
The first consequence of this asymmetry is that, in people’s minds, the relationship between the past and the future does not learn from the relationship between the past and the past previous to it.
There is a blind spot: when we think of tomorrow we do not frame it in terms of what we thought about yesterday on the day before yesterday. Because of this introspective defect we fail to learn about the difference between our past predictions and the subsequent outcomes.
When we think of tomorrow, we just project it as a...
Clearly, to those people amused by the apes, the idea of a being who would look down on them the way they look down on the apes cannot immediately come to their minds—if it did, it would elicit self-pity.
Accordingly, an element in the mechanics of how the human mind learns from the past makes us believe in definitive solutions—yet not consider that those who preceded us thought that they too had definitive solutions.
We laugh at others and we don’t realize that someone will be just as justified in laughing at us on some not too remote day. Such a realization would entail the recursive, or second-order, thinking th...
Our problem is not just that we do not know the future, we do not know much of the past either.
Randomness, in the end, is just unknowledge. The world is opaque and appearances fool us.
Yogi Berra, another great thinker, said, “You got to be very careful if you don’t know where you’re going, because you might not get there.”
To be contagious, a mental category must agree with our nature.
three “popular science” books that summarize the research in complex systems: Mark Buchanan’s Ubiquity, Philip Ball’s Critical Mass, and Paul Ormerod’s Why Most Things Fail.
all three authors, by producing, or promoting precision, fall into the trap of not differentiating between the forward and the backward processes (between the problem and the inverse problem)—to me, the greatest scientific and epistemological sin. They are not alone; nearly everyone who works with data but doesn’t make decisions on the basis of these data tends to be guilty of the same sin, a variation of the narrative fallacy. In the absence of a feedback process you look at models and think that they confirm reality.
complexity theory should make us more suspicious of scientific claims of precise models of reality. It does not make all the swans white; that is predictable: it makes them gray, and only gray.*
You are indeed much safer if you know where the wild animals are.
people think that science is about formulating predictions (even when wrong). To me, science is about how not to be a sucker.
people who worry about pennies instead of dollars can be dangerous to society.
my thinking is rooted in the belief that you cannot go from books to problems, but the reverse, from problems to books.
Even priests don’t go to bishops when they feel ill: their first stop is the doctor’s.
Having plenty of data will not provide confirmation, but a single instance can disconfirm. I am skeptical when I suspect wild randomness, gullible when I believe that randomness is mild.
worry less about small failures, more about large, potentially terminal ones.
worry less about advertised and sensational risks, more about the more vicious hidden ones.
worry less about embarrassment than about missing an opportunity.
In the end this is a trivial decision making rule: I am very aggressive when I can gain exposure to positive Black Swans—when a failure would be of small moment—and very conservative when I am under threat from a negative Black Swan. I am very aggressive when an error in a model can benefit me, and paranoid when the error can hurt.
not matching the idea of success others expect from you is only painful if that’s what you are seeking. You stand above the rat race and the pecking order, not outside of it, if you do so by choice.
A person who does not face stressors will not survive should he encounter them.
no stress plus a little bit of extreme stress is vastly better than a little bit of stress (like mortgage worries) all the time.