Kindle Notes & Highlights
“History doesn’t repeat itself, but it does rhyme.”
You are constantly faced with unfamiliar situations, usually with a large array of choices.
He meant that thinking about a problem from an inverse perspective can unlock new solutions and strategies. For example, most people approach investing their money from the perspective of making more money; the inverse approach would be investing money from the perspective of not losing money.
Let us offer an example from the world of sports. In tennis, an unforced error occurs when a player makes a mistake not because the other player hit an awesome shot, but rather because of their own poor judgment or execution. For example, hitting an easy ball into the net is one kind of unforced error. To be wrong less in tennis,
you need to make fewer unforced errors on the court. And to be consistently wrong less in decision making, you need to consistently make fewer unforced errors in your own life.
The best decision based on the information available at the time can easily turn out to be the wrong decision in the long run.
Antifragility is beyond resilience or robustness. The resilient resists shocks and stays the same; the antifragile gets better.
If your thinking is antifragile, then it gets better over time as you learn from your mistakes and interact with your surroundings.
The central mental model to help you become a chef with your thinking is arguing from first principles. It’s the practical starting point to being wrong less, and it means thinking from the bottom up, using basic building blocks of what you think is true to build sound (and sometimes new) conclusions.
First principles are the group of self-evident assumptions that make up the foundation on which your conclusions rest—the ingredients in a recipe or the mathematical axioms that underpin a formula.
If you can argue from first principles, then you can more easily approach unfamiliar situations, or approach familiar situations in innovative ways.
Elon Musk illustrates how this process works in practice in an interview on the Foundation podcast: First principles is kind of a physics way of looking at the world. . . . You kind of boil things down to the most fundamental truths and say, “What are we sure is true?” . . . and then reason up from there. . . . Somebody could say . . . “Battery packs are really expensive and that’s just the way they will always be. . . . Historically, it has cost $600 per kilowatt-hour, and so it’s not going to be much better than that in the future.” . . .
With first principles, you say, “What are the material constituents of the batteries? What is the stock market value of the material constituents?” . . . It’s got cobalt, nickel, aluminum, carbon, and some polymers for separation, and a steel can. Break that down on a material basis and say, “If we bought that on the London Metal Exchange, what would each of those things cost?” . . . It’s like $80 per kilowatt-hour. So clearly you just need to think of clever ways to take those materials and combine them into the shape of a battery cell and you can have batteries that are much, much cheaper.
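Musk’s reasoning can be sketched as a small calculation. The per-material figures below are hypothetical placeholders, not real London Metal Exchange quotes; only the method (sum the raw-material costs to find a price floor, then compare it to the historical pack price) follows the quote.

```python
# First-principles cost estimate for a battery pack, per kWh.
# All per-material numbers are HYPOTHETICAL, chosen only so the
# total matches the ~$80/kWh figure from the quote.
material_cost_per_kwh = {
    "cobalt": 25.0,
    "nickel": 30.0,
    "aluminum": 10.0,
    "carbon": 8.0,
    "polymers_and_can": 7.0,
}

historical_pack_cost = 600.0  # $/kWh figure cited in the quote

# The sum of raw-material costs is a floor on what a pack could cost.
raw_material_floor = sum(material_cost_per_kwh.values())  # 80.0 here

print(f"Raw-material floor: ${raw_material_floor:.0f}/kWh")
print(f"Gap to close with clever manufacturing: "
      f"${historical_pack_cost - raw_material_floor:.0f}/kWh")
```

The gap between the historical price and the material floor is the room for innovation that first-principles thinking exposes; reasoning by analogy from the $600 figure hides it.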
Ultimately, to be wrong less, you also need to be testing your assumptions in the real world, a process known as de-risking.
There is risk that one or more of your assumptions are untrue, and so the conclusions you reach could also be false.
My team can build our product.
People will want our product.
Our product will generate profit.
We will be able to fend off competitors.
The market is large enough for a long-term business opportunity.
Once you identify the critical assumptions to de-risk, the next step is actually going out and testing these assumptions, proving or disproving them, and then adjusting your strategy appropriately.
When de-risking, you want to test assumptions quickly and easily.
In computer science this trap is called premature optimization, where you tweak or perfect code or algorithms (optimize) too early (prematurely).
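The trap can be shown in miniature. Both functions below are my own illustrative examples, not from the book: one is the clear first version, the other a “clever” rewrite done before any profiling showed the function mattered.

```python
# Readable first version: sum of squares 0^2 + 1^2 + ... + (n-1)^2.
def sum_squares_clear(n):
    return sum(i * i for i in range(n))

# "Optimized" closed-form version written before any measurement.
# It is faster, but harder to verify at a glance -- and until profiling
# shows sum_squares_clear is a bottleneck, the rewrite is premature.
def sum_squares_tricky(n):
    m = n - 1
    return m * (m + 1) * (2 * m + 1) // 6

# At minimum, a premature optimization must be checked for correctness.
assert sum_squares_clear(1000) == sum_squares_tricky(1000)
```

The point is not that the closed form is wrong; it is that effort spent perfecting it is wasted until testing shows this code is where the time actually goes.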
The MVP is the product you are developing with just enough features, the minimum amount, to be feasibly, or viably, tested by real people.
As with de-risking, you can extend the MVP model to fit many other contexts: minimum viable organization, minimum viable communication, minimum viable strategy, minimum viable experiment.
Ockham’s razor helps here. It advises that the simplest explanation is most likely to be true.
Ask yourself: Does this assumption really need to be here? What evidence do I have that it should remain? Is it a false dependency?
Ockham’s razor is not a “law” in that it is always true; it just offers guidance. Sometimes the true explanation can indeed be quite complicated. However, there is no reason to jump immediately to the complex explanation when you have simpler alternatives to explore first. If you don’t simplify your assumptions, you can fall into a couple of traps, described in our next mental models.
Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations. Which is more probable?
1. Linda is a bank teller.
2. Linda is a bank teller and is active in the feminist movement.
The fallacy arises because the probability of two events in conjunction is always less than or equal to the probability of either one of the events occurring alone, a concept illustrated in the Venn diagram on the next page.
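The rule behind the fallacy can be checked directly. The probabilities below are made-up numbers for illustration; the inequality holds for any values, because a joint probability is a product of a marginal and a conditional, both at most 1.

```python
# Conjunction rule with hypothetical numbers:
p_teller = 0.05                     # made-up P(Linda is a bank teller)
p_feminist_given_teller = 0.30      # made-up P(feminist | bank teller)

# P(teller AND feminist) = P(teller) * P(feminist | teller)
p_both = p_teller * p_feminist_given_teller   # 0.015

# The conjunction can never be more probable than either event alone.
assert p_both <= p_teller
print(f"P(teller) = {p_teller}, P(teller and feminist) = {p_both}")
```

However specific the description of Linda feels, option 2 is a strict subset of option 1, so it cannot be the more probable answer.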
You not only have a natural tendency to think something specific is more probable than something general, but you also have a similarly fallacious tendency to explain data using too many assumptions. The mental model for this second fallacy is overfitting, a concept from statistics.
Overfitting occurs when you use an overly complicated explanation when a simpler one will do.
It can occur in any situation where an explanation introduces unnecessary assumptions.
One approach to combating both traps is to ask yourself: How much does my data really support my conclusion versus other conclusions?
You go through life seeing everything from your perspective, which varies widely depending on your particular life experiences and current situation.
A frame-of-reference mental trap (or useful trick, depending on your perspective) is framing.
When someone presents a new idea or decision to you, take a step back and consider other ways in which it could be framed.
Unfortunately, the call did not specify an exact address, and the officers responded to the wrong house. Upon finding the back door unlocked, they entered, and encountered a dog. Gunfire ensued, and the dog, homeowner, and one of the officers were shot, all by officer gunfire. The homeowner and officer survived. Two headlines framed the incident in dramatically different ways.
The pattern was clear-cut: A misleading headline impaired memory for the article.
Participants saw a film of a traffic accident and then answered the question, “About how fast were the cars going when they contacted each other?” Other participants received the same question, except that the verb contacted was replaced by either hit, bumped, collided, or smashed. Even though the participants saw the same film, the wording of the question affected their answers. The speed estimates (in miles per hour) were 31, 34, 38, 39, and 41, respectively.
You can be nudged in a direction by a subtle word choice or other environmental cues. Restaurants will nudge you by highlighting certain dishes on menu inserts, by having servers verbally describe specials, or by just putting boxes around certain items.
Anchoring describes your tendency to rely too heavily on first impressions when making decisions.
The Economist once offered readers three ways to subscribe: web only ($59), print only ($125), and print and web ($125). Yes, you read that right: the “print only” version cost the same as the “print and web” version. Who would choose that? Predictably, no one. Here is the result when one hundred MIT students reported their preference:
Web only ($59): 16 percent
Print only ($125): 0 percent
Print and web ($125): 84 percent
So why include that option at all? Here’s why: when it was removed from the question, this result was revealed:
Web only ($59): 68 percent
Print and web ($125): 32 percent
Anchoring isn’t just for numbers. Donald Trump uses this mental model, anchoring others to his extreme positions, so that what seem like compromises are actually agreements in his favor.
Availability bias occurs when a bias, or distortion, creeps into your objective view of reality thanks to information recently made available to you.
Yet the data suggests that illegal immigration via the southern border is actually at a five-decade low, indicating that the prevalence of the topic is creating an availability bias for many.
“If it bleeds, it leads.”
The resulting heavy coverage of violent crime causes people to think it occurs more often than it does.
With the rise of personalized recommendations and news feeds on the internet, availability bias has become a more and more pernicious problem.
Because of availability bias, you’re likely to click on things you’re already familiar with, and so Google, Facebook, and many other companies tend to show you more of what they think you already know and like.
This happened even when they were signed out and in so-called incognito mode.
When you put many similar filter bubbles together, you get echo chambers, where the same ideas seem to bounce around the same groups of people, echoing around the collective chambers of these connected filter bubbles.
WALK A MILE IN THEIR SHOES

