Super Thinking: Upgrade Your Reasoning and Make Better Decisions with Mental Models
Critical mass: the mass of nuclear material needed to create a critical state whereby a nuclear chain reaction is possible.
Well, the first rule is that you can’t really know anything if you just remember isolated facts and try and bang ’em back. If the facts don’t hang together on a latticework of theory, you don’t have them in a usable form. You’ve got to have models in your head. And you’ve got to array your experience—both vicarious and direct—on this latticework of models.
When you don’t use mental models, strategic thinking is like using addition when multiplication is available to you.
When I urge a multidisciplinary approach … I’m really asking you to ignore jurisdictional boundaries. If you want to be a good thinker, you must develop a mind that can jump these boundaries. You don’t have to know it all. Just take in the best big ideas from all these disciplines. And it’s not that hard
The concept of inverse thinking can help you with the challenge of making good decisions. The inverse of being right more is being wrong less. Mental models are a tool set that can help you be wrong less. They are a collection of concepts that help you more effectively navigate our complex world.
Some things benefit from shocks; they thrive and grow when exposed to volatility, randomness, disorder, and stressors and love adventure, risk, and uncertainty. Yet, in spite of the ubiquity of the phenomenon, there is no word for the exact opposite of fragile. Let us call it antifragile.
Antifragility is beyond resilience or robustness. The resilient resists shocks and stays the same; the antifragile gets better.
If your thinking is antifragile, then it gets better over time as you learn from your mistakes and interact with your surroundings.
It’s also the difference between being a chef—someone who can take ingredients and turn them into an amazing dish without looking at a cookbook—and being the kind of cook who just knows how to follow a recipe.
The central mental model to help you become a chef with your thinking is arguing from first principles. It’s the practical starting point to being wrong less, and it means thinking from the bottom up, using basic building blocks of what you think is true to build sound (and sometimes new) conclusions. First principles are the group of self-evident assumptions that make up the foundation on which your conclusions rest—the ingredients in a recipe or the mathematical axioms that underpin a formula.
If you can argue from first principles, then you can more easily approach unfamiliar situations, or approach familiar situations in innovative ways.
When arguing from first principles, you are deliberately starting from scratch. You are explicitly avoiding the potential trap of conventional wisdom, which could turn out to be wrong. Even if you end up in agreement with conventional wisdom, by taking the first-principles approach, you will gain a much deeper understanding of the subject at hand.
Once you identify the critical assumptions to de-risk, the next step is actually going out and testing these assumptions, proving or disproving them, and then adjusting your strategy appropriately.
Unfortunately, people often make the mistake of doing way too much work before testing assumptions in the real world. In computer science this trap is called premature optimization, where you tweak or perfect code or algorithms (optimize) too early (prematurely). If your assumptions turn out to be wrong, you’re going to have to throw out all that work, rendering it ultimately a waste of time.
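To make the trap concrete, here is a minimal sketch in Python (the function and numbers are invented for illustration, not taken from the book) of testing the assumption “this code is too slow” before doing any optimization work:

    import timeit

    def total_simple(nums):
        # The obvious first version: readable and probably fast enough.
        return sum(n * n for n in nums)

    # Before hand-optimizing (caching, C extensions, clever tricks),
    # measure whether this is actually a bottleneck:
    elapsed = timeit.timeit(lambda: total_simple(range(10_000)), number=1_000)
    print(f"1,000 runs took {elapsed:.3f}s")

    # Only if the measurement violates a real requirement is optimization
    # work justified; doing that work first, on an untested assumption,
    # is exactly the premature-optimization trap described above.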
Back in startup land, there is another mental model to help you test your assumptions, called minimum viable product, or MVP. The MVP is the product you are developing with just enough features, the minimum amount, to be feasibly, or viably, tested by real people.
“Everybody has a plan until they get punched in the mouth.”
Ockham’s razor helps here. It advises that the simplest explanation is most likely to be true. When you encounter competing explanations that plausibly explain a set of data equally well, you probably want to choose the simplest one to investigate first.
This model is a razor because it “shaves off” unnecessary assumptions.
“Everything should be made as simple as it can be, but not simpler!” In medicine, it’s known by this saying: “When you hear hoofbeats, think of horses, not zebras.”
A practical tactic is to look at your explanation of a situation, break it down into its constituent assumptions, and for each one, ask yourself: Does this assumption really need to be here? What evidence do I have that it should remain? Is it a false dependency?
First, most people are, unfortunately, hardwired to latch onto unnecessary assumptions, a predilection called the conjunction fallacy, studied by psychologists Amos Tversky and Daniel Kahneman.
The fallacy arises because the probability of two events in conjunction is always less than or equal to the probability of either one of the events occurring alone, a concept illustrated in the Venn diagram on the next page.
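A short worked example makes the inequality visible (the numbers here are hypothetical, not from the book): since P(A and B) = P(A) × P(B | A), and P(B | A) can never exceed 1, it follows that P(A and B) ≤ P(A). If P(A) = 0.05 and P(B | A) = 0.4, then P(A and B) = 0.05 × 0.4 = 0.02, necessarily smaller than 0.05. Judging a conjunction as more likely than either event alone therefore contradicts basic probability.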
Overfitting occurs when you use an overly complicated explanation when a simpler one will do. It’s what happens when you don’t heed Ockham’s razor, when you get sucked into the conjunction fallacy or make a similar unforced error. It can occur in any situation where an explanation introduces unnecessary assumptions.
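A minimal sketch of the same idea in statistical form, assuming only NumPy (the data and polynomial degrees are invented for illustration):

    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(0, 1, 10)
    y = 2 * x + 1 + rng.normal(0, 0.1, size=x.size)  # truth: a simple line plus noise

    linear = np.polyfit(x, y, deg=1)  # simple explanation: 2 parameters
    wiggly = np.polyfit(x, y, deg=9)  # complex explanation: 10 parameters
    # (NumPy may warn that the high-degree fit is poorly conditioned,
    # which is itself a hint that the model is too flexible.)

    x_new = np.linspace(0, 1, 100)    # fresh points the fits never saw
    truth = 2 * x_new + 1
    print("linear fit max error:  ", np.max(np.abs(np.polyval(linear, x_new) - truth)))
    print("degree-9 fit max error:", np.max(np.abs(np.polyval(wiggly, x_new) - truth)))

    # The degree-9 curve threads (nearly) every noisy point yet strays far
    # from the truth between them: the unnecessary extra parameters fit
    # the noise, not the signal, which is overfitting in the sense above.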
In physics, your perspective is called your frame of reference, a concept central to Einstein’s theory of relativity.
In fact, everything but the speed of light—even time—appears different in different frames of reference.
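A standard worked example from special relativity (not quoted from the book) makes “even time appears different” concrete: a moving clock runs slow by the Lorentz factor γ = 1/√(1 − v²/c²). At v = 0.8c, γ = 1/√(1 − 0.64) = 1/0.6 ≈ 1.67, so an observer at rest measures about 1.67 seconds for every second that ticks in the moving frame, even though both frames measure the same speed of light.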
A frame-of-reference mental trap (or useful trick, depending on your perspective) is framing. Framing refers to the way you present a situation or explanation. When you present an important issue to your coworker or family member, you try to frame it in a way that might help them best understand your perspective, setting the stage for a beneficial conversation.
You can be nudged in a direction by a subtle word choice or other environmental cues.
Another concept you will find useful when making purchasing decisions is anchoring, which describes your tendency to rely too heavily on first impressions when making decisions.
You get anchored to the first piece of framing information you encounter. This tendency is commonly exploited by businesses when making offers.
More broadly, these mental models are all instances of a more general model, availability bias, which occurs when a bias, or distortion, creeps into your objective view of reality thanks to information recently made available to you.
Availability bias can easily emerge from high media coverage of a topic. Rightly or wrongly, the media infamously has a mantra of “If it bleeds, it leads.”
Availability bias stems from overreliance on your recent experiences within your frame of reference, at the expense of the big picture.
Because of availability bias, you’re likely to click on things you’re already familiar with, and so Google, Facebook, and many other companies tend to show you more of what they think you already know and like. Since there are only so many items they can show you—only so many links on page one of the search results—they therefore filter out links they think you are unlikely to click on, such as opposing viewpoints, effectively placing you in a bubble.
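A toy ranking sketch in Python shows the mechanism (hypothetical data and scores; not any real company’s algorithm):

    # Rank items by predicted click probability and keep only page one.
    interests = {"cooking": 0.9, "gardening": 0.7, "opposing_view": 0.1}

    articles = [
        ("Sourdough starter tips", "cooking"),
        ("Pruning roses in spring", "gardening"),
        ("The strongest case for the other side", "opposing_view"),
        ("Knife skills for beginners", "cooking"),
    ]

    # Predicted clicks come from past behavior, so familiar topics score high.
    ranked = sorted(articles, key=lambda item: interests[item[1]], reverse=True)

    page_one = ranked[:3]  # only so many slots on page one
    print(page_one)        # the opposing viewpoint falls off the page

    # Every click on page one further raises the familiar topics' scores,
    # so the filtering reinforces itself: a bubble.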
When you put many similar filter bubbles together, you get echo chambers, where the same ideas seem to bounce around the same groups of people, echoing around the collective chambers of these connected filter bubbles. Echo chambers result in increased partisanship, as people have less and less exposure to alternative viewpoints. And because of availability bias, they consistently overestimate the percentage of people who hold the same opinions. It’s easy to focus solely on what is put in front of you. It’s much harder to seek out an objective frame of reference, but that is what you must do.
Forcing yourself to think as an impartial observer can help you in any conflict situation, including difficult business negotiations and personal disagreements.
Douglas Stone, Bruce Patton, and Sheila Heen explore this model in detail in their book Difficult Conversations: “The key is learning to describe the gap—or difference—between your story and the other person’s story. Whatever else you may think and feel, you can at least agree that you and the other person see things differently.”
When you acknowledge the perspective of the third story within difficult conversations, it can have a disarming effect, causing others involved to act less defensively. That’s because you are signaling your willingness and ability to consider an objective point of view.
Another tactical model that can help you empathize is the most respectful interpretation, or MRI. In any situation, you can explain a person’s behavior in many ways. MRI asks you to interpret the other parties’ actions in the most respectful way possible. It’s giving people the benefit of the doubt.
Hanlon’s razor: never attribute to malice that which is adequately explained by carelessness. Like Ockham’s razor, Hanlon’s razor seeks out the simplest explanation. And when people do something harmful, the simplest explanation is usually that they took the path of least resistance. That is, they carelessly created the negative outcome; they did not cause the outcome out of malice.
Psychologists call this the fundamental attribution error: you frequently make errors by attributing others’ behaviors to their internal, or fundamental, motivations rather than to external factors.