Super Thinking: The Big Book of Mental Models
Kindle Notes & Highlights
These recurring concepts are called mental models.
An example of a useful mental model from physics is the concept of critical mass …
We were introduced to the concept of super models many years ago through Charlie Munger, the partner of renowned investor Warren Buffett.
“A Lesson on Elementary, Worldly Wisdom as It Relates to Investment Management and Business”
If the facts don’t hang together on a latticework of theory, you don’t have them in a usable form. You’ve got to have models in your head.
If you realize that the concept of critical mass applies to this business, then you know that there is some threshold that needs …
This book is that toolbox: it systematically lists, classifies, and explains all the important mental models across the major disciplines.
1 Being Wrong Less
thinking about a problem from an inverse perspective can unlock new solutions and strategies.
… the perspective of making more money; the inverse approach would be investing money from the perspective of not losing money.
… controlled ingredients. An inverse approach, by contrast, would be to try to avoid unhealthy options.
inverse thinking can help you with the challenge of making good decisions. The inverse of being right more is being wrong less. Mental models are a tool set that can help you be wrong less.
In tennis, an unforced error occurs when a player makes a mistake not because the other player hit an awesome shot, but rather because of their own poor judgment or execution.
To be wrong less in tennis, you need to make fewer unforced errors on the court.
Start looking for unforced errors around you and you will see them everywhere.
The best decision based on the information available at the time can easily turn out to be the wrong decision in the long run.
antifragile, a concept explored in a book of the same name by financial analyst Nassim Nicholas Taleb.
Antifragility is beyond resilience or robustness. The resilient resists shocks and stays the same; the antifragile gets better.
It similarly pays off to make your thinking antifragile in the face of new decisions. If your thinking is antifragile, then it gets better over time as you learn from your mistakes and interact with your surroundings.
you will have more than three hundred mental models floating around in your head from dozens of disciplines, eager to pop up at just the right time.
KEEP IT SIMPLE, STUPID!
… stresses the importance of knowing how to derive every formula that you use, because …
It’s also the difference between being a chef—someone who can take ingredients and turn them into an amazing dish without looking at a cookbook—and being the kind of cook who just knows how to follow a recipe.
The central mental model to help you become a chef with your thinking is arguing from first principles. It’s the practical starting point to being wrong less, and it means thinking from the bottom up, using basic building blocks of what you think is true to build sound (and sometimes new) conclusions. First principles are the group of self-evident assumptions that make up the foundation on which your conclusions rest—the ingredients in a recipe or the mathematical axioms that underpin a formula.
“What are we sure is true?” . . . and then reason up from there. . . .
… cobalt, nickel, aluminum, carbon, and some polymers for separation, and a seal can. Break that down on a material basis and say, “If we bought that on the London Metal Exchange, what would each of those things cost?” . . . It’s like $80 per kilowatt-hour.
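To make the arithmetic concrete, here is a minimal Python sketch of the first-principles breakdown Musk describes. Every number below is an illustrative placeholder, not a real London Metal Exchange quote; with real prices and masses, Musk's estimate came to roughly $80 per kilowatt-hour.

```python
# Hypothetical first-principles cost model for a battery pack.
# All masses and prices are made-up placeholders for illustration.
materials_kg_per_kwh = {   # rough mass of each input per kWh of cells
    "cobalt": 0.2, "nickel": 0.8, "aluminum": 0.5,
    "carbon": 1.0, "polymers": 0.3, "steel_can": 0.4,
}
price_per_kg = {           # placeholder spot prices in $/kg
    "cobalt": 30.0, "nickel": 20.0, "aluminum": 2.5,
    "carbon": 1.0, "polymers": 2.0, "steel_can": 1.0,
}

cost_per_kwh = sum(kg * price_per_kg[m]
                   for m, kg in materials_kg_per_kwh.items())
print(f"raw-material cost: ${cost_per_kwh:.2f} per kWh")
```

The point is the method, not the numbers: price the raw ingredients directly instead of anchoring on what finished packs have historically cost.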
Even if you end up in agreement with conventional wisdom, by taking the first-principles approach, you will gain a much deeper understanding of the subject at hand.
Ultimately, to be wrong less, you also need to be testing your assumptions in the real world, a process known as de-risking. There is risk that one or more of your assumptions are untrue, and so the conclusions you reach could also be false.
any startup business idea is built upon a series of principled assumptions:
- My team can build our product.
- People will want our product.
- Our product will generate profit.
- We will be able to fend off competitors.
- The market is large enough for a long-term business opportunity.
… first are the ones that are necessary conditions for success and that you are most uncertain about.
Once you identify the critical assumptions to de-risk, …
Unfortunately, people often make the mistake of doing way too much work before testing assumptions in the real world. In computer science this trap is called premature optimization, where you tweak or perfect code or algorithms (optimize) too early (prematurely). If your assumptions turn out to be wrong, you’re going to have to throw out all that work, rendering it ultimately a waste of time.
minimum viable product, or MVP. The MVP is the product you are developing with just enough features, the minimum amount, to be feasibly, or viably, tested by real people.
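As a concrete contrast between premature optimization and an MVP, here is a hypothetical Python sketch (the search feature and both implementations are invented for illustration): the first version builds indexing and caching before anyone has confirmed users want search at all; the second is just enough to test that assumption with real people.

```python
# Premature optimization: an inverted index plus a query cache,
# built before the core assumption ("users want search") is tested.
class PrematurelyOptimizedSearch:
    def __init__(self, docs):
        self._index = {}
        for i, text in enumerate(docs):
            for word in text.split():
                self._index.setdefault(word, set()).add(i)
        self._cache = {}  # memoized query results

    def search(self, word):
        if word not in self._cache:
            self._cache[word] = sorted(self._index.get(word, set()))
        return self._cache[word]

# MVP: the simplest thing that lets real users try the feature.
def simple_search(docs, word):
    return [i for i, text in enumerate(docs) if word in text.split()]

docs = ["mental models help", "models of thinking"]
print(simple_search(docs, "models"))                      # [0, 1]
print(PrematurelyOptimizedSearch(docs).search("models"))  # [0, 1]
```

If the assumption fails, the MVP cost you minutes; the optimized version cost you the index, the cache, and every test that went with them.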
… MVP model to fit many other contexts: minimum viable organization, minimum viable communication, minimum viable strategy, minimum viable experiment.
Ockham’s razor helps here. It advises that the simplest explanation is most likely to be true. When you encounter competing explanations that plausibly explain a set of data equally well, you probably want to choose the simplest one to investigate first.
In medicine, it’s known by this saying: “When you hear hoofbeats, think of horses, not zebras.”
Ockham’s razor is not a “law” that is always true; it just offers guidance.
First, most people are, unfortunately, hardwired to latch onto unnecessary assumptions, a predilection called the conjunction fallacy, …
… tendency to think something specific is more probable than something general, but you also have a similarly fallacious tendency to explain data using too many assumptions.
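The probability behind the conjunction fallacy is standard (this bit is added context, not highlighted text): a conjunction of two events can never be more probable than either event alone, so judging the more specific scenario as likelier is always an error.

$$ P(A \cap B) \;\le\; \min\bigl(P(A),\, P(B)\bigr) $$

In the classic Linda experiment, for instance, "bank teller and active feminist" cannot be more probable than "bank teller" alone, yet most people rank it higher.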
overfitting, a concept from statistics.
… when you have a cold is overfitting your symptoms.
Overfitting occurs when you use an overly complicated explanation when a simpler one will do.
As a visual example, the data depicted on the next page can be easily explained by a straight line, but you could also overfit the data by creating a curved one that moves through every single point, as the wavy line does.
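Here is a minimal Python sketch (my example, not the book's figure) of that straight-line-versus-wavy-line contrast: a degree-1 fit generalizes to new data, while a high-degree polynomial that threads every noisy point extrapolates wildly.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 8)
y = 2 * x + 1 + rng.normal(scale=1.5, size=x.size)  # linear trend + noise

line = np.polyfit(x, y, deg=1)   # simple model: 2 parameters
wavy = np.polyfit(x, y, deg=7)   # passes through every single point

print(np.polyval(line, 11.0))    # near the true value 2*11 + 1 = 23
print(np.polyval(wavy, 11.0))    # typically far from 23: overfit
```

The wavy fit explains the training points perfectly and the next point terribly, which is exactly the question to ask: how much does my data support this conclusion versus a simpler one?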
How much does my data really support my conclusion versus other conclusions?
KISS: Keep It Simple, Stupid!
IN THE EYE OF THE BEHOLDER
In physics your perspective is called your frame of reference, …
If you are in a moving train, your reference frame is inside the train, which appears at rest to you, with objects inside the train not moving relative to one another, …
… light—even time—appears different in different frames of reference.
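A standard special-relativity formula makes the "even time" claim concrete (added context, not highlighted text): a clock moving at speed $v$ relative to an observer ticks slower by the Lorentz factor, so an interval $\Delta t$ in the clock's own frame is measured as

$$ \Delta t' = \frac{\Delta t}{\sqrt{1 - v^2/c^2}} $$

where $c$ is the speed of light; the effect only becomes noticeable as $v$ approaches $c$.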
A frame-of-reference mental trap (or useful trick, depending on your perspective) is framing. Framing refers to the way you present a situation or explanation.