Kindle Notes & Highlights
The central mental model to help you become a chef with your thinking is arguing from first principles. It’s the practical starting point to being wrong less, and it means thinking from the bottom up, using basic building blocks of what you think is true to build sound (and sometimes new) conclusions. First principles are the group of self-evident assumptions that make up the foundation on which your conclusions rest—the ingredients in a recipe or the mathematical axioms that underpin a formula.
Ultimately, to be wrong less, you also need to test your assumptions in the real world, a process known as de-risking.
Ockham’s razor helps here. It advises that the simplest explanation is most likely to be true.
A practical tactic is to look at your explanation of a situation, break it down into its constituent assumptions, and for each one, ask yourself: Does this assumption really need to be here? What evidence do I have that it should remain? Is it a false dependency?
If you’re trying to be as objective as possible when making a decision or solving a problem, you always want to account for your frame of reference.
Because of availability bias, you’re likely to click on things you’re already familiar with, and so Google, Facebook, and many other companies tend to show you more of what they think you already know and like. Since there are only so many items they can show you—only so many links on page one of the search results—they therefore filter out links they think you are unlikely to click on, such as opposing viewpoints, effectively placing you in a bubble.
It’s much harder to seek out an objective frame of reference, but that is what you need to do in order to be wrong less.
Ironically, belief in a just world can get in the way of actual justice by leading people to victim-blame: The sexual assault victim “should have worn different clothes” or the welfare recipient “is just lazy.”
This is disconfirmation bias, where you impose a stronger burden of proof on the ideas you don’t want to believe.
A real trick to being wrong less is to fight your instincts to dismiss new information and instead to embrace new ways of thinking and new paradigms.
The essence of thinking gray is this: don’t form an opinion about an important matter until you’ve heard all the relevant facts and arguments, or until circumstances force you to form an opinion without recourse to all the facts (which happens occasionally, but much less frequently than one might imagine).
Each additional spam message benefits the spammer who sends it while simultaneously degrading the entire email system. Collective overuse of antibiotics in medicine and agriculture is leading to dangerous antibiotic resistance. People make self-serving edits to Wikipedia articles, diminishing the overall reliability of the encyclopedia.
More broadly, the tragedy of the commons arises from what is called the tyranny of small decisions, where a series of small, individually rational decisions ultimately leads to a system-wide negative consequence, or tyranny. It’s death by a thousand cuts.
Professionally, it may be the occasional distractions and small procrastinations that, in aggregate, make your deadlines hard to reach.
Goodhart’s law summarizes the issue: When a measure becomes a target, it ceases to be a good measure.
For any decision, ask yourself: What kind of debt am I incurring by doing this? What future paths am I taking away by my actions today?
Use Hick’s law to remember that decision time increases with the number of choices, so if you want people to make quick decisions, reduce the number of choices. One way to do this is to turn one large decision into a multi-step decision with fewer choices at each step, such as first asking what type of restaurant to go to (Italian, Mexican, etc.) and then offering another set of choices within the chosen category.
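As a rough illustration, here is a minimal Python sketch assuming the commonly cited T = b * log2(n + 1) form of Hick’s law, with a purely illustrative constant b. The numbers are hypothetical, but they show why each step of a narrowed-down decision is quicker than facing one large menu of options.

```python
import math

def hick_decision_time(n_choices, b=0.6):
    # Hick's law (one common form): T = b * log2(n + 1).
    # b is an assumed seconds-per-bit constant, chosen only for illustration.
    return b * math.log2(n_choices + 1)

# One flat menu of 12 restaurants vs. two smaller steps (4 cuisines, then 3 options)
print(f"12 options at once:  {hick_decision_time(12):.2f}s for the decision")
print(f"4 cuisines first:    {hick_decision_time(4):.2f}s")
print(f"then 3 restaurants:  {hick_decision_time(3):.2f}s")
```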
You should be wary of fighting a two-front war, yet you probably do so every single day in the form of multitasking.
Sayre’s law, named after political scientist Wallace Sayre, holds that in any dispute the intensity of feeling is inversely proportional to the value of the issues at stake.
Parkinson’s law of triviality, named after naval historian Cyril Parkinson, states that organizations tend to give disproportionate weight to trivial issues.
Generally, you want to choose things that have higher value than their opportunity costs, the best of all the alternatives in front of you. When put like that, it sounds simple, right? Complications arise when you realize that you can’t have it all. There are always trade-offs when you choose among the pursuits important to you.
The highest-leverage choice might not be the best fit every time, but the option that provides the most impact at the lowest cost always warrants consideration.
As a rule, the highest-leverage activities have the lowest opportunity cost.
just because you’ve hit diminishing returns doesn’t always mean you must stop what you’re doing. It really comes down to opportunity cost. If you can identify another activity that can produce greater results for the same amount of effort, then you should jump to it. Otherwise, you should keep at your current activity, since you’re still making progress (even if it is slower progress) and you don’t have anything better to do.
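To make that trade-off concrete, here is a small Python sketch with entirely made-up numbers: it models your current activity’s diminishing returns as a square-root curve and flags the hour at which an alternative activity’s assumed steady rate overtakes your marginal return.

```python
import math

def marginal_value(hours, scale=10.0):
    # Diminishing returns modeled (purely illustratively) as value = scale * sqrt(hours),
    # so each extra hour adds roughly scale / (2 * sqrt(hours)) of value.
    return scale / (2 * math.sqrt(hours))

ALTERNATIVE_RATE = 1.5  # assumed steady value per hour of the best alternative

for h in range(1, 13):
    if marginal_value(h) < ALTERNATIVE_RATE:
        print(f"After hour {h}, the alternative's {ALTERNATIVE_RATE}/hr beats "
              f"your marginal {marginal_value(h):.2f}/hr: time to switch.")
        break
```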
Hofstadter’s Law: It always takes longer than you expect, even when you take into account Hofstadter’s Law.
A heuristic solution is a trial-and-error solution that is not guaranteed to yield optimal or perfect results, but in many cases is nevertheless very effective.
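A classic example (mine, not the book’s) is the nearest-neighbor heuristic for planning a short route: always visit the closest remaining stop. A minimal Python sketch with hypothetical coordinates:

```python
import math

def nearest_neighbor_route(points):
    """Greedy heuristic: repeatedly visit the closest unvisited point.
    The resulting tour is not guaranteed to be shortest, but it is fast
    and usually reasonable."""
    unvisited = list(points[1:])
    route = [points[0]]
    while unvisited:
        last = route[-1]
        nxt = min(unvisited, key=lambda p: math.dist(last, p))
        unvisited.remove(nxt)
        route.append(nxt)
    return route

stops = [(0, 0), (5, 1), (1, 4), (6, 5), (2, 2)]
print(nearest_neighbor_route(stops))
```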
Many algorithms operate as black boxes, which means they require very little understanding by the user of how they work. You don’t care how you got the best seats, you just want the best seats! You can think of each algorithm as a box where inputs go in and outputs come out, but outside it is painted black so you can’t tell what is going on inside. Common examples of black box algorithms include recommendation systems on Netflix or Amazon, matching on online dating sites, and content moderation on social media.
When you think about using tools to get your work done faster, you should start by discovering all the off-the-shelf options available to you.
Another strategy to get to a solution quicker when faced with a hard situation is to reframe the problem. Consider a central problem faced by Disney World: long lines. Most rides have limited seating, so the only way to move more people through the ride in the same amount of time is to build more apparatuses for the same ride. That’s expensive, involves closing the ride for a substantial period, and may not even be realistically possible given space constraints. But what if the problem was reframed not as “How do we move people through the line faster?” but “How do we make people happier while they wait in line?”
The most successful (and adaptive) people and organizations are constantly refining how they work and what they work on to be more effective.
“Culture eats strategy for breakfast” is a warning that if you embark on a strategy that is in opposition to your organization’s culture, the strategy is very unlikely to succeed, because culture has much more inertia than strategy.
Being an expert in an area that is about to hit a tipping point is an advantageous position, since your expertise has increasing leverage as the idea or technology takes off.
Being early is good, but you don’t want to be too early and risk ruin.
How then does one determine that sweet spot just before a new technology reaches the tipping point?
I wonder if much of this is based on faith.
When you critically evaluate a study (or conduct one yourself), you need to ask yourself: Who is missing from the sample population? What could be making this sample population nonrandom relative to the underlying population?
For example, if you want to grow your company’s customer base, you shouldn’t just sample existing customers; that sample doesn’t account for the probably much larger population of potential customers, who may behave very differently from your existing customer base.
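A toy simulation, with entirely invented usage numbers, makes the danger concrete: surveying only existing customers can badly misestimate the much larger pool of potential customers.

```python
import random

random.seed(42)

# Hypothetical populations: existing customers skew toward heavy users,
# while the (much larger) pool of potential customers does not.
existing = [random.gauss(8, 2) for _ in range(1_000)]      # avg ~8 uses/month
potential = [random.gauss(2, 2) for _ in range(100_000)]   # avg ~2 uses/month

sample = random.sample(existing, 200)  # surveying only existing customers
print(f"Biased sample mean:      {sum(sample) / len(sample):.1f} uses/month")
print(f"Potential-customer mean: {sum(potential) / len(potential):.1f} uses/month")
```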
The improbable should not be confused with the impossible. If enough chances are taken, even rare events are expected to happen. Some people do win the lottery and some people do get struck by lightning. A one-in-a-million event happens quite frequently on a planet with seven billion people.
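The arithmetic is worth making explicit. A few lines of Python show the expected count, assuming (hypothetically) that each person independently faces a one-in-a-million chance of the event on a given day:

```python
# Expected daily occurrences of a one-in-a-million event across 7 billion people
population = 7_000_000_000
p = 1 / 1_000_000
print(f"Expected events per day: {population * p:,.0f}")  # -> 7,000
```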
Any isolated experiment can result in a false positive or a false negative and can also be biased by myriad factors, most commonly selection bias, response bias, and survivorship bias.
Thought experiments are particularly useful in scenario analysis. Posing questions that start with “What would happen if …” is a good practice in this way: What would happen if life expectancy jumped forty years? What would happen if a well-funded competitor copied our product? What would happen if I switched careers?
Another helpful lateral-thinking technique involves adding some randomness when you are generating ideas. For example, you can choose an object at random from your surroundings or a noun from the dictionary and try to associate it in some way with your current idea list, laterally forming new offshoot ideas in the process.
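Here is a minimal Python sketch of the technique, using a tiny stand-in word list (any dictionary file or corpus would work just as well):

```python
import random

# A stand-in word list; swap in a real dictionary file for more variety.
nouns = ["anchor", "garden", "mirror", "bridge", "lantern", "compass", "river"]

idea = "improve our onboarding flow"  # hypothetical current idea
prompt_word = random.choice(nouns)
print(f"Random prompt: '{prompt_word}' -> how might '{idea}' relate to a {prompt_word}?")
```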
There are many ways to manage groupthink, though, including setting a culture of questioning assumptions, making sure to evaluate all ideas critically, establishing a Devil’s advocate position (see Chapter 1), actively recruiting people with differing opinions, reducing leadership’s influence on group recommendations, and splitting the group into independent subgroups.
When tempted to use a pro-con list, consider upgrading to a cost-benefit analysis or decision tree as appropriate.
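A decision tree, at its core, just weights each outcome’s payoff by its probability. Here is a minimal Python sketch with invented probabilities and payoffs, comparing a safe option against a risky one:

```python
# A minimal expected-value decision tree; the options, probabilities,
# and payoffs below are all hypothetical.
options = {
    "safe job": [(1.0, 100)],             # (probability, payoff)
    "startup":  [(0.2, 400), (0.8, 20)],  # big upside, but a likely small outcome
}

for name, outcomes in options.items():
    ev = sum(p * payoff for p, payoff in outcomes)
    print(f"{name}: expected value = {ev:.0f}")
# safe job: 100; startup: 0.2*400 + 0.8*20 = 96 -> the tree makes the trade-off explicit
```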
Strive to understand the global optimum in any system and look for decisions that move you closer to it.
Authorities are often more knowledgeable of the facts and issues in their area of expertise, but even then, it is important to go back to first principles and evaluate their arguments on merit. In the words of astrophysicist Carl Sagan, from his book The Demon-Haunted World: “One of the great commandments of science is, ‘Mistrust arguments from authority.’ … Too many such arguments have proved too painfully wrong. Authorities must prove their contentions like everybody else.”
While mid-career professionals are unlikely to take up babysitting for extra cash, they are likely to babysit for free when a friend is in a pinch. The first scenario is framed from a market perspective (“Would you babysit my kids for fifteen dollars an hour?”) and the second is framed from a social perspective (“Can you please do me a favor?”). The difference in the way this situation is framed can be thought of as social norms versus market norms and draws on the concept of reciprocity.