Kindle Notes & Highlights
When people reflect on what it takes to be mentally fit, the first idea that comes to mind is usually intelligence. The smarter you are, the more complex the problems you can solve—and the faster you can solve them. Intelligence is traditionally viewed as the ability to think and learn. Yet in a turbulent world, there’s another set of cognitive skills that might matter more: the ability to rethink and unlearn.
This is known as the first-instinct fallacy: it’s not so much changing your answer that improves your score as considering whether you should change it.
Part of the problem is cognitive laziness. Some psychologists point out that we’re mental misers: we often prefer the ease of hanging on to old views over the difficulty of grappling with new ones.
We favor the comfort of conviction over the discomfort of doubt.
If you’re a firefighter, dropping your tools doesn’t just require you to unlearn habits and disregard instincts. Discarding your equipment means admitting failure and shedding part of your identity. You have to rethink your goal in your job—and your role in life.
Our ways of thinking become habits that can weigh us down, and we don’t bother to question them until it’s too late.
Adapting to a changing environment isn’t something a company does—it’s something people do in the multitude of decisions they make every day.
By 1980, medical knowledge was doubling every seven years, and by 2010, it was doubling in half that time.
We need to develop the habit of forming our own second opinions.
We go into preacher mode when our sacred beliefs are in jeopardy: we deliver sermons to protect and promote our ideals. We enter prosecutor mode when we recognize flaws in other people’s reasoning: we marshal arguments to prove them wrong and win our case. We shift into politician mode when we’re seeking to win over an audience: we campaign and lobby for the approval of our constituents. The risk is that we become so wrapped up in preaching that we’re right, prosecuting others who are wrong, and politicking for support that we don’t bother to rethink our own views.
We move into scientist mode when we’re searching for the truth: we run experiments to test hypotheses and discover knowledge.
Hypotheses have as much of a place in our lives as they do in the lab. Experiments can inform our daily decisions. That makes me wonder: is it possible to train people in other fields to think more like scientists, and if so, do they end up making smarter choices?
The training for both groups was identical, except that one was encouraged to view startups through a scientist’s goggles. From that perspective, their strategy is a theory, customer interviews help to develop hypotheses, and their minimum viable product and prototype are experiments to test those hypotheses. Their task is to rigorously measure the results and make decisions based on whether their hypotheses are supported or refuted.
Over the following year, the startups in the control group averaged under $300 in revenue. The startups in the scientific thinking group averaged over $12,000 in revenue.
Why? The entrepreneurs in the control group tended to stay wedded to their original strategies and products.
The entrepreneurs who had been taught to think like scientists, in contrast, pivoted more than twice as often. When their hypotheses weren’t supported, they knew it was time to rethink their business models.
We associate conviction with being decisive and certain. Yet evidence reveals that when business executives compete in tournaments to price products, the best strategists are actually slow and unsure. Like careful scientists, they take their time so they have the flexibility to change their minds. I’m beginning to think decisiveness is overrated . . . but I reserve the right to change my mind.
Mental horsepower doesn’t guarantee mental dexterity. No matter how much brainpower you have, if you lack the motivation to change your mind, you’ll miss many occasions to think again.
Research reveals that the higher you score on an IQ test, the more likely you are to fall for stereotypes, because you’re faster at recognizing patterns. And recent experiments suggest that the smarter you are, the more you might struggle to update your beliefs.
The better you are at crunching numbers, the more spectacularly you fail at analyzing patterns that contradict your views.
In psychology there are at least two biases that drive this pattern. One is confirmation bias: seeing what we expect to see. The other is desirability bias: seeing what we want to see. These biases don’t just prevent us from applying our intelligence. They can actually contort our intelligence into a weapon against the truth.
My favorite bias is the “I’m not biased” bias, in which people believe they’re more objective than others. It turns out that smart people are more likely to fall into this trap. The brighter you are, the harder it can be to see your own limitations. Being good at thinking can make you worse at rethinking.
DON’T STOP UNBELIEVING
If knowledge is power, knowing what we don’t know is wisdom.
Our convictions can lock us in prisons of our own making. The solution is not to decelerate our thinking—it’s to accelerate our rethinking.
Research shows that when people are resistant to change, it helps to reinforce what will stay the same. Visions for change are more compelling when they include visions of continuity. Although our strategy might evolve, our identity will endure.
The curse of knowledge is that it closes our minds to what we don’t know. Good judgment depends on having the skill—and the will—to open our minds.
Anton’s syndrome. We all have blind spots in our knowledge and opinions. The bad news is that they can leave us blind to our blindness, which gives us false confidence in our judgment and prevents us from rethinking.
We need to learn to recognize our cognitive blind spots and revise our thinking accordingly.
According to what’s now known as the Dunning-Kruger effect, it’s when we lack competence that we’re most likely to be brimming with overconfidence.
In a series of studies, people rated whether they knew more or less than most people about a range of topics like these, and then took a quiz to test their actual knowledge. The more superior participants thought their knowledge was, the more they overestimated themselves—and the less interested they were in learning and updating.
If you think you know more about history or science than most people, chances are you know less than you think.
As Dunning quips, “The first rule of the Dunning-Kruger club is you don’t know you’re a member ...
There’s a less obvious force that clouds our vision of our abilities: a deficit in metacognitive skill, the ability to think about our thinking. Lacking competence can leave us blind to our own incompetence. If you’re a tech entrepreneur and you’re uninformed about education systems, you can feel certain that your master plan will fix them.
When we lack the knowledge and skills to achieve excellence, we sometimes lack the knowledge and skills to judge excellence.
We’re also prone to overconfidence in situations where it’s easy to confuse experience for expertise, like driving, typing, trivia, and managing emotions.
Yet we underestimate ourselves when we can easily recognize that we lack experience—like painting, driving a race car, and rapidly reciting the alphabet backward.
Confidence is a measure of how much you believe in yourself. Evidence shows that’s distinct from how much you believe in your methods. You can be confident in your ability to achieve a goal in the future while maintaining the humility to question whether you have the right tools in the present. That’s the sweet spot of confidence.
We can be consumed by an inferiority complex when we know the right method but feel uncertain about our ability to execute it. What we want to attain is confident humility: having faith in our capability while appreciating that we may not have the right solution or even be addressing the right problem. That gives us enough doubt to reexamine our old knowledge and enough confidence to pursue new insights.
In one experiment, when students read a short article about the benefits of admitting what we don’t know rather than being certain about it, their odds of seeking extra help in an area of weakness spiked from 65 to 85 percent.
The most effective leaders score high in both confidence and humility. Although they have faith in their strengths, they’re also keenly aware of their weaknesses. They know they need to recognize and transcend their limits if they want to push the limits of greatness.
The first upside of feeling like an impostor is that it can motivate us to work harder.
Second, impostor thoughts can motivate us to work smarter. When we don’t believe we’re going to win, we have nothing to lose by rethinking our strategy.
Third, feeling like an impostor can make us better learners.
“Learning requires the humility to realize one has something to learn.”

