Thinking 101: How to Reason Better to Live Better

Psychologist Woo-kyoung Ahn devised a course at Yale University to help students examine the biases that cause so many problems in their daily lives. Called “Thinking,” the course quickly became one of the university’s most popular. In Ahn’s class, students examine “thinking problems”—such as confirmation bias, causal attribution, and delayed gratification—and how they contribute to our most pressing societal issues and inequities.

Thinking 101 draws on decades of research from other cognitive psychologists, as well as from Ahn's own teaching and groundbreaking studies. She presents it all in a compellingly readable style that uses fun examples from K-pop dancing, anecdotes from her own life, and illuminating stories from history and the headlines. As Thinking 101 shows, with better awareness of our biases, we can improve our lives and tackle real-world problems. It is, quite simply, required reading for everyone who wants to think—and live—better.

288 pages, Hardcover

Published September 13, 2022

474 people are currently reading
11,697 people want to read

About the author

Woo-Kyoung Ahn

10 books · 51 followers
WOO-KYOUNG AHN is the John Hay Whitney Professor of Psychology at Yale University. After receiving her Ph.D. in psychology from the University of Illinois, Urbana-Champaign, she was assistant professor at Yale University and associate professor at Vanderbilt University. In 2022, she received Yale’s Lex Hixon Prize for teaching excellence in the social sciences. Her research on thinking biases has been funded by the National Institutes of Health, and she is a fellow of the American Psychological Association and the Association for Psychological Science.

Community Reviews

5 stars: 375 (24%)
4 stars: 636 (42%)
3 stars: 403 (26%)
2 stars: 84 (5%)
1 star: 15 (<1%)
Displaying 1 - 30 of 184 reviews
Frank Calberg
188 reviews · 65 followers
August 30, 2025
Takeaways from reading the book:

Chapter 1: The allure of fluency.
- Page 7: The "not me" bias is the belief that while others may have certain cognitive biases, we ourselves are immune.
- Page 10: Watching Michael Jackson do the moonwalk 20 times without practicing does not make you a better moonwalker than someone who has only seen him do it once.
- Page 16: Having access to expert information engenders overconfidence. It makes people think they are more knowledgeable than they really are - even about topics they have not googled.

Chapter 2: Confirmation bias.
- Page 42: Confirmation bias is the tendency to confirm what we already believe.
- Page 58: If a person believes that she or he is depressed, this belief may lead to more depressive thinking, for example by causing her or him to stop doing things that are fun.
- Page 58: A person who underestimates himself or herself may avoid going after opportunities. And a person who overestimates himself or herself may forget what he or she is not good at.
- Page 44: To avoid confirmation bias, be open-minded, for example about the possibility that there are other or more reasons for a problem.
- Page 73: To avoid confirmation bias, try doing things you have not done before and/or do things in new ways. Examples: 1. Choose a new route when you go somewhere. 2. Order something you do not know from a restaurant menu. 3. Go to events where you meet people you have never met before.

Chapter 3: The challenge of causal attribution. Why we shouldn't be so sure when we give credit or assign blame.
- Page 80: Most people understand correctly that climate change is caused by a myriad of human activities as well as natural disasters interacting with Earth's atmosphere over time.
- Page 81: Sometimes small causes produce large effects. As an example I came to think of small acts of kindness.
- Page 89: People's causal attributions for the same event often diverge. Deciding what counts as normal and abnormal can vary depending on one's perspective.
- Page 93: Inaction is not always better than a bad action. Example: A person who does not vote takes away a vote from a cause or a person that might change people's lives for the better.
- Page 95: When we give too much credit to the most recent event, we ignore / discredit other factors that are responsible for the outcome.
- Page 99: We tend to ask more why questions when negative events occur.
- Page 99: When you think of reasons why a problem happened, try to take a different person's perspective. That will help you feel less anger.
- Page 101: We can never find out the true causes of any outcome.
- Page 101: When you stop thinking of why a problem happened, you can take a more distant view. That will help you to free yourself from feeling negative emotions.

Chapter 4: The problem of examples. What we miss when we rely on stories.
- Page 106: Concrete examples are more powerful than abstract descriptions. They stick in our minds much better, and they are more convincing.
- Page 108: The main reason we are not convinced by statistics is that most of us do not fully understand them.
- Page 108: The more observations we make, the more accurately we can make predictions about the future.

Chapter 5: Negativity bias. How our fear of loss can lead us astray.
- Page 143: If having passion and being really good at something is important, why are students with uniform / homogeneous grades prioritized over students with varying grades when their grade averages are similar? Because discipline is valued more highly than having passion for something.
- Pages 154 and 159: People hate to lose what they own - even when they have owned it for only a brief moment. Example: A free 30-day membership will lead some people to buy a membership once the 30 days are up.
- Page 155: Negativity bias exists because our ancestors lived close to the margins of survival, where losing something meant dying, so they had to prioritize the prevention of potential losses.
- Page 156: Negativity bias draws our attention to things that need to be fixed.
- Page 156: Negative signals that demand our attention include 1) difficulty breathing, 2) difficulty walking, 3) bad grades in school, 4) a crying baby.
- Page 157: When patients with lung cancer were told they had a 90% chance of surviving if they underwent surgery, more than 80% of them opted for the operation. When patients were told they had a 10% chance of dying after surgery, only 50% opted for the operation. Patients should be presented with both framings, so their decisions are not swayed by either the negativity bias or the positivity bias.
- Page 161: Loss aversion is one reason why our homes are full of stuff we do not use.

Chapter 6: Biased interpretation. Why we fail to see things as they are.
- Page 170: Countless studies demonstrate bias based on every kind of -ism one can think of. Examples: Ageism. Classism. Ethnicism. Racism. Sexism.
- Page 191: Sometimes the only way to counteract one system is with another - one that is explicitly, equitably and intentionally designed to protect the greater good.
- Page 200: Once you know something, you have trouble taking the perspective of someone who does not know it.

Chapter 7: The dangers of perspective-taking. Why others don't always get what is obvious to us.
- Page 213: Putting yourself in another person's situation, for example the situation of a refugee from a war-torn country, can increase prosocial behaviors.
- Page 219: The only sure way to know what other people know, believe, think or feel is to ask them.

Chapter 8: The trouble with delayed gratification. How our present self misunderstands our future self.
- Page 226: When we recycle to reduce waste or plant trees to capture carbon, we are not immediately rewarded with cleaner air and lower sea levels. It takes years to reap the rewards.
- Page 230: The longer children could wait for a second marshmallow in the marshmallow test, the better they did on SAT tests at the end of high school.
- Page 230: In the marshmallow test, if children were given a toy to play with, their wait time for the second marshmallow increased.
- Page 239: Whenever we are faced with a choice that involves delayed gratification, there is the possibility that our preference for certainty, i.e. getting it now, vs. uncertainty, i.e. getting it in the future, will be a factor.
- Page 239: To cope with fear of uncertainty, one solution is to boost our confidence about the future.
- Page 240: To have faith in the future, remind yourself of a time when you had the power to make a difference - in your own life or in the life of another person.
- Page 242: Think about future events with as much detail as you can.
- Page 243: Thinking about your future self and good things that can happen to you in the future helps you to choose delayed gratification.
Gretchen Rubin
Author · 42 books · 134k followers
Read
May 16, 2022
A very engaging, readable, and powerful examination of how we can think more clearly.
Carin
Author · 1 book · 113 followers
September 6, 2022
I actually hadn’t been planning on reading this book, even though the author made a super fun and interesting presentation to us, but it came up in conversation with my husband, who started reading it that night, and I figured I would too. After all, it truly does impact everyday life and just regular conversations daily.

We all think we’re logical. But we’re not. Our brains aren’t designed that way. They’re designed to make quick decisions based on available information. Now, I am the child of an economist who taught game theory, and he taught some of that to me, so I do feel like I’m more logical than most, but that’s a fallacy too (in fact, I was tickled when she pointed out that for decades the entire foundation of the field of economics was based on the bizarre belief that people always make the most logical decision. Had economists never met people? I mean, I see that disproved daily.) This whole book is about thinking fallacies, from common ones like anecdotal evidence to ones you haven’t heard of as much, like the fluency fallacy or the recency fallacy – but I promise, when you read each section you’ll immediately think of a time when this happened in your life. For example, there’s a fallacy where we tend to blame things we (think we) have control over. Recently my husband’s car broke down in the drive-thru of a Walgreens. He immediately blamed himself, saying that the car was hesitating on starting, and he had therefore tried to put it in gear too quickly, and by doing that he somehow broke it. Nope. It was the battery. He didn’t do anything to the car.

This book is super readable and surprisingly fun, and it will help us all think better, plan better (one fallacy is that you will have more time to do things than you really will), and argue better. How can you say no to a short book that will accomplish all that?
Stetson
511 reviews · 311 followers
March 16, 2024
Thinking 101 is another unneeded addition to the genre of books on cognitive biases. This genre reached maturity with the publication of Thinking, Fast and Slow in 2011. More than a decade later, it is time for a moratorium.

It is to be expected that pop psych books couched in part as self-help will be a mainstay in nonfiction publishing, but these shouldn't be authored by active research scientists in psychology. It dilutes the purported gravity of the actual research science, especially if the ideas are communicated in superficial ways.

Ahn has chosen accessibility and engagement over rigor in Thinking 101. We are provided a brief tour of the usual cognitive biases: the illusion of explanatory depth (fluency bias), availability bias, confirmation bias, loss aversion, and common pitfalls in causal inference. This is well-trod ground, and the delivery is somewhat glib. Instead of a survey of well-known cognitive biases, Ahn could have better served readers by doing a deep dive on causal inference, since this is apparently one of her active areas of research. Instead, this opportunity was missed.

Apparently, it is okay for top academics to pass off recycled ideas with some new anecdotes or workshopped opinions on related rhetorical discussion points as books worth reading. Of course, any halfway decent writer familiar with a topic could pull this off. Why does an Ivy League professor need to fill the publishing space with this sort of thing? When superficial work like this is delivered by an academic, it is safe to assume it is done primarily for promotional and professional reasons. Whether the book communicates something meaningful from psychological research becomes secondary to the goal of making the author an intellectual brand. Academic authors of wholly derivative and superficial works are doing readers a disservice.

Finally, this book is mistitled. This isn't a book about how to deliberately reason through logical problems. It is a book about the shortcuts that our brains have evolved to use in order to arrive at effective, low-cost answers in high-stakes situations.

Maybe I am being too hard on this book. Maybe it is okay for young readers who are novices to the subject. Nonetheless, it seems like a work that is doing a lot more for the author than the readers.
Read By Kyle
566 reviews · 446 followers
January 14, 2023
This book is excellent and is my new go-to recommendation for this subject. It's not that I learned much from it; I actually maybe learned nothing? Or very little. But I treat books like these as mental pushups, keeping myself sharp on the various biases and tricks humans use to convince themselves they're rational beings, and I think this one does an absolutely great job of keeping everything very simple for the reader, giving multiple examples that are very clear, without going overboard on repeating information. It also covers quite a lot of ground and talks about a lot of the most important experiments in cognitive science, what we can extrapolate from them, and even some things that we can't.

If this book was required reading in school, the world would be a better place.
Ari Partrich
22 reviews
July 22, 2023
Finally writing an in-depth review, because this is info I’m somewhat familiar with and I’m not totally out of my depth in reviewing it.

This book was assigned to me in a college course and presented as basic information about biases. It was, and I have previously taken courses in the same vein as this book so I wanted to see if this could add anything new. And… eh. I’ll concede that about 3 chapters were entirely new information to me, but as a whole, this work, for what it is, didn’t pull me in despite the author trying to be relevant (if anything, that’ll end up hurting its shelf life). It’s generally uncritical and doesn’t really touch upon the conditions that create and accentuate our biases. There’s a COVID-19 example in the first chapter that is just one long, atrocious generalization that left my margins filled with question marks.

If you want an intro college course about bias (emphasis on intro) compressed into ~250 pages, this is for you. It’s the various prominent types of bias, their definitions, and examples. It’s thought-provoking for people who have never heard of the terms presented, but not really for anyone else. Some examples are kind of weak and, unfortunately, the definitions do not go any further than how biases function on an individual scale. Bias necessitates at least two actors and is something cultivated by one’s environment, and that essentially cuts through most of the logic pervading the wording of the definitions and the examples that follow. This makes it feel like a very self-helpy read in a negative way because, as with most self-help books, its ‘strategies/solutions’ are in no way replicable for most people. They generally just reach an audience with similar life experiences. But what’s really the novel experience here? I wouldn’t call putting established terms to commonly felt experiences novel.

Honestly, the arguments within the examples for these well-established biases are so weak at points that I doubt even the target audience is resonating. Trying not to be harsh but there were some sections in my first couple progressions with the book that straight up were like:

______ Bias
-personal story, definition, and maybe a little history
-‘one study shows that x perception causes y bias’ … the course I’m reading this for is on research methods and lesson one is that one study is never enough
-simplifying for the reader by doing a syllogism that turns out to, somewhat comically, be premise-premise-totally unfounded conclusion
-another personal story about how knowledge of the bias created a strategy to go about a similar task as the initial personal story

And this just makes me wonder… when the target audience is so narrow, when the information is pre-established, and when the strategies don’t seem very effective on a wide scale, what is the aim of the book when these definitions are now google-able?
Eric
200 reviews · 34 followers
September 8, 2022
TL;DR

Dr. Woo-Kyoung Ahn’s Thinking 101 teaches us how we make mistakes in our thought process through clear definitions and many pertinent, interesting examples. By putting into place her strategies for combating biases, maybe we can make ourselves a little better and find a little more grace and patience for others. Highly recommended.

Disclaimer: The publisher provided a copy of this book in exchange for an honest review. Any and all opinions that follow are mine alone.

Review: Thinking 101 by Woo-kyoung Ahn

Prior to 2016, I believed that learning to argue would be a good way to change people’s minds about politics. I believed that well-constructed, logical arguments were the way to discuss politics. 2016 and the years since have changed my mind. A person’s politics are decided by emotion, with rational explanations supplied after the fact. It doesn’t matter which way the person leans – left or right – they are still going with their gut emotional reaction and then justifying it later. This has become even more apparent with the rise of QAnon and its smaller liberal counterpart. As I watched an acquaintance fall under the spell of QAnon and deny reality, I began to wonder how that happens. What thought processes lead a person to believe such disprovable lies? The only answer I could come up with was biases. People’s thoughts are biased in certain ways that logic cannot overcome. So, I set out to learn more about biases, and a book popped up on NetGalley that promised to teach just that. In Thinking 101, psychologist and Yale professor Woo-kyoung Ahn draws on her extensive research into the human mind to demonstrate how we think and how we can improve our thinking. Ahn draws from her Yale course and from the thinking lab that she leads. This book won’t free us from biases, but we can make better decisions if we’re more aware of them.

Professor and psychologist Woo-kyoung Ahn has written an amazing book about how humans think. My review copy had eight chapters that cover the following biases: Fluency, Confirmation Bias, Causal Attribution, Examples, Negativity Bias, Biased Interpretation, Perspective-Taking, and Delayed Gratification. Each chapter gives a definition, some examples, some complications, and strategies for counteracting the effects of the bias. Each chapter flows into the next, and as this isn’t a long book, readers will tear through it. I found each chapter to be fascinating, and I could probably write an essay on each chapter and what I learned from it. Reading about all these biases in a row could be daunting and potentially sad. But instead, I found myself chuckling and nodding along with Ahn’s writing. I easily saw myself in the pages of her book. Instead of seeing how poor my own thinking was, I saw how entirely human I am. Ahn’s book connected me to people in a way that I wasn’t prepared for. I had picked this book up to learn why other people think the way they do, but instead, I found my gaze turned upon myself. And my mental health is all the better for it.

As an example of what readers will experience in the book, let’s look at the first chapter: The Fluency Effect. The Fluency Effect gives us overconfidence. It makes us think we’ve got skill or knowledge that we don’t have. Think of the Dunning-Kruger effect. This happens to all of us. For me, I watched several YouTube clips of bakers making bread. It looked easy enough. So, I tried it. My first bread was basically a cracker. It was thin and crunchy instead of puffy and soft. It was my third bake before I got something that looked right. It was my fifth one that I was actually proud of. On that first attempt, I suffered from the fluency effect. Simply because I watched a video doesn’t mean I had the skills or experience to perform like the video. This is the fluency effect. Woo-kyoung Ahn suggests trying things out to determine your skill level, but she also describes pitfalls associated with making attempts. Ahn discusses the planning fallacy and how it plays into life. She even tells how it builds into conspiracy theories.

Ahn’s chapters give similar detail to the above paragraph. She even breaks down some biases, like causal attribution, into cues that influence the conclusions we draw. For example, in causal attribution, she says the cues of similarity, sufficiency and necessity, recency, and controllability all affect the causes we attribute to certain actions. But she does more than just classify and define. She gives numerous examples to make her lessons concrete, and the lessons are interesting. One thing you learn as a writer is that the specific detail is more relatable than the universal. Readers relate more to specific details: a blue porcelain vase instead of simply a vase, or a man with a bushy gray mustache, wrinkled forehead, and gray locks of hair in his face paints a better picture than simply an old man. Ahn uses this well; her examples are specific and targeted to each lesson she’s giving us. The examples in each chapter are what make this book so useful to me and why I recommend it for anyone. Her years of experience teaching come through the page, and it’s clear that she’s a very good teacher.

Confirmation Bias

Of all the biases she discusses, confirmation bias is the one I see happen most often. Whether in politics, medicine, video game or book reviews, personal job performance, etc., we all fall into the trap of confirmation bias. This bias is where we seek information that confirms our suspicions rather than disproves them. Ahn recommends a strategy for confirmation bias: when forming a hypothesis, seeking confirmation is not enough. We must also seek what would disprove our hypothesis. If we can disprove the hypothesis, then it must be wrong. If we can’t, it may be correct. Her examples of how this hurts us and hurts society are clear and pointed. I loved them. They drive home the need to note our confirmation biases.

In her chapter on confirmation bias, she also discusses maximizers versus satisficers. Maximizers are people who are always on the lookout for something better. They want to pull the maximum satisfaction out of life and, even if they’re happy in whatever they’re doing, they’re still looking for something better. Satisficers are people who end their search when they are satisfied enough instead of continuing to look for the absolute best. I had never heard these terms before, but based on Ahn’s descriptions, I knew exactly which one I was. She says this is a maladaptive effect on our lives. I can confirm. I love my job; yet, I always am researching other fields, such as law, economics, programming, etc. I have no idea why other than the fear of missing out on the best job I can have. It is maladaptive because I never feel good in my position despite the fact that I love my job.

Biases Aid Structural Racism

Ahn throws in examples of how biases can lead to unhealthy and also racist results. I enjoyed her use of Bayes’ theorem to show why conflating terrorism with being Muslim is not statistically sound. She describes how biased interpretations perpetuate prejudice. But the most damning thing she says is that contradictory evidence doesn’t make us change our minds. Instead, evidence that contradicts us leaves us even more polarized. People double down on a belief when presented with evidence that the belief is wrong or misguided. They will make excuses as to why the contradictory evidence exists but doesn’t apply to them. This is all over the place in today’s U.S. political scene. Despite the fact that there’s no basement in Comet Pizza, QAnon believers still insist there’s a child abuse ring underneath that restaurant.
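The base-rate point behind that Bayes' theorem argument is easy to check for yourself. Here is a minimal sketch with numbers invented purely for illustration (they are not the book's figures): even if the probability of membership in a group given that someone is a terrorist were high, the reverse probability collapses once base rates enter the calculation.

```python
# Hypothetical numbers, invented for this sketch (not from the book), to show
# why P(terrorist | group member) is nothing like P(group member | terrorist).

p_group = 0.25                # assumed share of the population in the group
p_terrorist = 1e-6            # assumed base rate of terrorists in the population
p_group_given_terrorist = 0.8 # assumed, deliberately high, for illustration

# Bayes' theorem: P(T | G) = P(G | T) * P(T) / P(G)
p_terrorist_given_group = p_group_given_terrorist * p_terrorist / p_group

print(p_terrorist_given_group)  # roughly 3 in a million
```

Even under these deliberately unfavorable assumptions, the conditional probability flips from 80% in one direction to about 0.0003% in the other, which is the statistical unsoundness the review is pointing at.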

This is why educating people about racism will never solve racism. White people are very good at making excuses for why they’re not racist, or claiming that because America freed the slaves or elected Barack Obama, the U.S. isn’t racist. The Supreme Court of the United States used this very logic to gut the Voting Rights Act in 2013. No matter how many times systemic bias is demonstrated, individuals rationalize it away with excuses.

How Does This Help?

This book claims to be able to help improve our daily thinking. So, did it work? Have my thought processes improved? Yes, I think they have. I wouldn’t say my thought processes have changed, but I’m more aware of my own internal biases and how they’re working. The book didn’t promise to free me from my biases; it can’t do that. However, I can be more aware of what affects my decision-making process. I definitely know that I suffer from confirmation bias, but through this book, I learned about the planning fallacy. It’s something I’ve learned the hard way that I suffer from. I’d never put a name on it, and it’s only been in the last few years, because of work, that I’ve begun to consider it. At first, I thought it was rose-colored glasses and optimism. Now, I see that it’s an actual cognitive process.

The main benefit I’ve gotten from reading this book is a bit of stress relief. I’m giving myself a little more room to make mistakes because despite my best efforts, there will be biases that I cannot overcome. Part of Ahn’s book talks about people knowing they have these biases and yet falling prey to them anyways. Knowing that I have biases slows down my thinking and gives me some grace towards myself.

In addition, Ahn includes strategies for overcoming biases, and a lot of them boil down to putting yourself in another person’s place. Empathy is the strategy to combat biases, and I love this. Cultivating empathy is something this nation needs.

Can I Use This Book to Win Arguments?

Yes. This book has a number of strategies that could be used to manipulate others. But that’s not the book’s intended use, and, in fact, manipulation goes against what Ahn is trying to do with her book. Yes, you can use this book to win arguments, to play upon people’s biases for your benefit, but that doesn’t help your thinking. Using it in such a way is the opposite of empathy.

Ahn’s project isn’t for us to connect with other people through manipulating their ability to think. It’s for us to connect with ourselves by cultivating empathy for ourselves and for others. It’s to de-center ourselves and attempt to view our thoughts from a different perspective. This allows us to view self-criticism in a new light. Often we criticize ourselves way more harshly than we would a friend or, even, a complete stranger. By looking at this criticism as if it were applied to a different person, we can begin to give ourselves a break.

But we’re also training ourselves to see alternate perspectives. If you wish to win arguments, then learning to view a subject from your opponent’s perspective is key. By having empathy for them, you can begin to understand how they’ve formed the conclusions they’ve made. It doesn’t mean you have to agree with those conclusions, simply that you can see where they’re coming from. This allows you to structure your arguments and responses in manners best suited to them, not yourself. Debates can’t be won by laying down a card declaring your opponent’s bias. Debates are won by getting the other side to agree with you or, at least, to consider your arguments as having merit. The best way to do that is to consider your opponent’s thought process and to consider their biases. Thinking 101 will help you do that.

Conclusion

Woo-Kyoung Ahn’s Thinking 101 taught me a lot about myself. I originally requested it to learn how other people think, but instead, I learned how my mind works. Knowing that bias is an inherent part of being human and that there are strategies to work on my biases made me feel better. I plan to work on these, particularly the overthinking and rumination associated with Causal Attribution. My mental health will be better for it. If you read this, I think yours will be too. Highly Recommended.

Thinking 101 by Woo-Kyoung Ahn is available from Flatiron Books on September 13th, 2022.
Philip
434 reviews · 65 followers
October 1, 2023
"Thinking 101" is a book for anyone and everyone who has or listens to opinions, and who consumes news, statistics, or social media posts. In short, everyone.

It's a book that started in the author's self-doubt and need for a purpose - hence the "Live Better" bit in the title, and she makes a solid argument for that little add-on too - but it is so much more than one person's search for a meaning. It's a book to remind you of humanity's and your own cognitive flaws - fallacies, bias, misunderstandings, and a fucked-up rewards system.

I'm sure you've already read some and heard plenty on the topic, and this book adds very little - if anything - new. But if there is one topic that constantly needs reinforcing it is this one, and "Thinking 101" is a very solid addition to the corpus!

While I won't go into too much of the content of the book since a) most will likely be familiar with it in one form or another, and b) you can read Frank's review if you want more details, I definitely recommend picking this one up - especially if you've never read anything like it before.

However, there are a few negatives in the book. Chief among them for me: there are a few personal value/moral statements used as examples that I think the subject matter would have been better off without. Not because I think she's necessarily right or wrong, but because she doesn't allow herself the space to expand upon or support them; she states them as simple this-is-how-it-is statements - which I find more than a bit odd considering what the book deals with. I think the purpose of the book would have been better served had the author stayed away from those.

This is also very much an introductory read that frequently just skims the surface. Which isn't a negative if this is a reader's first foray into the subject, but it does leave more to be wished for at times.

Nonetheless, a very strong 3-stars for me, recommended!
54 reviews
January 19, 2023
She had a great idea for writing this book. She gathered great and key concepts in psychology and cognitive & behavioral sciences, and massively failed when choosing her examples and how to explain them. I am a counselor and have a bachelor's in psychology, so I am familiar with many of the terms and concepts she explained, and I am so sad that she made this so personal and political. And mind you, I do share most of her views. But she missed the opportunity to create a great book for everyone. The amount of COVID talk is so unnecessary. She missed the chance to teach those topics to everyone, but because she chose to explain her points with such political matters, I am positive many readers didn’t even finish her book. Her Amazon reviews are massively about how political this book is.
And it’s sad because she’s a good writer with great material to teach.
Profile Image for Jacob Williams.
608 reviews18 followers
October 3, 2022
I like reading books about cognitive biases periodically, even if it's mostly familiar material. I'm hopeful I'll eventually get better at detecting when they're clouding my judgment. (I don't think it's working yet.) This is a solid entry in the genre, though not my favorite.

Explanations. Ahn tries not just to describe the biases we have, but to explain why we have them. These explanations seem speculative but they are fascinating:


- Illusion of fluency: Why do we overestimate our ability to do something after we've read about it or watched others do it? Because our brains need a way of "knowing whether [we] know something", and one heuristic we use for that is how familiar we are with it. But when (as in an example Ahn gives) we merely watch someone perform a dance, we become familiar with the dance without actually building the skill of performing it, so the heuristic misfires.
- Confirmation bias: Why do we seek evidence that would confirm what we already believe, when seeking evidence to disconfirm it is more helpful in actually finding the truth? It "might be a side effect of meeting our need to satisfice, stopping our search when it's good enough in a world that has boundless choices."
- Loss aversion: Why are we often more motivated to keep a thing once we have it, than we would be to acquire the same thing if we didn't? Possibly "our ancestors lived so close to the margins of survival, where losing something meant dying, so they had to prioritize the prevention of potential losses."


Strategies. Here are some of the book's suggestions for dealing with one's own biases. They seem sensible, though I was hoping for more.


- Confronting the illusion of fluency: Make yourself do the thing so you can assess your true competence level:
Some people think they're trying out skills when they're simply running the process in their heads and not using their physical muscles. When you imagine doing those dance steps or giving a presentation to your client, you are reinforcing the illusion. Everything flows smoothly in your mental simulation, feeding your overconfidence. You have to actually write down your presentation word for word and speak out loud using your tongue and vocal cords, or enact every movement of the dance using your arms, legs, and hips.


- Accounting for the planning fallacy: Encountering unforeseen obstacles is routine, so Ahn recommends assuming stuff will take you 50% longer than you expect. One strategy that she warns is not reliable is creating more detailed plans: she discusses one study in which "step-by-step plans exacerbated the effects of the planning fallacy" (it seems like participants were overly optimistic about the individual steps, and thus became overconfident about the plan as a whole), but another in which it helped.
- Pitting confirmation bias against itself: Since we naturally seek evidence in favor of whatever hypothesis we're considering, it's helpful "to consider not just one but two mutually exclusive hypotheses and try to confirm both." Relatedly:

...ask a question framed in two opposite ways. For instance, in thinking through how happy you are with your social life, you can ask yourself whether you are happy or whether you are unhappy. These two questions inquire about the same thing, and should elicit the same response—like "I'm sort of happy"—no matter how the question is framed. Yet if you ask yourself whether you are
unhappy, you are more likely to retrieve examples of unhappy thoughts, events, and behaviors. If you ask yourself whether you are happy, you are more likely to retrieve opposite examples.


- Using randomness to fight confirmation bias: Ahn mentions an app that software engineer Max Hawkins created that Ubers him to random locations. Randomly doing something different than what you normally do increases the odds that you'll find evidence disconfirming your existing beliefs. I like Hawkins' statement about his practice of "Randomized Living" on his website: "I believe giving in to chance is a way to liberate yourself from personal and social programming that traps you in a narrow sense of self."
- Admitting we're bad at perspective-taking: Ahn cites multiple studies showing how astonishingly bad we are at correctly interpreting other people's words and accurately imagining what the world looks like from their point of view. We suck at it, we suck at it worse than we think we do, and it's not clear that there's any quick or easy way to get better at it. Thus, she advises:

Stop letting others guess what we think and just tell them. ... Likewise, stop trying to read people's minds and feelings. If you are a compassionate and accommodating person, it is particularly hard to resist the temptation to guess others' thoughts. But study after study has shown us how disastrous this can be. The only sure way to know what others know, believe, feel, or think is to ask them.


- Motivating yourself to wait for delayed gratification: Imagining a future period of your life in detail can give you more self-control. Evidence for this includes a study that asked people "to list events they had planned over the next seven months" and one in which women "listened to audio recordings of their own musings on good things that could happen to them in the future".


Examples. Ahn uses several politically charged examples. That makes sense; politics is perhaps the area of life where clear thinking is most important, since poor decisions hurt not just individuals but entire societies. But it's also risky. For most contentious issues, I don't think you can really cover all the relevant considerations in a short discussion. Pointing out that one side is committing a particular fallacy and declaring the other side the winner is overly simplistic, and problematic for two reasons. Those who don't already agree with you will focus on the weaknesses or omissions in your discussion, and may dismiss your entire work as biased. Those who do already agree with you may develop the illusion that getting the right answers is easy, and subconsciously think of cognitive biases as primarily tools for explaining why the other political party is so stupid, rather than as problems that almost certainly affect their own thinking too in ways that are extremely difficult to detect and overcome.

In particular, consider Ahn's discussion of confirmation bias causing underrepresentation of women in science. She establishes the problem by citing a single anecdote about an award ceremony, not statistics. She makes broad statements like "society believes that men are better at science than women" and "[w]hen male students say something insightful during a seminar or in a class, they receive more compliments than female students who say similar things" without providing citations. Then to establish that this is depriving society of scientific advances, she again uses just a single anecdote (about the fact that the first page of results when you do a search for "scientists who developed COVID-19 vaccine" mostly returns women). The whole section comes across not as a serious investigation of a hypothesis, but as a recitation of the first few things that came to mind in support of a conclusion the author already believed in. To be clear, I'm not saying she's wrong. But sloppy arguments are dangerous even when the conclusions are correct. If someone who is skeptical of gender discrimination in science reads this part of the book, I suspect it will make them more skeptical, by making it easier for them to assume that concerns about gender discrimination are typically rooted in lazy thinking.

An example I did like was how job interviews encourage overreliance on a single sample:


...given that face-to-face interactions are vivid, salient, concrete, and memorable, interviewers think they are observing who the candidate truly is, rather than a biased portrayal of the person tinted by random factors. And this impression of a small sample of qualities on exhibit that particular day can make the decision-makers ignore the records that more accurately reflect the candidate's skills, demonstrated over many years. A person who looks amazing and brilliant during an interview may not be as awesome once they are hired. Given regression toward the mean, that is what we should, to some extent, expect. And a person who didn't perform brilliantly in an interview ... could turn out to be a big catch the company missed.


Studies. Some interesting studies Ahn references:


- Kardas & O'Brien 2018: Repeatedly watching a video of Michael Jackson do the moonwalk increased people's belief that they could do it, but not their actual ability.
- Fisher et al 2015: People who had recently used Google to answer some questions had more confidence in their answers to other, unrelated questions than people who hadn't.
- Wason 1960: Peter Wason's 2-4-6 task neatly demonstrates confirmation bias. (If you're not familiar with it, click here to try it.)
- Ahn et al 2019: College admissions officers valued the absence of B+'s more than the presence of A+'s when comparing students with identical GPAs.
- Fryer et al 2012: Promising teachers a $4000 bonus if their students' test performance improved was not effective. But giving them a $4000 bonus up-front which they'd have to pay back if student performance didn't improve had a significant effect. (Ahn isn't endorsing this as a policy recommendation, just using it as an illustration of negativity bias.)
- DeWall et al 2015: Acetaminophen removed people's tendency to assign a higher sale price to objects they own than objects they do not.
- Kahan et al 2017: When interpreting made-up data about the effects of a skin cream, people who had demonstrated higher numeracy were better at making correct inferences from the data than others. However, when the same data was labeled as being about the effects of gun control, Ahn notes that "people with stronger quantitative reasoning abilities used them only when the data supported their existing views."
- Savitsky et al 2011: People were no better at interpreting the meaning of ambiguous sentences when they were spoken by a friend or spouse than when they were spoken by a stranger.
- Birch & Bloom 2007: A complicated experiment demonstrated that adults, not just toddlers, have difficulty ignoring facts that they know but that someone else does not when trying to predict how that person will behave.
- Eyal et al 2018: 24 experiments showed that just mentally trying to view things from someone else's perspective doesn't help you reach accurate conclusions about them at all.
- Grosch & Neuringer 1981: Pigeons are more willing to choose a better delayed reward over a worse immediate reward if they have a superfluous task to distract themselves with during the waiting period.
Profile Image for Chris Boutté.
Author 8 books273 followers
March 27, 2023
I always have a book like this in my rotation because we always need refreshers on our thinking errors. I came across this new book and decided to give it a read. Woo-kyoung Ahn did an awesome job with this book. If you've read a lot of these books like I have, there won't be a lot that's new, but it's definitely different. She has a ton of anecdotes that make the topic more engaging and simple to understand. Other books mainly cover studies around the topic, and this one does too, but the stories make it easier to understand.

If you’re new to this topic and want to become a better thinker, this is the book for you.
Profile Image for David.
360 reviews
August 10, 2025
There is nothing new in this book that has not been rehashed repeatedly over the last decade or so. If you have never read anything on the subject, you may benefit from it, but I would still argue that Thinking, Fast and Slow and The Signal and the Noise are far better than this book.

The reason I gave this three stars rather than two or fewer is that the book provides some interesting insights into the mind of a Yale academic. We all reveal much of who we are when we speak and write—both intentionally and unintentionally. Her examples and tone, while covering the exact same material, reveal insight into the mind of someone in her position: a prime example of academia and much of the new modern bourgeoisie in the United States.

As a psychological and sociological study, this is a good book, reminding me of the insights found in works like Zakaria’s latest.
Profile Image for Jonathan.
984 reviews13 followers
December 28, 2022
2/10

Inconsequential, inconsistent and inconsiderate drivel. Ahn read Thinking, Fast and Slow, and assumes you didn't. This is easily one of the most derivative and downright dumb books I've read this year, and I've read some doozies.

She seems like a nice lady, but one who just learned something cool about psychology and went to share it with everyone before learning how to write or make something original.
Profile Image for Alex Edmundson.
36 reviews6 followers
May 31, 2023
I liked most of this book. Where it lost me is when she started on leftist talking points about things like COVID and global warming. Her COVID talking points were debunked before she wrote this book, but she still put them in there. If she didn't have that, I would have given it a much higher rating.
Profile Image for Amelia.
590 reviews21 followers
January 25, 2023
“Because we project our own knowledge and feelings onto others, we are overconfident and believe we know what they think. As a result, we don’t bother or we forget to verify whether our assumptions are correct. Gathering facts is the only sure way we have to understand each other.”

Woo-Kyoung Ahn teaches courses on how to think critically. By dispelling myths about hot topics within thinking--including fallacies and the like--through anecdotes, examples, and studies, she illuminates how the more facts we have, the better. Some concepts she describes include cognitive dissonance, confirmation bias, negativity bias, biased interpretation, and more. While at times she uses difficult math to further elaborate on some of these topics (for things such as probability or likelihood of events occurring), she condenses it to make sense for us armchair readers in a way that is both accessible and still academic at heart.

Beyond explaining the different biases we may hold, intentionally or otherwise, she argues that it is these biases that make it more difficult to understand one another on a systematic and personal level. Confirmation bias, for example, "proves" to us that an entire type of people (class, race, sex, otherwise) are all alike because one person from that group confirms a stereotype. Similarly, fear of being placed within that stereotype from confirmation bias interpreted by others causes others to reject assistance they very well may need (for example, a black woman may not want to go on welfare or unemployment due to the Welfare Queen stereotype).

In learning more about these biases, Ahn hopes we can better understand and navigate the world around us by asking more questions and looking for more facts. But be warned--there is also bias in knowing too much and constantly looking for more information or ruminating can impact the data we have gathered.

I think this book would pair nicely with Dark Data: Why What You Don’t Know Matters, which also discusses how there is more information, bias, and genuine ignorance when it comes to collecting data.
Profile Image for Kevin Haar.
Author 1 book7 followers
January 1, 2023
Super interesting and accessible breakdowns of inherent biases in human thinking and how to be aware of them and combat them. Ahn covers things like fluency bias, negativity bias, and confirmation bias. Through detailing experiments and numerous examples from her time as a professor at Yale, Ahn explains each of these biases and the effects they have on human thinking. She explains why we are programmed to think this way and what we can do to recognize our biases. Ahn believes that cognitive science can make the world a better place. In Thinking 101, she makes her case quite effectively. This book has helped me be aware of more of my own tendencies and strategies for how to overcome them.
Profile Image for Paul.
168 reviews
April 20, 2024
This book covers many topics you may have already found if you read or follow pop psychology. This is not bad since biases and entrenched ways of thinking are inherently challenging to overcome or recognize, even when you know about it. It is especially difficult to identify when you are having a conversation or argument, at least for me; others may have faster brains to analyze more on the fly. Afterward, though, it is good to identify where one's thinking may be flawed or biased. It's good to challenge our own beliefs, thought patterns, and sources. No, seriously, this fortress you have constructed out of your beliefs, occasionally knock out a couple of bricks and examine them to make sure they aren't made of cheese, sweet, stinky cheese. I hope more people will read this book. Unfortunately, as a cynic, people who most need to read this book won't. Tangent aside, the author does an excellent job of teaching this information and providing examples. Thinking 101 definitely made me think.
Profile Image for Sabrina.
81 reviews12 followers
February 2, 2023
I really appreciated the concepts Ahn shares in this book. She did a masterful job of taking ideas that could be complicated and breaking them down in a very accessible way. And with all the practical examples, her book is not only useful but also very engaging. Though I don't share all her viewpoints or agree with all her applications of the concepts, I would still highly recommend the book for learning how to think better.
18 reviews
August 21, 2023
Listened to the audio book. A lot of the concepts that are discussed are common and well-known so there wasn’t a lot of “new” information but everything was presented well and had a lot of statistics and examples to illustrate. It’s always good to get a reminder of the biases that influence our behavior so we can think about ways we want to address behaviors we view in ourselves as harmful.
Profile Image for Adrian Garcia.
22 reviews1 follower
February 25, 2024
3.5*

This book is supposed to be the equivalent of taking Prof. Ahn's "Thinking" class at Yale, and reading it does feel like sitting in a classroom. The material is very interesting and honestly quite useful, just not a super exciting read. I do think I'll use several of the strategies I learned from this book, though.
Profile Image for Josh Crutchfield.
4 reviews
November 7, 2022
This book was very good, very insightful. If you are into psychology and understanding why we (or others) think the way we do, the information shared in this book sheds a lot of light. The author is very knowledgeable in her field and uses personal examples from her own life and experiments that she has personally conducted in order to help the reader understand the topics discussed. Highly recommend!
168 reviews8 followers
March 1, 2024
Overall quite good! I don't know if it was exactly revolutionary for me considering I'd read a lot of the books that were cited, but someone who's new to this sort of book would do well with starting with this one.
103 reviews1 follower
November 25, 2024
It started well but then went a bit off topic. I expected a little more. It explains many tests run on groups of people, showing how answers change depending on how the question is asked. Some of it is interesting, some less so.
Profile Image for Jung.
1,826 reviews40 followers
Read
December 16, 2022
Thinking 101 (2022) asserts that by understanding and overcoming thinking biases, we can better solve or even avoid most problems, from everyday conflicts to larger societal issues.

WOO-KYOUNG AHN is the John Hay Whitney Professor of Psychology at Yale University. After receiving her Ph.D. in psychology from the University of Illinois, Urbana-Champaign, she was assistant professor at Yale University and associate professor at Vanderbilt University. In 2022, she received Yale’s Lex Hixon Prize for teaching excellence in the social sciences. Her research on thinking biases has been funded by the National Institutes of Health, and she is a fellow of the American Psychological Association and the Association for Psychological Science. Thinking 101 is her first book.

---

Improve your life – and society – by overcoming common biases in thinking.

One of Yale University’s most popular undergraduate courses is simply titled Thinking. The course explores common thinking pitfalls and, better yet, how people can overcome them to make better-informed decisions to improve their own lives and society as a whole.

You don’t have to attend Yale to unlock this knowledge. Professor Woo-kyoung Ahn, who developed and teaches the course, makes the top takeaways available to everyone in Thinking 101.

This book explains common types of thinking biases and how they can affect our decision-making, including evidence for how they’re often at the root of problems ranging from poor financial decisions to societal prejudice. You’ll also discover solutions for overcoming each of them.

---

Our minds overestimate our abilities to do things that seem easy.

Have you ever watched a YouTube video for a recipe, makeup tutorial, or home repair that seemed simple until you tried it and had to file it away as a failure? Woo-kyoung Ahn runs a similar experiment with her students: They watch a six-second dance routine eleven times plus a slower instructional video. Then, they can volunteer to do the dance with the promise of prizes for doing it successfully. There’s no shortage of volunteer dancers, yet no one nails it.

Why? Because fluency, meaning how easily our brains process new information, can fuel overconfidence, skewing our decision-making and outcomes.

While fluency informs metacognition – which is the critical process by which we judge situations to determine the next steps – we can’t rely on it entirely to ensure good outcomes.

Thankfully, there’s a pretty simple way to overcome the fluency effect: you can practice new things, like rehearsing a speech or interview responses. Of course, there are situations when you don’t get a trial run, like tackling a home renovation project. In those cases, you can plan, but you should be aware that studies show people also tend to be overly confident and optimistic about planning. To counter that, add padding to your initial estimate of what it will take to accomplish your goal, whether that’s time, money, effort, or a combination. The author recommends adding 50 percent to your initial estimate. So for example, if you think you can meet a deadline in two days, tell your boss to expect it in three.

---

We tend to go with what we think we know without considering all possibilities.

Imagine you’re given three numbers in a sequence and told that the sequence follows a rule that you must determine. You then have to give another set of three numbers that follow what you believe that rule to be.

All clear? Good. Then let’s try this one out.

The numbers you’re given are 2-4-6.

What’s your series of numbers, and why?

When Ahn poses this question to her students, many give the same answers as participants in the famous experiment conducted in 1960 by cognitive psychologist Peter Wason. They say “4-6-8,” which indeed follows the rule. Then, they assert that the rule is “even numbers increasing by two.”

But it’s not.

Once they know this, guesses become more complicated and the students become more exasperated until finally, they figure out the rule guiding the series is simply “any increasing numbers.”

This example illustrates the concept of confirmation bias and how it can block our ability to solve problems – sometimes with simpler solutions.
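The logic of the 2-4-6 task can be sketched in a few lines of code (my own illustration with hypothetical rule functions, not something from the book): a triple chosen to confirm the typical guess also satisfies the true rule, so it teaches you nothing, while a triple that violates the guess is the one that actually tests it.

```python
# Sketch of Wason's 2-4-6 task. Both rule functions are my own
# hypothetical encodings of the rules described above.

def guessed_rule(a, b, c):
    """The hypothesis most people form: even numbers increasing by two."""
    return a % 2 == 0 and b == a + 2 and c == b + 2

def true_rule(a, b, c):
    """Wason's actual rule: any increasing numbers."""
    return a < b < c

# A "confirming" probe like 4-6-8 satisfies both rules, so it
# can't distinguish the guess from the truth:
assert guessed_rule(4, 6, 8) and true_rule(4, 6, 8)

# A probe that violates the guess is the informative one: 1-2-3
# fits the real rule, so the guessed rule is falsified.
assert not guessed_rule(1, 2, 3) and true_rule(1, 2, 3)
```

The takeaway matches the strategy below: the fastest way to find the truth is to probe triples that would break your hypothesis, not ones that fit it.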

So how can you overcome confirmation bias? You could, for example, come up with two mutually exclusive hypotheses and work equally to confirm both. There also are small ways to try it in your day-to-day life, like taking a different route to work, ordering a dish you’ve never tried next time you order takeout, or letting a friend pick out something for you while shopping. You may end up with a more pleasant commute, a new favorite dish, or maybe a sweater you never wear, but your mind will be more open than before.

---

We prefer examples and stories at the expense of more rational, statistical data.

Imagine if you’d enrolled your young child in ice-skating lessons and then saw little to no skill improvement over three years. Around the same time you started the skating classes, you also let them give soccer a go. Then at a match, you noticed your child was actively running away from the ball when it was kicked toward them.

You’d probably think, “My child is just not into sports.” The author certainly did when she experienced these situations with her son. Yet in high school, he discovered cross-country running and became captain of the team.

It wasn't that he disliked all sports, just the two his mother chose when he was young. Ahn explains that she came to a faulty conclusion based on just two examples when, in reality, there's a world of different sports out there. She uses the story – and yes, it is anecdotal – to illustrate the law of large numbers and how it can help better inform our decisions.

Researchers have long asserted that storytelling through anecdotes and examples is powerful because it appeals to our senses and is therefore more relatable than abstract concepts. For example, when the Centers for Disease Control and Prevention rolled out a campaign featuring testimonials by former smokers who’d experienced life-altering conditions, data soon followed showing a 12 percent increase in people attempting to quit. These types of campaigns are consistently shown to be more effective than abstract approaches used in the past, such as warning labels on tobacco product packaging.

The challenge comes when data overwhelmingly points to conclusions that conflict with specific examples. In such cases, we need to override our pull toward vivid stories and look to statistics to make rational decisions.

To balance your natural reactivity to anecdotes, you need to become more comfortable with data science. Ahn explains that people aren't comfortable with statistics for many reasons, first because we rarely work with the kinds of large numbers and demographics that make up study samples. Even the concept of using probabilities in reasoning is relatively new to humanity – it wasn't documented until the 1560s. And while most statistical concepts are learnable, they're just not easy to mentally summon in day-to-day situations.

But some data science concepts are simpler than you may think, like the law of large numbers we referenced earlier. It simply means that the more data there is, the better it is for decision-making. If you consider any data available to you instead of relying solely on a single story, you’re more likely to arrive at a logical conclusion.
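As a toy illustration of the law of large numbers (my own simulation, not an example from the book), estimating a coin's bias from samples of increasing size shows how a tiny sample – the two sports in Ahn's anecdote – can mislead, while a large one settles near the truth:

```python
# Law of large numbers, minimal sketch: the sample average of coin
# flips drifts toward the true probability as the sample grows.
import random

random.seed(0)      # fixed seed so the run is repeatable
true_p = 0.5        # a fair coin: heads = 1, tails = 0

for n in (2, 100, 10_000):
    flips = [random.random() < true_p for _ in range(n)]
    estimate = sum(flips) / n
    print(f"n = {n:>6}: estimated P(heads) = {estimate:.3f}")
```

With n = 2 the estimate can only be 0.0, 0.5, or 1.0 – a coin-flip version of judging "my child is not into sports" from two sports – while at n = 10,000 it lands very close to 0.5.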

---

We care too much about negative facts and fear losing ownership.

Many studies support the idea that people give more mental real estate to negative events than positive ones. Others show how our fear of losing something can keep us from even considering possible gains, at least up to a point. Even the emotional attachment of ownership clouds our judgment in what is known as the endowment effect.

In one study, researchers gave a group of people a choice between a mug or a chocolate bar as a gift, and the group split about fifty-fifty in what they chose. A second group received a mug each as a gift to keep. They then had the option of swapping the mug for a chocolate bar – only 11 percent took that offer. You might wonder if there was something special about the mug over the chocolate bar, but when the gift was reversed for a third group, only 10 percent wanted to exchange their chocolate for the mug. In the latter two groups, an overwhelming number of people didn't want to part with something simply because they considered it to be theirs.

This and other types of negativity bias can lead us astray when making choices in the first place. On top of that, we have difficulty getting rid of things that aren't serving us. Thankfully, there are ways to make this duality work to our benefit. We can positively reframe options before us, like considering the 90 percent chance of surviving necessary surgery or choosing a flight with an 88 percent on-time arrival rate. On the flip side, we should equally scrutinize when a salesperson frames a potential sale by removing options versus adding to a baseline. To overcome perceived ownership, we need to be more watchful about offers for free trials of a service. Remember not to let believing it's yours sway your decision to keep or cancel. Think instead of whether you'll use it enough to justify the cost.

---

We shape new facts to fit what we already know.

Biased interpretation is the root of the confirmation bias we discussed previously. It’s when we rely on our existing beliefs so much that we even take new, conflicting data and shape it to fit the story we’re telling ourselves instead of being objective.

For example, when the author was pregnant with her first child, she happened to read an article in Nature about how babies exposed to any light while sleeping were found to be five times more likely to develop nearsightedness. With that, she crossed night-light off her nursery shopping list.

A year later, Nature reported that the study had come to faulty conclusions. Researchers had failed to consider that the parents of the babies who developed nearsightedness had the condition themselves. Because of that, the parents were more likely to have night-lights in their children’s nurseries. Also, the condition of nearsightedness can be passed on genetically, which was more likely the cause of the babies becoming nearsighted than exposure to night-lights as they slept.

Even in light of this new information, by the time she had her second child, Ahn still refused to use a night-light. Nearsighted herself, she couldn't shake the first study's erroneous correlation between night-lights and infant nearsightedness, even after the follow-up study all but debunked it.

Overcoming biased interpretation is tougher than it is for most other cognitive biases because it's part of our top-down processing – the subconscious mental framework we use to take in new information. Cognitive behavioral therapy has been shown to be effective, though it is demanding. Broader countermeasures start with simply being aware of biased interpretation and of how it can cause significant problems at a societal level, such as long-held prejudices against people who differ from ourselves. That awareness can lead to changing systemic policies and regulations for the betterment of society.

---

We rarely understand perspectives outside of our own.

The research on how poorly people pick up nuances in tone may frighten you. In one study, friends paired off, with each writing a series of single-sentence emails which they then sent to one another. Some of the sentences were sarcastic. Others were serious. The recipients then had to determine which were which. The results? Their perceptions were accurate only half the time. While these sets of friends perceived the messages accurately when delivered verbally, other studies on ambiguity in verbal communication show significant confusion, even among people who knew one another well.

We have misunderstandings all the time, even with the best intentions and with people we know well. That’s because, despite a desire to see others’ perspectives, research shows we’re really bad at it. When we assume too much familiarity in our communications, things quickly go awry.

While considering and caring about others' perspectives is a critical first step to understanding, the only surefire ways to get it right are as simple as they sound. Be explicit about your own meaning, even if that means adding an emoji to a text or stating the obvious aloud. With others, don't try to mind-read, guess, or assume, no matter how well you think you know them. Instead, just ask.

---

We prefer to settle for less now instead of waiting for more later.

If someone offered you $340 now or $350 in six months, which would you choose? If you're like most people, you'd go for the $340 now. What if we up that offer to $390 in six months versus $340 now? Even then, most people would take the immediate reward instead of the promise of $50 more later.

These are typical tests, and the results show how readily we dismiss delayed gratification, even when waiting is the rational choice. Take the second question: many people justify taking the lower amount now by reasoning that they could invest the money and earn more than the extra $50, or that some major event in the next six months could prevent them from ever collecting. But no ordinary investment in a normal economy returns that much in six months, and such disruptive events are also unlikely. Yet research shows we still cling to that reasoning, at least when the differences are relatively small.

We struggle with delayed gratification for three reasons. To overcome them, we need to consider each one and address it individually.

One is simply our lack of self-control. Studies show that one of the most effective ways to resist temptation is to find a useful distraction.

Another is our difficulty sorting through uncertainty; we often delay a decision until some unknown is resolved. In those instances, make sure the decision is truly conditional on the outcome. For example, you may take a vacation for different reasons depending on whether you pass or fail a test. But if you're going to take the vacation either way, there's nothing to wait for, so take advantage of planning ahead.

Last, since it’s difficult to feel future experiences, there’s a disconnect in the present when making decisions for our future selves. To address that, set goals, remind yourself of them often, and imagine how they’ll impact your life for the better.

---

You need to be aware of your thinking biases and employ simple techniques to counter them. As a consequence, you'll make better decisions, improving not only your own life but society as a whole. Use the strategies outlined in this book to be fairer to yourself and to others, fostering understanding and cooperation.
Izzy Corbo
January 8, 2023
This little book packs a lot of info on why we have biases in an entertaining manner.
Sushmitha Nayak
August 9, 2025
I think it was a decent pop-sci book. It's written cleanly and is fairly relatable on general matters, which I liked more toward the end. The book's style is very study- and example-oriented, which I appreciated at the beginning as a way of expression, but as a pattern it got a little frustrating, since everything is peppered with a long explanation of an example. Still, it provides a neat overview of general biases and thinking terms, which makes it a good starter.
Zoë Routh
December 9, 2022
Good exposé of bias research experiments

A comprehensive look at various biases and why we have them, backed up with plenty of experiments. A little short on what to do about it. I would recommend Rachel Audige's book, Unblinkered, as a very practical alternative for dealing with common biases, at work in particular.
September 4, 2023
There are lots of other books in this genre referencing the same studies. Go read them instead.
Nancy
April 6, 2023
I misjudged and ran out of time with my Libby hold on this audiobook. Will have to give it another go at some future date!
Travis
January 23, 2024
Thinking 101 "examines 'thinking problems'—such as confirmation bias, causal attribution, and delayed gratification." I shamelessly stole that from the Goodreads description because it's quite simply the most apt description of this book. The book serves as a good introduction and overview of human thought processes. Many readers will be familiar with several of the problems described, so not everyone will be blown away by its insights.

Each chapter of Thinking 101 dives into a different mistake people make when thinking:
The allure of fluency
Confirmation bias
The challenge of causal attribution
The perils of examples
Negativity bias
Biased interpretation
The dangers of perspective taking
The trouble with delayed gratification
Undoubtedly you've heard of most of those.

The most interesting bits of the book, then, are the ends of the chapters, where Ahn attempts to provide ways to counteract each thinking problem. For example, to overcome confirmation bias you should consider two mutually exclusive hypotheses and try to confirm both; you can also ask a question framed in two opposite ways. Of course, the biggest piece of advice is simply learning to recognize when you are falling into one of the thinking problems, or when one is being used against you.

I was reminded a bit of the book I Never Thought of It That Way by Monica Guzmán, if only because both books provide examples of how people's miscommunication can cause tension and problems.

I can see some of the real-world examples in this book putting off some readers. There are plenty of examples using COVID, and others using politically charged issues. If you are on the opposite side from the author, those examples may sour you on everything else. Which is ironic, on both the author's side and the reader's, given the book's focus on moving past such biases and assumptions.
