Kalvin Lam's Reviews > Thinking, Fast and Slow

Thinking, Fast and Slow by Daniel Kahneman


liked it

Thinking, Fast and Slow by Daniel Kahneman is supposedly the bible of social psychology. This book is supposed to change the way you think about life and the way you perceive the world. Everyone says it is a must-read for people from all walks of life. With this in mind, I came into it with extremely high expectations. Unfortunately, I was disappointed. I expected many interesting insights from Kahneman, especially since the book clocks in at around 500 pages. Instead, what I got were small pockets of interesting knowledge amidst lots and lots of fluff and over-explanation of basic concepts. I'm going to focus on the concepts I found notable.

Kahneman explores an interesting notion known as priming. In simple terms, priming occurs unconsciously in our memory when exposure to one stimulus affects the response to another stimulus.

“An experiment showed that exposing people to images of classrooms and school lockers increased the tendency of participants to support a school initiative.”

Because people had been exposed to images of school-related objects, they were more inclined to support the school initiative. The subjects were not aware that their judgment had been influenced.

Another interesting concept is the availability heuristic, a flawed mental shortcut that relies on whatever examples come to mind most easily. It causes us to misjudge how frequently things occur – we overestimate the likelihood of vivid, memorable events recurring, when in reality they are unlikely to occur.

“With a few horrible exceptions such as 9/11, the number of casualties from terror attacks is very small relative to other causes of death. Even in countries that have been the targets of terror campaigns, such as Israel, the weekly number of casualties almost never came close to the number of traffic deaths. The difference is in the availability of the two risks, the ease and the frequency with which they come to mind.”

The next example illustrates hindsight bias (aka, the knew-it-all-along effect). After an event occurs, people are inclined to see the event as having been predictable, even if there was little or no objective basis for predicting it. This occurs because people hate being wrong, and the hindsight bias helps them evade the fact that they were wrong. Hindsight bias is “the tendency to revise the history of one’s beliefs in light of what actually happened.”

As humans, we automatically search for causes to events. This shapes our thinking. A great example of this was the capture of Saddam Hussein from his hiding place in Iraq. On the day of his capture, two contradictory financial headlines ran: one read "U.S. Treasuries Rise; Hussein Capture May Not Curb Terrorism," and the other read "U.S. Treasuries Fall; Hussein Capture Boosts Allure of Risky Assets." The same event was used to explain opposite market movements, which illustrates how our yearning for understanding leads us to invent causes after the fact.

“Consumers have a hunger for a clear message about the determinants of success and future in business, and they read stories that offer a sense of understanding, however illusory. Stories of success and failure consistently exaggerate the impact of leadership style and management practices on firm outcomes, and thus their message is rarely useful.”

This explains why we buy books that describe successful business leaders even though those leaders may not have much influence on the outcome of a company. As consumers, we crave a clear message regarding success and failure, and we attribute causes to every event. It's difficult to accept that some things are simply beyond our control. Another insight Kahneman discusses is that experts in a given field are often less reliable than their confidence suggests.

“…those with the most knowledge are often less reliable. The reason is that the person who acquires more knowledge develops an enhanced illusion of her skill and becomes unrealistically overconfident.”

This effect is illustrated by the finding that stock traders who trade less actually make more money: overconfident traders trade more often and underperform. Because the world is unpredictable, simple algorithms tend to beat expert judgment in situations involving uncertainty. An interesting point Kahneman raises is that the cause of a mistake matters. Though algorithms can make better judgments and diagnoses than doctors and surgeons, the public would be outraged if a death were caused by an algorithm. This expands the discussion to include moral preference. Though it's irrational to be afraid of algorithms (especially if they would yield better results), we, as humans, inherently fear the unnatural (hence our obsession with organic products, etc.).

The bottom line is that I would not recommend this book because it is simply too long for what it offers. Sure, there are good insights here and there, but they are not worth the time invested. Best of luck!


Reading Progress

May 31, 2016 – Started Reading
May 31, 2016 – Shelved
June 15, 2016 – Finished Reading
