Kindle Notes & Highlights
We need no extra cunning, no new ideas, no unnecessary gadgets, no frantic hyperactivity—all we need is less irrationality.
In daily life, because triumph is made more visible than failure, you systematically overestimate your chances of succeeding.
The media is not interested in digging around in the graveyards of the unsuccessful. Nor is this its job. To elude the survivorship bias, you must do the digging yourself.
Survivorship bias means this: People systematically overestimate their chances of success. Guard against it by frequently visiting the graves of once-promising projects, investments, and careers. It is a sad walk but one that should clear your mind.
Cheerfulness—according to many studies, such as those conducted by Harvard’s Dan Gilbert—is largely a personality trait that remains constant throughout life.
Be wary when you are encouraged to strive for certain things—be it abs of steel, immaculate looks, a higher income, a long life, a particular demeanor, or happiness. You might fall prey to the swimmer’s body illusion. Before you decide to take the plunge, look in the mirror—and be honest about what you see.
The human brain seeks patterns and rules. In fact, it takes it one step further: If it finds no familiar patterns, it simply invents some.
When it comes to pattern recognition, we are oversensitive. Regain your skepticism. If you think you have discovered a pattern, first consider it pure chance. If it seems too good to be true, find a mathematician and have the data tested statistically. And if the crispy parts of your pancake start to look a lot like Jesus’s face, ask yourself: If he really wants to reveal himself, why doesn’t he do it in Times Square or on CNN?
Social proof, sometimes roughly termed the “herd instinct,” dictates that individuals feel they are behaving correctly when they act the same as other people. In other words, the more people who follow a certain idea, the better (truer) we deem the idea to be. And the more people who display a certain behavior, the more appropriate this behavior is judged by others. This is, of course, absurd.
“If fifty million people say something foolish, it is still foolish.”
The sunk cost fallacy is most dangerous when we have invested a lot of time, money, energy, or love in something. This investment becomes a reason to carry on, even if we are dealing with a lost cause. The more we invest, the greater the sunk costs are, and the greater the urge to continue becomes.
“We’ve come this far . . .” “I’ve read so much of this book already . . .” “But I’ve spent two years doing this course . . .” If you recognize any of these thought patterns, it shows that the sunk cost fallacy is at work in a corner of your brain.
Rational decision making requires you to forget about the costs incurred to date. No matter how much you have already invested, only your assessment of the future costs and benefits counts.
Psychologist Robert Cialdini can explain the success of this and other such campaigns. He has studied the phenomenon of reciprocity and has established that people have extreme difficulty being in another person’s debt.
The confirmation bias is the mother of all misconceptions. It is the tendency to interpret new information so that it becomes compatible with our existing theories, beliefs, and convictions.
“Facts do not cease to exist because they are ignored,” said writer Aldous Huxley.
“What the human being is best at doing is interpreting all new information so that their prior conclusions remain intact.”
For example, worshippers always find evidence for God’s existence, even though he never shows himself overtly—except to illiterates in the desert and in isolated mountain villages. It is never to the masses in, say, Frankfurt or New York.
The Internet is particularly fertile ground for the confirmation bias. To stay informed, we browse news sites and blogs, forgetting that our favored pages mirror our existing values, be they liberal, conservative, or somewhere in between. Moreover, a lot of sites now tailor content to personal interests and browsing history, causing new and divergent opinions to vanish from the radar altogether. We inevitably land in communities of like-minded people, further reinforcing our convictions—and the confirmation bias.
Whenever you are about to make a decision, think about which authority figures might be exerting an influence on your reasoning. And when you encounter one in the flesh, do your best to challenge him or her.
We judge something to be beautiful, expensive, or large if we have something ugly, cheap, or small in front of us. We have difficulty with absolute judgments.
The availability bias says this: We create a picture of the world using the examples that most easily come to mind. This is idiotic, of course, because in reality, things don’t happen more frequently just because we can conceive of them more easily.
Thanks to the availability bias, we travel through life with an incorrect risk map in our heads. Thus, we systematically overestimate the risk of being the victims of a plane crash, a car accident, or a murder. And we underestimate the risk of dying from less spectacular means, such as diabetes or stomach cancer. Bomb attacks are much rarer than we think, and the chances of suffering depression are much higher. We attach too much likelihood to spectacular, flashy, or loud outcomes. Anything silent or invisible we downgrade in our minds. Our brains imagine showstopping outcomes …
The availability bias has an established seat at the corporate board’s table, too. Board members discuss what management has submitted—usually quarterly figures—instead of more important things, such as a clever move by the competition, a slump in employee motivation, or an unexpected change in customer behavior. They tend not to discuss what’s not on the agenda. In addition, people prefer information that is easy to obtain, be it economic data or recipes. They make decisions based on this information rather than on more relevant but harder-to-obtain information—often with disastrous results.
We prefer wrong information to no information.
A mere smoke screen, the it’ll-get-worse-before-it-gets-better fallacy is a variant of the so-called confirmation bias. If the problem continues to worsen, the prediction is confirmed. If the situation improves unexpectedly, the customer is happy, and the expert can attribute it to his prowess. Either way he wins.
Suppose you are president of a country and have no idea how to run it. What do you do? You predict “difficult years” ahead, ask your citizens to “tighten their belts,” and then promise to improve the situation only after this “delicate stage” of “cleansing,” “purification,” and “restructuring.” Naturally you leave the duration and severity of the period open.
What is clear is that people first used stories to explain the world, before they began to think scientifically, making mythology older than philosophy. This has led to the story bias.
Stories attract us; abstract details repel us. Consequently, entertaining side issues and backstories are prioritized over relevant facts.
Objectively speaking, narratives are irrelevant. But still we find them irresistible.
The hindsight bias is one of the most prevalent fallacies of all. We can aptly describe it as the “I told you so” phenomenon: In retrospect, everything seems clear and inevitable.
So why is the hindsight bias so perilous? Well, it makes us believe we are better predictors than we actually are, causing us to be arrogant about our knowledge and consequently to take too much risk.
Hindsight may provide temporary comfort to those overwhelmed by complexity, but as for providing deeper revelations about how the world works, you’ll benefit by looking elsewhere.
The overconfidence effect also applies to forecasts, such as stock market performance over a year or your firm’s profits over three years. We systematically overestimate our knowledge and our ability to predict—on a massive scale. The overconfidence effect does not deal with whether single estimates are correct or not. Rather, as Taleb puts it, “it measures the difference between what people actually know and how much they think they know.” What’s surprising is this: Experts suffer even more from the overconfidence effect than laypeople do. If asked to forecast oil prices in five years’ time, …
The overconfidence effect is more pronounced in men—women tend not to overestimate their knowledge and abilities as much. Even more troubling: Optimists are not the only victims of the overconfidence effect. Even self-proclaimed pessimists overrate themselves—just less extremely.
Be aware that you tend to overestimate your knowledge. Be skeptical of predictions, especially if they come from so-called experts. And with all plans, favor the pessimistic scenario. This way, you have a chance of judging the situation somewhat realistically.
According to Charlie Munger, one of the world’s best investors (and from whom I have borrowed this story), there are two types of knowledge. First, we have real knowledge. We see it in people who have committed a large amount of time and effort to understanding a topic. The second type is chauffeur knowledge—knowledge from people who have learned to put on a show. Maybe they have a great voice or good hair, but the knowledge they espouse is not their own. They reel off eloquent words as if reading from a script.
The same superficiality is present in business. The larger a company, the more the CEO is expected to possess “star quality.” Dedication, solemnity, and reliability are undervalued, at least at the top. Too often shareholders and business journalists seem to believe that showmanship will deliver better results, which is obviously not the case.
To guard against the chauffeur effect, Warren Buffett, Munger’s business partner, has coined a wonderful phrase, the “circle of competence”: What lies inside this circle you understand intuitively; what lies outside, you may only partially comprehend. One of Munger’s best pieces of advice is: “You have to stick within what I call your circle of competence. You have to know what you understand and what you don’t understand. It’s not terribly important how big the circle is. But it is terribly important that you know where the perimeter is.” Munger underscores this: “So you have to figure out …
The illusion of control is the tendency to believe that we can influence something over which we have absolutely no sway.
Crossing the street in Los Angeles is a tricky business, but luckily, at the press of a button, we can stop traffic. Or can we? The button’s real purpose is to make us believe we have an influence on the traffic lights, so that we can endure the wait for the signal to change with more patience. The same goes for “door-open” and “door-close” buttons in elevators: Many are not even connected to the electrical panel. Such tricks are also designed into open-plan offices: For some people it will always be too hot, for others, too cold. Clever technicians create the illusion of control …
Do you want to influence the behavior of people or organizations? You could always preach about values and visions or you could appeal to reason. But in nearly every case, incentives work better. These need not be monetary; anything is possible, from good grades to Nobel Prizes to special treatment in the afterlife.
My advice: Forget hourly rates and always negotiate a fixed price in advance.
Keep an eye out for the incentive super-response tendency. If a person’s or an organization’s behavior confounds you, ask yourself what incentive might lie behind it. I guarantee you that you’ll be able to explain 90 percent of the cases this way. What makes up the remaining 10 percent? Passion, idiocy, psychosis, or malice.
When you hear stories such as: “I was sick, went to the doctor, and got better a few days later” or “the company had a bad year, so we got a consultant in, and now the results are back to normal,” look out for our old friend, the regression-to-mean error.
Never judge a decision purely by its result, especially when randomness and “external factors” play a role. A bad result does not automatically indicate a bad decision and vice versa. So rather than tearing your hair out about a wrong decision, or applauding yourself for one that may have only coincidentally led to success, remember why you chose what you did. Were your reasons rational and understandable? Then you would do well to stick with that method, even if you didn’t strike it lucky last time.
In his book The Paradox of Choice, psychologist Barry Schwartz describes why this is so. First, a large selection leads to inner paralysis. To test this, a supermarket set up a stand where customers could sample twenty-four varieties of jelly. They could try as many as they liked and then buy them at a discount. The next day, the owners carried out the same experiment with only six flavors. The result? They sold ten times more jelly on day two. Why? With such a wide range, customers could not come to a decision, so they bought nothing. The experiment was repeated several times with different …
“There’s nothing more effective in selling anything than getting the customer to believe, really believe, that you like him and care about him.”
So, if you are a salesperson, make buyers think you like them, even if this means outright flattery. And if you are a consumer, always judge a product independently of who is selling it. Banish the salespeople from your mind or, rather, pretend you don’t like them.
The endowment effect: We consider things to be more valuable the moment we own them. In other words, if we are selling something, we charge more for it than we ourselves would be willing to spend.