For example, it is much more common that we overestimate our knowledge than that we underestimate it.
I now had categories, terms, and explanations with which to ward off the spectre of irrationality.
Not all cognitive errors are toxic, and some are even necessary for leading a good life.
Although this book may not hold the key to happiness, at the very least it acts as insurance against too much self-induced unhappiness.
We need no extra cunning, no new ideas, no unnecessary gadgets, no frantic hyperactivity – all we need is less irrationality.
In daily life, because triumph is made more visible than failure, you systematically overestimate your chances of succeeding.
The media is not interested in digging around in the graveyards of the unsuccessful.
To elude the survivorship bias, you must do the digging yourself.
But you should recognise that the survivorship bias is at work, distorting the probability of success like cut glass.
Survivorship bias means this: people systematically overestimate their chances of success. Guard against it by frequently visiting the graves of once-promising projects, investments and careers. It is a sad walk, but one that should clear your mind.
How their bodies are designed is a factor for selection and not the result of their activities.
Whenever we confuse selection factors with results, we fall prey to what Taleb calls the swimmer’s body illusion.
‘trying to be happier is as futile as trying to be taller.’
The human brain seeks patterns and rules. In fact, it takes it one step further: if it finds no familiar patterns, it simply invents some.
In conclusion: when it comes to pattern recognition, we are oversensitive. Regain your scepticism. If you think you have discovered a pattern, first consider it pure chance.
Social proof, sometimes roughly termed the herd instinct, dictates that individuals feel they are behaving correctly when they act the same as other people. In other words, the more people who follow a certain idea, the better (truer) we deem the idea to be. And the more people who display a certain behaviour, the more appropriate this behaviour is judged to be by others. This is, of course, absurd.
‘If 50 million people say something foolish, it is still foolish.’
‘We have spent the $30 regardless of whether we stay or leave, so this factor should not play a role in our decision,’ I said, desperately trying to clarify the situation.
This investment becomes a reason to carry on, even if we are dealing with a lost cause. The more we invest, the greater the sunk costs are, and the greater the urge to continue becomes.
This irrational behaviour is driven by a need for consistency.
After all, consistency signifies credibility. We find contradictions abominable.
If we decide to cancel a project halfway through, we create a contradiction: we admit that we once thought differently. Carrying on with a meaningless project delays thi...
Rational decision-making requires you to forget about the costs incurred to date.
The confirmation bias is the mother of all misconceptions. It is the tendency to interpret new information so that it becomes compatible with our existing theories, beliefs and convictions.
‘Facts do not cease to exist because they are ignored,’
‘What the human being is best at doing is interpreting all new information so that their prior conclusions remain intact.’
‘The rule is this: the next number must be higher than the previous one.’
Astrologers and economists operate on the same principle. They utter prophecies so vague that any event can substantiate them: ‘In the coming weeks you will experience sadness,’ or ‘in the medium term, the pressure on the dollar will increase.’
Moreover, a lot of sites now tailor content to personal interests and browsing history, causing new and divergent opinions to vanish from the radar altogether.
Axing beliefs that feel like old friends is hard work, but imperative.
Authorities crave recognition and constantly find ways to reinforce their status.
In conclusion: whenever you are about to make a decision, think about which authority figures might be exerting an influence on your reasoning. And when you encounter one in the flesh, do your best to challenge him or her.
Both of these stories epitomise the contrast effect: we judge something to be beautiful, expensive or large if we have something ugly, cheap or small in front of us. We have difficulty with absolute judgements.
Without the contrast effect, the discount business would be completely untenable.
A share price is never ‘low’ or ‘high’. It is what it is, and the only thing that matters is whether it goes up or down from that point.
When we encounter contrasts, we react like birds to a gunshot: we jump up and get moving.
The availability bias says this: we create a picture of the world using the examples that most easily come to mind. This is absurd, of course, because in reality things don’t happen more frequently just because we can conceive of them more easily.
The chances of a bomb attack are much lower than we think, and the chances of suffering depression are much higher.
We attach too much likelihood to spectacular, flashy or loud outcomes. Anything silent or invisible we downgrade in our minds. Our brains imagine show-stopping outcomes more readily than mundane ones. We think dramatically, not quantitatively.
We prefer wrong information to no information.
We require others’ input to overcome the availability bias.
But beware: situations do exist where things first dip and then improve.
We simply build the meaning into them afterward. Stories are dubious entities.
They simplify and distort reality, and filter out things that don’t fit.
Stories attract us; abstract details repel us.
Consequently, entertaining side issues and backstories are prioritised over relevant facts. (On the upside, if it were not for this, we would be stuck with only non-fiction books.)
Objectively speaking, narratives are irrelevant, but still we find them irresistible.
The real issue with stories: they give us a false sense of understanding, which inevitably leads us to take bigger risks and urges us to take a stroll on thin ice.
We can aptly describe it as the ‘I told you so’ phenomenon: in retrospect, everything seems clear and inevitable.
If a CEO becomes successful due to fortunate circumstances, he will, looking back, rate the probability of his success a lot higher than it actually was.