Kindle Notes & Highlights
Read between April 10 – June 3, 2023
Split-brain confabulation is an extreme and amplified version of your own tendency to create narrative fantasies about just about everything you do, and then believe them. You are a confabulatory creature by nature.
When you listen to someone else tell a story, you expect some embellishment and you know they are only telling you how the events seemed to transpire to them. In the same way, you know how reality seems to be unfolding, how it seems to have unfolded in the past, but you should take your own perception with a grain of salt.
In these and many other studies the subjects never said they didn’t know why they felt and acted as they did. Not knowing why didn’t confuse them; they instead found justification for their thoughts, feelings, and actions and moved on, unaware of the machinery of their minds.
Over time, by never seeking the antithetical, through accumulating subscriptions to magazines, stacks of books, and hours of television, you can become so confident in your worldview that no one can dissuade you.
In science, you move closer to the truth by seeking evidence to the contrary. Perhaps the same method should inform your opinions as well.
With all three examples there are thousands of differences, all of which you ignored, but when you draw the bull’s-eye around the clusters, the similarities—whoa. If hindsight bias and confirmation bias had a baby, it would be the Texas sharpshooter fallacy.
The powerlessness, the feeling you are defenseless to the whims of chance, can be assuaged by singling out an antagonist. Sometimes you need a bad guy, and the Texas sharpshooter fallacy is one way you can create one.
In the struggle between should versus want, some people have figured out something crucial: Want never goes away. Procrastination is all about choosing want over should because you don’t have a plan for those times when you can expect to be tempted.
In any perilous event, like a sinking ship or a towering inferno, a shooting rampage or a tornado, there is a chance you will become so overwhelmed by the perilous overflow of ambiguous information that you will do nothing at all. You will float away and leave a senseless statue in your place. You may even lie down. If no one comes to your aid, you will die.
Introspection THE MISCONCEPTION: You know why you like the things you like and feel the way you feel. THE TRUTH: The origin of certain emotional states is unavailable to you, and when pressed to explain them, you will just make something up.
Believing you understand your motivations and desires, your likes and dislikes, is called the introspection illusion.
It’s simply easier to believe something if you are presented with examples than it is to accept something presented in numbers or abstract facts.
Kids were more likely to get shot in school before Columbine, but the media during that time hadn’t given you many examples. A typical schoolkid is three times more likely to get hit by lightning than to be shot by a classmate, yet schools continue to guard against it as if it could happen at any second.
Many other studies have shown it takes only one person to help for others to join in. Whether it is to donate blood, assist someone in changing a tire, drop money into a performer’s coffers, or stop a fight—people rush to help once they see another person leading by example.
So the takeaway here is to remember you are not so smart when it comes to helping people. In a crowded room, or a public street, you can expect people to freeze up and look around at one another. Knowing that, you should always be the first person to break away from the pack and offer help—or attempt escape—because you can be certain no one else will.
THE MISCONCEPTION: You prefer the things you own over the things you don’t because you made rational choices when you bought them. THE TRUTH: You prefer the things you own because you rationalize your past choices to protect your sense of self.
Even if they actually enjoyed Pepsi more, huge mental constructs prevented them from admitting it, even to themselves.
Whether in churches or legislatures, botany or business, those who are held in high regard can cause a lot of damage when no one is willing to question their authority.
The Ad Hominem Fallacy THE MISCONCEPTION: If you can’t trust someone, you should ignore that person’s claims. THE TRUTH: What someone says and why they say it should be judged separately.
Perhaps someone criticizes your driving and you respond with “You have no room to talk. You are the worst driver in the world.” There it is again. You are dismissing the other person’s argument by attacking the person instead of the claim.
The ad hominem fallacy can also work in reverse. You might assume someone is trustworthy because they speak well, or have a respectable job. It is hard to believe an astronaut would put on a diaper and drive across the country to kill the wife of her lover, but it did happen once.
A giant amount of research has been done since Lerner’s studies, and most psychologists have come to the same conclusion: You want the world to be fair, so you pretend it is.
The just-world fallacy tells them fairness is built into the system, and so they rage when the system artificially unbalances karmic justice. Why do we think this way? Psychologists are unsure.
The Ultimatum Game THE MISCONCEPTION: You choose to accept or refuse an offer based on logic. THE TRUTH: When it comes to making a deal, you base your decision on your status.
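The highlight names the game without describing it, so for context: one player (the proposer) offers a split of a fixed pot, and the other (the responder) either accepts, in which case both keep their shares, or rejects, in which case both get nothing. A purely logical responder would accept any nonzero offer; real responders routinely refuse lowball offers they read as an insult to their status. A minimal sketch, where the 30 percent fairness threshold is an illustrative assumption of mine, not a figure from the book:

```python
def ultimatum_round(pot, offer, fairness_threshold=0.3):
    """One round of the Ultimatum Game.

    A purely 'logical' responder accepts any offer greater than zero;
    real responders tend to reject offers they perceive as unfair.
    The 0.3 cutoff below is a hypothetical stand-in for that instinct.
    Returns (proposer's share, responder's share).
    """
    if offer / pot >= fairness_threshold:
        return pot - offer, offer   # deal accepted: both keep their shares
    return 0, 0                     # deal rejected: both walk away with nothing

# An even split gets accepted; a lowball $10-of-$100 offer gets refused,
# even though refusing leaves the responder worse off in pure dollars.
print(ultimatum_round(100, 50))  # (50, 50)
print(ultimatum_round(100, 10))  # (0, 0)
```

The point of the sketch is the second call: rejecting costs the responder $10, which is exactly the "status over logic" behavior the highlight describes.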
THE MISCONCEPTION: You are too smart to join a cult. THE TRUTH: Cults are populated by people just like you.
Groupthink THE MISCONCEPTION: Problems are easier to solve when a group of people get together to discuss solutions. THE TRUTH: The desire to reach consensus and avoid confrontation hinders progress. When a group of people come together to make a decision, every demon in the psychological bestiary will be summoned.
It turns out, for any plan to work, every team needs at least one asshole who doesn’t give a shit if he or she gets fired or exiled or excommunicated. For a group to make good decisions, they must allow dissent and convince everyone they are free to speak their mind without risk of punishment.
Finally, assign one person the role of asshole and charge that person with the responsibility of finding fault in the plan. Before you come to a consensus, allow a cooling-off period so emotions can return to normal.
THE MISCONCEPTION: Men who have sex with RealDolls are insane, and women who marry eighty-year-old billionaires are gold diggers. THE TRUTH: The RealDoll and rich old sugar daddies are both supernormal releasers.
If you associate something with survival, but find an example of that thing that is more perfect than anything your ancestors could have ever dreamed of—it will overstimulate you.
In many studies around the world, no matter what cultural significance is placed on body type, a ratio in which the waist is about 70 percent the width of the hips is always preferred. According to Buss, a hip-to-waist ratio of .67 to .80 correlates to health, reproductive and otherwise.
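The figure quoted above is simple arithmetic: divide the waist measurement by the hip measurement and compare against the .67–.80 band the highlight attributes to Buss. A quick sketch (the sample measurements are made up for illustration):

```python
def waist_hip_ratio(waist, hips):
    """Waist-to-hip ratio: the highlight's 'waist is about 70 percent
    the width of the hips' corresponds to a ratio near 0.70."""
    return waist / hips

def in_buss_band(ratio, low=0.67, high=0.80):
    """The .67 to .80 range the highlight says correlates with health."""
    return low <= ratio <= high

# Hypothetical example: a 28-inch waist and 40-inch hips.
ratio = waist_hip_ratio(28, 40)
print(round(ratio, 2))    # 0.7
print(in_buss_band(ratio))  # True
```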
Since a mouse needs to eat all the time, it is constantly faced with situations where it must weigh the danger of foraging against its hunger for calories.
Dunbar’s Number
THE MISCONCEPTION: There is a Rolodex in your mind with the names and faces of everyone you’ve ever known. THE TRUTH: You can maintain relationships and keep up with only around 150 people at once.
The truth is, out of this cluster of humans you can reliably manage to keep up with only around 150 people. More specifically, it’s between 150 and 230.
No matter the actual difficulty, just telling people ahead of time how hard the undertaking would be changed how they saw themselves in comparison to an imagined average. To defeat feelings of inadequacy, you first have to imagine a task as being simple and easy. If you can manage to do that, illusory superiority takes over.
The Spotlight Effect THE MISCONCEPTION: When you are around others, you feel as if everyone is noticing every aspect of your appearance and behavior. THE TRUTH: People devote little attention to you unless prompted to.
When you vent, you stay angry and are more likely to keep doing aggressive things so you can keep venting. It’s druglike, because there are brain chemicals and other behavioral reinforcements at work. If you get accustomed to blowing off steam, you become dependent on it. The more effective approach is to just stop.
THE MISCONCEPTION: Memories are played back like recordings. THE TRUTH: Memories are constructed anew each time from whatever information is currently available, which makes them highly permeable to influences from the present.
A quarter of the subjects incorporated the fake story into their memory and even provided details about the fictional event that were not included in the narrative. Loftus even convinced people they shook hands with Bugs Bunny, who isn’t a Disney character, when they visited Disney World as a kid, just by showing them a fake advertisement where a child was doing the same.
Loftus has made it her life’s work to showcase the unreliability of memory. She has railed against eyewitness testimony and suspect lineups for decades now, and she also has criticized psychologists who say they can dredge up repressed memories from childhood.
Each time you build a memory, you make it from scratch, and if much time has passed you stand a good chance of getting the details wrong. With a little influence, you might get big ones wrong.
The shocking part of these studies is how easily memory gets tainted, how only a few iterations of an idea can rewrite your autobiography. Even stranger is how as memories change, your confidence in them grows stronger.
Considering the misinformation effect not only requires you to be skeptical of eyewitness testimony and your own history, but it also means you can be more forgiving when someone is certain of something that is later revealed to be embellished or even complete fiction.
He found one or two people had little effect, but three or more was all he needed to get a small percentage of people to start conforming. The percentage of people who conformed grew proportionally with the number of people who joined in consensus against them. Once the entire group other than the subject was replaced with actors, only 25 percent of his subjects answered every question correctly.
Milgram showed when you can see your actions as part of just following orders, especially from an authority figure, there is a 65 percent chance you will go to the brink of murder. Add the risk of punishment, or your own harm, and chances of conformity increase. Milgram’s work was a response to the Holocaust. He wondered if an entire nation could have its moral compass smashed, or if conformity and obedience to authority were more likely the root of so much compliance to commit unspeakable evil.
THE MISCONCEPTION: If you stop engaging in a bad habit, the habit will gradually diminish until it disappears from your life. THE TRUTH: Any time you quit something cold turkey, your brain will make a last-ditch effort to return you to your habit.
THE MISCONCEPTION: When you are joined by others in a task, you work harder and become more accomplished. THE TRUTH: Once part of a group, you tend to put in less effort because you know your work will be pooled together with others’.
Once you know your laziness can be seen, you try harder. You do this because of another behavior called evaluation apprehension, which is just a fancy way of saying you care more when you know you are being singled out.
So it’s likely, especially if you work for a corporation, that your output is being monitored in some way and you are being told about it so you’ll work harder. They know when it comes to group effort, you are not so smart.