Kindle Notes & Highlights
We make our first judgments rapidly, and we are dreadful at seeking out evidence that might disconfirm those initial judgments.43 Yet friends can do for us what we cannot do for ourselves: they can challenge us, giving us reasons and arguments (link 3) that sometimes trigger new intuitions, thereby making it possible for us to change our minds. We occasionally do this when mulling a problem by ourselves, suddenly seeing things in a new light or from a new perspective (to use two visual metaphors).
Far more common than such private mind changing is social influence. Other people influence us constantly just by revealing that they like or dislike somebody. That form of influence is link 4, the social persuasion link. Many of us believe that we follow an inner moral compass, but the history of social psychology richly demonstrates that other people exert a powerful force, able to make cruelty seem acceptable45 and altruism seem embarrassing,46 without giving us any reasons or arguments.
And you can’t change people’s minds by utterly refuting their arguments.
And as reasoning is not the source, whence either disputant derives his tenets; it is in vain to expect, that any logic, which speaks not to the affections, will ever engage him to embrace sounder principles.
The persuader’s goal should be to convey respect, warmth, and an openness to dialogue before stating one’s own case.
• The mind is divided into parts, like a rider (controlled processes) on an elephant (automatic processes). The rider evolved to serve the elephant.
• You can see the rider serving the elephant when people are morally dumbfounded. They have strong gut feelings about what is right and wrong, and they struggle to construct post hoc justifications for those feelings. Even when the servant (reasoning) comes back empty-handed, the master (intuition) doesn’t change his judgment.
• The social intuitionist model starts with Hume’s model and makes it more social. Moral reasoning is part of our lifelong …
Therefore, if you want to change someone’s mind about a moral or political issue, talk to the elephant first. If you ask people to believe something that violates their intuitions, they will devote their efforts to finding an escape hatch—a reason to doubt your argument or conclusion. They will almost always succeed.
That might be good news for rationalists—maybe we can think carefully whenever we believe it matters? Not quite. Tetlock found two very different kinds of careful reasoning. Exploratory thought is an “evenhanded consideration of alternative points of view.” Confirmatory thought is “a one-sided attempt to rationalize a particular point of view.”13 Accountability increases exploratory thought only when three conditions apply: (1) decision makers learn before forming any opinion that they will be accountable to an audience, (2) the audience’s views are unknown, and (3) they believe the audience …

… thought. People are trying harder to look right than to be right.
It’s hard because the confirmation bias is a built-in feature (of an argumentative mind), not a bug that can be removed (from a Platonic mind).
To demonstrate the strategic functions of moral reasoning, I reviewed five areas of research showing that moral thinking is more like a politician searching for votes than a scientist searching for truth.
Given his concerns about the limits of reasoning, Hume believed that philosophers who tried to reason their way to moral truth without looking at human nature were no better than theologians who thought they could find moral truth revealed in sacred texts. Both were transcendentalists.