Not Born Yesterday explains how we decide who we can trust and what we should believe--and argues that we're pretty good at making these decisions. In this lively and provocative book, Hugo Mercier demonstrates how virtually all attempts at mass persuasion--whether by religious leaders, politicians, or advertisers--fail miserably. Drawing on recent findings from political science and other fields ranging from history to anthropology, Mercier shows that the narrative of widespread gullibility, in which a credulous public is easily misled by demagogues and charlatans, is simply wrong.
Why is mass persuasion so difficult? Mercier uses the latest findings from experimental psychology to show how each of us is endowed with sophisticated cognitive mechanisms of open vigilance. Computing a variety of cues, these mechanisms enable us to be on guard against harmful beliefs, while being open enough to change our minds when presented with the right evidence. Even failures--when we accept false confessions, spread wild rumors, or fall for quack medicine--are better explained as bugs in otherwise well-functioning cognitive mechanisms than as symptoms of general gullibility.
Not Born Yesterday shows how we filter the flow of information that surrounds us, argues that we do it well, and explains how we can do it better still.
This has become my book of the year! Despite my being a psychologist, and a cognitive neuroscientist in particular, I learned a lot of new things from this book, both minutiae and a broader paradigm shift in how I view the "gullibility" of individuals and how humans process information. What you get out of this book is a deeper, more useful understanding of how mass persuasion works (and why it usually doesn't), why fake news is so viral yet inherently superficial, how individuals decide whom and what to trust, how these mechanisms fail, and how to avoid those failures. Reading this will teach you more about yourself and the people around you.
The basic premise of the book is that people are not inherently gullible, even if this is a popular explanation for why we're faced with pizzagate, obamagate, antivaxxers, Alex Jones, etc. Instead, Mercier argues that things are never so simple: some things are more likely to be believed than others, and understanding human information processing is key. Often, what looks like an insurmountable failure in human cognition is actually a rare edge case in an otherwise well-functioning system. Other times, it's a question of misunderstanding a situation, or overlooking ulterior motives. At some point, though, the author tries moving the goal posts on what it means to be "gullible", and this is my main point of disagreement with the book (see concept 12 below). Sometimes he demolishes interpretations based on gullibility by providing more information. Other times, he just walks you through the irony of the situation. For example, it's common to argue that people are gullible because they believe something crazy, like that vaccines cause autism...but if they were so gullible, why don't they believe YOU that vaccines are safe?
The idea of this book is not a key that opens all locks, but more a magnifying glass, an encouragement to look for ulterior motives and deeper explanations for what might on the surface appear to be blind gullibility.
For whoever couldn't be bothered to read the book, I've identified the key concepts illustrated throughout, and below I chew them over in light of 2020.
Concept 1: We process information with "open vigilance"; that is, we're open to new information, but we judge it for plausibility and reliability. This is comparable to an omnivorous diet: we try new things, but keep track of the things that make us sick so we can avoid them in the future.
Concept 2: Gullibility is biologically implausible, because assuming that people will believe "anything" assumes that sources of information are honest. If people believed anything, it would not take opportunists long to start lying about everything.
Concept 3: Traditionally, gullibility is associated with being "dumb" and not thinking. Instead, the opposite is true: people who stop thinking don't absorb more information.
Concept 4: The form in which information arrives determines how seriously it's taken. Just saying "8 out of 10 scientists agree..." is not going to have the same impact as being in a room with 10 scientists, 8 of whom agree. This explains the discrepancy between "herd effects", in which people will trust a majority, and the ineffectiveness of vaccine research communication.
Concept 5: Lying is hard (and detecting lies is hard); negligence is easy. So we are fine-tuned to detect not lies but negligence: we trust people who make more of an effort to communicate information that is relevant to us. In this context, commitment becomes especially important; we don't accuse people who are very confident about something of lying to us, but rather of being overconfident and unreliable.
Concept 6: We trust those we can hold accountable with reputation; when someone is not affected by how we view them, we have less reason to trust them. On the flip side, we trust people we already have information on (a reputation). This is where name recognition can play a role.
Concept 7: We trust based on interests; if someone has a common interest with us, or a self-interest in preserving reputation, we have reason to trust them.
Concept 8: Demagogues don't guide the people's will; they reflect it. Easily seen in Trump: he probably doesn't care about immigration particularly (his wife is an immigrant), but his base does, so he makes it a key issue.
Concept 9: The Crusades led thousands of peasants to head off to their deaths; witches were widely believed to cause ill, and many innocent people were burned alive; the Xhosa, in the middle of famine and conflict with invaders, were convinced to sacrifice all their cattle to create a ghost army. While all of these on the face of it seem like gullible people falling for stupid ideas, closer scrutiny reveals ulterior motives and more complex situations. The peasants who went on crusade were on the verge of dying of starvation at home. People accused of witchcraft were already disliked by the community for other reasons, and the accusation was more of an excuse than the main cause; in fact, a "witch" who confessed to cursing someone was more likely to be spared. Only communities whose cattle were already suffering from lung disease sacrificed them; furthermore, most of those cattle were owned by a privileged few who had recently chosen to export rather than keep them for the community, and the community didn't like that. People are only mass persuaded to do things they wanted to do anyway.
Concept 10: Mass persuasion doesn't work. In dictatorships, propaganda is only accepted by people who benefit from the regime (and from the excuse the propaganda provides). The Nazis tried to enforce anti-Semitism, but it only took hold in already anti-Semitic communities. The current Chinese regime knows this and has a different strategy: friction and flooding. Make true information difficult to find (don't collect government statistics, ban websites and keywords), and flood the airwaves with meaningless propaganda and pop culture distractions. That amount of propaganda is mainly for people already on board with the regime, to keep them convinced.
Concept 11: The best you can hope for in influencing people is to shape: a) the criteria used to evaluate people and ideas, b) how an issue is framed, c) which issues are worth thinking about. Essentially, it comes down to having the power to select information, but whether people buy it or not depends on the individual. Advertising works only insofar as it's an information source.
Concept 12: There is a distinction between information held "intuitively" and "abstractly". Intuitive information is what we draw inferences from and base everyday decisions on: the ground is solid, that person doesn't like me, walking into a road is dangerous. Abstract information is almost anything we learn in school but don't regularly act on: the planets in the solar system, the laws of thermodynamics, where Obama was born. Most fake news and false rumors are based entirely on abstract information, and don't genuinely affect individuals. Pizzagate is a case in point: the idea was that a pizzeria was running a sex trafficking ring. With one notable exception, most people who believed this limited themselves to spreading the rumor and giving negative ratings online. The only person who actually stepped up was a guy who went in guns blazing, demanding the children be set free; that's what you do when you really believe that something bad is happening.
The author is essentially saying that people don't *really* believe these crazy ideas, because the belief stays at an abstract level. I think this shifts the goal posts a bit, because I would argue that no matter how superficially you believe something, the moment you buy into it even a little, and that something is very obviously stupid, that makes you gullible. How else to distinguish the slice of the population who bought into pizzagate from the rest of us who didn't? Still, it's operationally useful to recognize the difference between a deeply held belief and a superficial one.
Concept 13: Cost asymmetry explains why we care about crazy conspiracies like 9/11 trutherism and pizzagate: it's better to know all possible threats, even implausible ones, than to risk ignoring real ones. In general, information about threats is considered "useful", so people who provide it earn reputation points, and the ideas spread more easily. Furthermore, since most people avoid danger, and most rumors involve faraway situations anyway, there's little risk of being proven wrong, so people are even less likely to fact-check before spreading exactly the kind of rumors Alex Jones peddles. "False rumors spread so well not because people take them too seriously but because they don't take them seriously enough."
Concept 14: We rather suck with sources. While we recognize that information from a good source, even secondhand, is better than information from a bad source, we are not so attentive when sources are not properly defined. When reporting our own sources, we also tend to distort the information, both for simplicity ("my friend's friend's cousin" becomes "my friend's cousin") and to improve our own standing. When sources are not kept track of, you can learn the same bit of information from different people and think you're getting converging evidence, when really everyone got it from the same source. This is how "social" sources become so powerful, like religion (god is more believable because it seems like everyone you know independently started believing) and even science (wash your hands, they're covered in germs).
Concept 15: Individuals use bad information for their own gain. Someone trying to be accepted by a group can make themselves "unclubbable" by all rival groups, thus getting fast-tracked into acceptance. If you say something so absurd that most other groups would reject you, even if it's a little crazy by your target group's standards, the fact that you burned bridges with everyone else makes you more trusted by the group you're after. This is clearly what Trump is up to when he says such blatantly false things as that injecting bleach cures coronavirus, or that immigrants and protesters are criminals. Likewise, flat-earthers and creationists, especially the more "intellectual" ones, don't believe these theories; they're just burning bridges with the rest of "mainstream" academia to be included in a new club of anti-intellectuals.
Concept 16: Blaming fake news and mistaken beliefs for bad decisions is backwards; people justify bad things they wanted to do anyway with bad "facts", taking any excuse they can find, however flimsy, because an excuse is better than no excuse. When Trump supporters buy into the idea of Mexican caravans infiltrated by drug dealers, it's because it serves their goal of stopping immigrants from entering the country. They are not "gullible"; they are opportunists.
Concept 17: Science information is crazy; it's actually more surprising that people accept it than that they don't. Very few people actually understand any given portion of science, and not many more personally know a scientist they could trust. So they rely on "coarse cues" like degrees and knowledge of math. The author doesn't seem to offer a good reason why laymen would trust science, but it's actually pretty obvious by his own criteria: science has a really good track record. We've gone to the moon, gotten rid of polio, everyone has smartphones.
Ultimately, I think the book makes pretty clear that most situations are more complicated than just "people are gullible". Instead, individuals are pretty selective about what and whom to trust (it has to match the rest of what the person already knows; it has to come from someone who can be held accountable), and it is in fact quite difficult to get a lot of people to believe anything you want. Your best chance of success is to tell them something they want to hear that doesn't clash too hard with reality. This book is optimistic; it stands by the idea that people are not as stupid as they're portrayed, and that individuals can be trusted to do what's in their best interest. The book was published just before the COVID19 pandemic, so the author did not have the opportunity to address the latest crazy theory: that the virus itself is an inflated hoax, nothing worse than the flu, which Democrats are using to destroy Trump before the election. It makes sense that the people who believe this are not blindly gullible (they would go extinct pretty quickly) but rather have ulterior motives. What those motives are, though, strong enough to go against self-preservation, I'm not sure. In the UK, there was the rumor that 5G caused COVID. People didn't passively give negative ratings to Verizon; they actively burned 5G towers (witches!). The easy thing is to label these people as gullible, but it does seem more plausible that they had some other, prior, reason for distrusting 5G. What that could be, I don't know.
Updated. The top Goodreads review of this says Mercier moves the goal posts on “gullibility,” and I completely agree. Mercier’s key argument is that human beings, contrary to a popular misconception (or gripe: the charge of gullibility is political, leveled at one’s enemies), are not “gullible.” But what emerges in the course of this book is that human beings are not gullible only if we mean by gullibility an inherent credulousness (what Locke et al., to own the Scholastics, called an Occult Quality), the invocation of which explains by itself how human beings behave. In other words, human beings, for the most part, don’t just go around believing whatever other people say “because that’s how they are,” and the armchair bullshitter view that “people are just stupid” says more about the opinionator’s fatuousness than about social reality.
So, rather than invoking gullibility, which is much less clear than the troubling realities we want explained — namely, that people often believe crazy and destructive things — we need a scientific, rational account of how people come to trust and believe in the things they do. And the bulk of the book is devoted to providing this account, through what Mercier calls the suite of cognitive mechanisms of openness and vigilance (which evolved together). Mercier’s account is very detailed and thorough, and not a little dry. The upshot, however, is hardly comforting. If we’re pleased to hear that humans aren’t just “gullible,” it isn’t great to then realize that there are, after all, many circumstances that lead humans to believe crazy and destructive things, that believing crazy and destructive things is squarely within the domain of human cognitive capacities, and that our currently polarized social media information environments frequently exploit humans’ worst features for trusting and believing crazy and destructive things.
I guess the optimistic view is that since gullibility is not inherent and ineradicable, since people can adapt, and since we can understand why people believe and trust the way they do, “we” can improve information environments and so forth so that it’s not rational to be insane, so to speak — so that people can put their trust in genuinely reliable sources and have better, more informed beliefs and practices. This is perhaps a version of Dewey’s pro-democracy, human perfectibility view. The accompanying Lippmanesque view is that all this Deweyan “educating for democracy” stuff is utopian, the stuff of airy TED Talks and expert panel discussions, while in reality disinformation, conspiracies, and ideological authoritarianism are only going to get much worse before not getting any better, since That's Democracy in the 21st Century, Folks — and, to top it all off, all of this ugliness is perfectly in accord with that exalted gem, Human Reason. Personally, I find Lippmann’s critique of democracy extraordinarily forceful, as did Dewey — only to be matched by Dewey’s tremendous response. In other words, I’m torn, vacillating between optimism and pessimism (though more of the latter recently). At any rate, I thought this was a fascinating and worthwhile (if boring) read, especially when viewed as an ambiguous contribution to the old Lippmann-Dewey debate on the viability of liberal democracy.
Nice book, well-reasoned, denying that people are generally gullible - especially about things that are important to their daily life and decision-making. For every person that falls for a scam, thousands of others ignore it or laugh at it. When mobs of people seem to be inspired by a demagogue to do awful things, it may be that these people have their own longstanding motives for their behavior and the leaders have simply jumped in front of an already existing proto-mob.
Mercier’s justification is that gullibility in humans would have been selected out by evolution - people that are generally gullible would have been victimized to the point where they’d be unlikely to pass on their genes. Sociality in general is a good thing and may imply a bit of gullibility — but just a bit.
In Not Born Yesterday Hugo Mercier argues against the view that human beings are inherently gullible and easily misled (for example through propaganda, advertising, or a foreign or covert influence campaign). I previously read excerpts from this book as part of a Master’s Degree course on Influence Campaigns and Cyber Operations at National Defense University. A good read for the millions of people freaking out over covert or foreign influence campaigns in recent US elections. The book covers several other areas (including fringe beliefs like Flat-Earthers or the Obama-is-a-Muslim nonsense), but most of my notes focus on its main subject: the difficulty of mass influence campaigns. 4 stars.
What follows are my notes on the book:
Mercier starts by examining the evolution of communication in the animal kingdom. Signaling that proves advantageous is kept, and disadvantageous signaling is snuffed out. Communication in various animals, between both friends and predators, evolved in ways that penalize unreliable signals (i.e., that increase the costs of communication). For example, antelope that fake signals of speed to predators get discovered as frauds and eaten; habitual (human) liars get penalized and ostracized from the tribe. However, in the modern world human beings interact in infinitely more complex ways than animals, and it is not practical to implement costly communication in everyday life. The author argues that humans evolved a number of cognitive “open vigilance mechanisms” to communicate without paying those costs every time. Furthermore, because gullibility is too easy to take advantage of and is therefore not adaptive, according to evolutionary theory it should not persist at large scale in the population.
Cognitive mechanisms help us decide how much weight to put on what we hear or read. Are good arguments being offered? Is the source competent? Does the source have my interest at heart? As stated, these methods don’t scale well. So we deal with this through open vigilance mechanisms including plausibility checking and reasoning. Plausibility checking is always on, which makes it tremendously difficult to change people’s minds (especially through mass persuasion techniques like propaganda and advertising).
The author examines the supposedly successful propaganda of demagogues like Cleon (Peloponnesian War) or Hitler (WWII). He lays out a compelling case that these individuals were not remarkably persuasive but caught “the feeling of the people.” In other words, they reflected rather than guided the people’s will. They did not gain power by manipulating crowds but by championing opinions that were already popular. [Not mentioned by name, but it applies to President Trump as well. He didn’t brainwash just under half the country but put forth messaging that tapped into existing cultural, economic, and class beliefs and grievances.]
He dives deeper into Nazi propaganda and shows that sheer exposure had no effect at all on sentiment. The messaging was only successful in regions where pre-existing anti-Semitic beliefs were already strong. Because most of the German population was not highly anti-Semitic, Hitler ultimately achieved his goals only through “terror and legal discrimination” tactics, not popular will.
The author also takes a look at the ineffective propaganda efforts of the Soviets and of Communism under Mao. When these methods proved inadequate, China moved away from propaganda and towards methods labeled “friction and flooding.” Friction is making information harder to access (censorship, blocking keywords, etc.). Flooding is distracting people from sensitive issues by bombarding them with other issues of less importance to the state (like celebrity gossip). This continues today with China’s 50 cent army of online trolls. The US media does a pretty good job of this as well (intentionally or not).
The author argues that the billions of dollars spent on US campaigns are largely wasted, as people rarely change their minds. Even the feared Russian interference in the 2016 election likely made no impact. Any Russian influence campaigns (and they did exist) were highly unlikely to change Clinton voters into Trump voters; rather, they only preached to the choir, reinforcing opinions it already held.
People today choose what they watch. People who watch political shows/news have been shown to have high levels of political knowledge, and they are the same people who are least likely to change their minds in response to what they see on the news. On the vast majority of political issues, average people have no strong opinion (or any opinion whatsoever), which is why so many people use other cues (like party affiliation) to decide what they think on an issue. For example, you could say Barack Obama supported [insert Republican position] and many Democrats would agree with it (and vice versa).
The author also examines advertising and likewise concludes that ads have small effects at best. The most reliable indicator on whether an ad will be effective or not is whether the audience has preconceived opinions. Ads relying on celebrities are only effective if the audience views the celebrity as a trustworthy expert in the relevant domain.
So what about all the people who believed Barack Obama was a foreigner or Muslim? Flat-Earthers? The nut who attacked a DC pizzeria because it was supposedly a front for child trafficking? Or the elderly person who falls for an extremely obvious Nigerian Prince scam? The author argues that if these beliefs are held, it is only reflectively, as some sort of “mind candy.” According to polls, millions of Americans were supposed to believe that children were trafficked in the basement of a DC pizzeria. Yet only one (Edgar Welch) actually stormed the store with a gun to save the supposedly captive children. Millions of others did nothing (besides maybe a 1-star review or trolling comments). Such behavior can only be explained if the belief in Pizzagate is held reflectively, not intuitively. The same goes for people who believed Obama wasn't a US citizen; nobody actually acted as if that were the case.
As for the Nigerian Prince email scams, the author makes an interesting observation: sending out spam is cheap and easy, but actually reeling in a victim is time consuming and costly. The scam is intentionally designed to be absurd because that guarantees that the only people who respond are truly gullible outliers.
For flat-earthers, the author argues that this belief is often held (and exaggerated) more as a means of maintaining prominence or fellowship within an in-group than as a genuine belief in a preposterous theory (better to be a wealthy flat earth YouTuber with a following than a poor nobody).
I really wanted to give this book 5 stars, but I can't go beyond 3. Sometimes I find that essays such as this one are too long for no real reason and should be reduced by 30%. That is not the case here. In fact, I think Not Born Yesterday would have gained a lot if it had been 30% longer—although that extension would probably have weakened the author's thesis. Let me explain.
Why the wish for a 5-star review? Because the approach taken by Mercier is inspiring, original, thought-provoking, and goes beyond the standard take on the topic of trust and belief acquisition. Mercier isn’t satisfied with the opinion that people are just gullible and need to be educated. It doesn’t take much cognitive effort to see that education can’t solve everything and that people are not simply gullible (if they were, why would it be so hard to convince them that they are wrong?).
In the first few chapters, Mercier uses evolutionary biology and psychology to discriminate between different views on trust and communication. Can any species really afford total gullibility? Probably not. The way Mercier develops his concept of open vigilance is fantastic! And the way he applies the framework built in these chapters throughout the rest of the book is definitely fascinating. For instance, I found his analysis of how different types of rumours spread (and to what degree they are accurate and/or believed), and his analysis of whom we trust, particularly insightful.
While I was reading Not Born Yesterday, my reactions were constantly cycling between “uh, good point” and “oh come on, that’s a little dishonest”. At one point, the amount of “oh come on, that’s a little dishonest” became simply unacceptable. Like many thinkers, Mercier seems unable to simply point out a flaw in one theory or make a nice addition to it; he must take the whole thing down and make his contribution appear disproportionately big. That tendency really got me annoyed. For readers with even a modest background in cognitive science, it is hard not to think that every two chapters Mercier avoids confronting important counterarguments, makes relatively solid and nuanced results on a topic appear all wrong, and cherry-picks the studies he presents. Often, I would read a passage and be like “no way, what about x, y, z?” Sometimes, only a few paragraphs later, I would read a quick sentence nuancing his take without affecting his argument, like: by the way, this applies only in that situation, or if we interpret x to mean y. Other times, this nuancing but purposefully hidden remark would be found in a footnote, like: many studies show that [the opposite of what I think], but see my article on this. He also depicts some accomplished researchers in a way that can mistakenly lead one to believe they are complete charlatans or flatly incompetent, when Mercier only touches on one aspect of their work. As I progressed in his book, I—ironically—lost trust in his ability to present a charitable literature review on the topics he wishes to discuss.
So, why did I wish Not Born Yesterday had been 30% longer? Because Mercier often fails to argue conclusively for many of the key points his thesis is built on, and side-steps important critics. For instance: (i) when he argues against gullibility, I wondered why he did not discuss the sunk-cost fallacy, which is relevant in this context (could we be gullible when we first accept a belief, but not gullible when this belief is later challenged?); (ii) when he argues that propaganda does not work very well, I wondered why he didn’t address repetition bias or availability cascades; (iii) when he argues that people rationally evaluate arguments, I wondered why he did not discuss belief bias and confirmation/my-side bias; (iv) and so on…
Overall, this book was surprisingly very fun and annoying to read all at once. I like how Mercier challenges my views and often introduces nice ideas in the quest to better understand how reasoning works in general—I just wished he cherry-picked less, argued more rigorously, and that I could trust him more. I recommend it mostly to people with a background in cognitive science, and one should probably take his words with many grains of salt.
This is a really important book. Mercier argues people aren't all that gullible. If anything, the problem is the opposite: we are difficult to convince of anything we don't already believe, because our passions usually rule our thoughts. Mercier draws on interesting empirical findings, and offers good, clear advice about how to turn down our passions and open our minds to new ideas.
3rd read: This was my third time reading this book, and it’s still one of the best I’ve read about why we listen to certain people and trust them. Hugo debunks a lot of myths about gullibility, and the book helps you understand why people listen to certain figures when the rest of us can clearly see the person is lying or sharing bad information. This book is an excellent source if you’re looking to learn more about human reasoning and behavior. I still have a bunch of questions as I continue to be interested in this topic, but this book always answers most of them.
2nd read: This is one of my favorite books, and I had to read it again. Each day, we’re flooded with information and have a ton of conversations, but why do we trust who we trust? And are we naturally gullible or skeptical? During times of science denial, misinformation, and people having a tremendous amount of reach on social media, we should all understand how trust works. Mercier breaks this down in such a unique way, blending evolutionary psychology with actual data, and he argues that we’re naturally skeptical. I think one reason I love this book is that it’s the only one that doesn’t seem to fully embrace truth default theory, and Mercier has extremely strong arguments about how we get to a place of trusting people. Throughout the book, he also debunks myths about misinformation on social media and other pieces of conventional wisdom that don’t have strong scientific backing. This was my second time reading this book, and I’ll most likely be reading it again.
This book makes a well-presented case that people are not as gullible as they're often claimed to be; rather, what appears to be gullibility is a tendency to follow "evolutionarily valid cue[s]" (p. 73) - including, importantly, the behavior and persuasive efforts of others in our environment.
While there's a lot of concern about fake news and propaganda, the evidence presented here suggests that people are not typically fooled or convinced to act against their interests. Rather, they choose to believe, support, and join groups that already offer something advantageous to them. When the leaders of those groups promulgate information in an effort to persuade people to behave in certain ways, it's not necessarily the information that causes the behavior change. Instead, the information provides a justification for people to behave in ways that they were already inclined toward anyway. Regarding the sentiment attributed to Voltaire that "Those who can make you believe absurdities can make you commit atrocities", Mercier points out that "this is in fact rarely true. As a rule, it is wanting to commit atrocities that makes you believe absurdities." (p. 202)
I learned at least a few things from this book, including the fact that in some languages, the grammar rules require that you "specify how you acquired a given piece of information." (p. 168) Mercier also puts forth a theory explaining the proliferation of blatantly obvious email scams. He points out that "while sending millions of messages was practically free, responding to them cost the scammers time and energy." Therefore, they may have made the messages "voluntarily preposterous. In this way, the scammers ensured that any effort spent engaging with individuals would only be spent on the most promising marks." (p. 251) Hard to say whether this theory is true, but I haven't heard any better explanation for what we observe.
Mercier allows that it still makes sense to combat authoritarianism, misinformation, and other sources of false beliefs, but suggests that we shouldn't expect this to prevent people from making "wrong" decisions that they see as being advantageous to them. The points made here are both heartening (misinformation isn't as damaging as we might have thought) and depressing (people will behave badly anyway as long as it's in their interest to do so).
The one thing I take issue with is the idea that we need to recognize and be sympathetic to people's unstated desires and goals, rather than whether they actually believe what they profess to. While there may be social value in this, it seems to me that we need to take what people claim to believe at face value - it's both patronizing and unfair to do otherwise, and claim that we know what they "really" believe. It seems obvious to me that it should be socially unacceptable to profess something known to be false in order to gain some personal advantage.
In any case, that's a tangential issue; Mercier's main points about trust and belief are solid and intriguing, and this book is a clearly written exposition of those points.
This is a helpful corrective to panic about the effects of "fake news" and propaganda on democracy. Mercier's view is that meaningful, "intuitive" beliefs that bear on action are very difficult to change. That's both reassuring (because people aren't just swayed back and forth by political propaganda) and challenging (since it means it's hard to make substantial progress in changing your opponents' underlying beliefs). That makes the Spinozan view of belief formation behind some recent work in philosophy about the pervasive bad effects of media seem a lot less plausible. The picture that emerges is one that is more or less consonant with the small-c conservatism of Seeing Like a State: local knowledge is good and hard to supplant, and democracy is the best way of channeling that knowledge.
This is an excellent, readable, concise, and clarifying little summary of how people trust information. Because the public debate about these issues is drowning in bullshit about viral misinformation, fake news, the allure of populism, and algorithmically curated echo chambers – bullshit perpetuated, to my eternal shame, also by myself – this book is also incredibly relevant and timely.
The main message is that people aren’t particularly gullible. But don’t take my word for it – read the book.
I read this book partly for my thesis, which is about pseudoscience. Hugo Mercier is a cognitive psychologist who uses many examples from practice and current events to show how we think and how we decide what to believe. Obviously a very topical subject in times of fake news.
The main topics are credibility, gullibility, and the influence of individual life convictions on individuals. This book sometimes reassured me, but there is certainly room for improvement. Either way, it got me thinking about fake news and how we deal with truth, and I will definitely use material from this book in my thesis.
First off, “Not Born Yesterday” is one of my favourite books this year. I enjoyed reading his book The Enigma of Reason, and I found this title, well, his name, while scrolling through my book challenges and found out he had another book. I’ll have to read it again, but parts of this book were a pleasure to read.
A backfire effect from a cognitive bias is rare: whether you think the Suez Canal or the Nile is 75,000 metres long or 120,000, most would agree it is long, so the backfire effect is less likely to happen. Where does a cognitive bias lead? If you’re having a lot of simpleton thoughts, they may not have your best interest at heart.
I have, like the rest of us, said, “What the bleep was that?” I’m a simple guy, not a simpleton; let me be clear, just simple.
Who complies more in a passive, compliance-driven domain? People have a component of weirdness: they do not change their minds, or they fake a consensus. And if an apprentice is compliant and agrees for the sake of an independent conclusion, he has to ignore these cues, on the grounds that all these smart people may not have his best interest at heart.
Archaic as it sounds for a well-read person, I just find TV and broadcast news over budget and too costly, with parts and components getting rarer and rarer. I find paper far better and more affordable. When the TV says “come here,” I want to call it, tell it “text me,” and hang up. A prank phone call might go over once or twice, but any more than that would be harassment and lawsuits today.
Politicians, demagogues, and preachers intercept conformity and gullibility. So do propagandists, campaigners, and advertisers.
Teasing for amusement, a manic riff for amusement: the generalizer, the manipulator, the analogizer of an integrated information theory. A fake consensus: in business, is the mind of every employee and employer an act, a sage, a customer, a CEO, a devil, a god, or a narrator? Stepping into another’s shoes, fracturing information in theory, for a stable conformity and gullibility to control the masses. As it’s said, a jack of all trades is a master of none; but many heads are better than one. They are never going to teach you what you need to know by using both sides of your brain: one side making a problem, the other fixing it. It’s a business; if you haven’t screwed the customer before you come home, you’ll get beat. Teasing for amusement, love bombing: who is the bigger predator? Other trades are brutal; backstabbing friends, ouch; betraying yourself, oh yeah. Nonconformity and non-gullibility are rare.
Or witch confessions that implicated a whole town in witch hunts. Or fake pandemic news being the leading threat in a pandemic. Weirdness never ends; futile fake news. Or shallow gurus, angry pundits, and skilful con men, angry at a group, not at the individual. Many are not going to take on the popular bully or help the Diary of a Wimpy Kid kid.
Most bullies likely grow up to be realtors showing off the biggest driveway.
Get this: Republicans believe they are champions of free trade, which they are not, and that Democrats are against free trade, which they are not. But we’re inundated by news articles saying the opposite. “The case against gullibility” covers a few of the last chapters.
This book is a shout-out that humans are not gullible: if gullibility were simply what allows us to learn from our elders, the costs would be too high. Gerontocracy has the power of memory; as it’s said, grandparents have no roots, children have all their roots. Children think in movies and are very smart but have delays. In adolescence, your brain changes from thinking in movies to an internal monologue, a narrative of those movies. Adults are not very smart but don’t have delays. If gullibility were acceptable, the cost would be too high.
For communication and cooperation to survive gullibility, both senders and receivers must benefit from them. If receivers were excessively gullible, they would be mercilessly abused by senders, until the simple stopped believing what they are told. With bullies, or classroom harassment and name-calling over grades, the gaslighting is far from mere gullibility; as humans, we have mechanisms for vetting what we hear or read. As popular as Jordan Peterson is, he goes so far as to say: let kids skateboard, because if a kid doesn’t stick up for himself, he’ll be an outcast all his life.
In peer interactions, people reject popular but low-influence sources that are overly complex, where more questions just mean more “I’m right and others are wrong.” We move from being influenced by others, and by the views and power of our predecessors, to the views that our own language lets us express.
People believe the C.I.A. took down the World Trade Center, yet it’s odd that the agency couldn’t take down a few bloggers.
We separate such beliefs: a god, cognition, or intuition. Or absurd views, or obnoxious views that alienate everybody. Or the thousands stolen by a Nigerian email scam, while all the people who received those thousands of emails laugh at them. That people are gullible is a misattribution.
I'm sort of conflicted about this one. On the one hand, there's a lot of interesting science in the book and a fairly good framework for putting it together. On the other hand, I didn't run into much that was new, and the one-sided framing in terms of how good humans are at choosing what to believe was sort of off-putting.
For example, most people involved in the pizzagate conspiracy theory only believed in the conspiracy in a performative way and didn't act as if they believed it, so the author takes this to mean that most people aren't actually very gullible. I'm skeptical that this is something to be proud of, since there was that one guy who ended up actually believing it, and since the author also shows the ways many people profess belief in various aspects of established science without really believing them.
The author also shows various ways beliefs about things like witches might be instrumentally rational for the individuals who hold them, due to complex social forces, despite being false and harmful, and argues that this should be counted as a victory for human judgement over credulity. I'm also skeptical of calling this a victory. Then there are tendencies that would have been adaptive in the evolved environment but aren't now, or the ways in which false and ridiculous beliefs shared by a group can increase that group's solidarity. This is all interesting and plausible but still doesn't move my judgement on human competence.
Still, unlike Why We Sleep, the author might be spinning facts, but he seems happy to face them head on rather than hiding bits he disagrees with. Thanks to years of being interested in this sort of thing, I think I would have noticed major omissions, and I didn't. So it's just the dramatic framing or mood that's annoying, rather than a potentially dangerous cherry-picking of facts, that I'm complaining about above.
I'm still giving it four stars, though, because it covers a lot of interesting ground in a single book while balancing the experimental and the theoretical well. It didn't cover much new ground for me, but I think it would be very valuable for most people.
A collection of examples of how humans aren't actually stupidly gullible, or are gullible only with caveats, with some theoretical background up front (epistemic aka open vigilance; see Mercier & Sperber, 2011, 2017 and Sperber, 2000 (?, I think) for a more formal treatment), and a summary chapter recapping everything at the end – check that one if you want a quick dive.
Haven't read Steven Pinker's "Better Angels" yet, but I am guessing these two go together in a "humans aren't a lost cause" kind of genre, which I find preferable over doom & gloom lamentation on human imperfections. It's also more helpful, because it focuses on the nuances of when and how people decide what information is true and what to do with it, and makes people with beliefs different from your own (at least as they declare them) seem less insane and more tolerable.
My main gripe with this book is that the theory could benefit from being more specific. Open vigilance by itself is not a proper theory per se; it's more of a concept that the author applies to interpret inconsistencies in the data on how humans do or don't believe certain things. This kind of retrospective application weakens the punch, opening room for the just-so-story critiques commonly leveled against evolutionary psychology. From memory of reading some of his and Sperber's previous work, there is more meat to it, which could have been included. I'm guessing from the general vibe of the book that it was trying to tilt more towards 'popular science', which could explain this (although it's somewhat unexpected, since it's a university press).
I have never really been convinced about mass suggestion, the idea that we are dumb in masses and just follow the majority. Maybe that's because I've never been part of the majority; as a political libertarian, I know that nobody can judge me as merely following a popular view. I have encountered the thought that people who vote for the status quo simply follow no mind but the majority's, but even then, talking to these people easily shows that they have thought through their ideas. This idea is thus something you never pin on yourself but is very easy to pin on others, which is why so many think there is something to it.
I'm happy that I encountered Hugo Mercier, who puts this assumed truth to the test and gives good arguments that it isn't true. He is not always equally convincing, but convincing enough for me to be careful when I feel the urge to judge others as merely trusting or believing whatever is mainstream and easy rather than thinking for themselves. I do still believe that most people are misguided in what to believe or whom to vote for, but even that is based on the information and experience they have.
I like this kind of reading, which challenges something we all took to be so but had a hunch was not so, really.
Mercier is tackling an important subject and does so with thoroughness and excellent analysis.
I want to believe that most people know how to evaluate whom to trust. Mercier puts together a careful analysis of how and why people make the decisions they do and how they figure things out. I just think that the prior cultural tools he describes aren't able to keep up with today's online world.
It is a book worth reading, studying, and wrestling with - in that I haven't yet figured out if I agree or disagree with him. I guess I will just have to come back to re-read this book to evaluate it further.
He lays out studies that show that people, average and ordinary, including all of us, can figure out the truth and what to believe. At least in principle.
Our online world has just made such a mess of things.
Perhaps this book needs to be balanced with the next book I will be reading: Cass R. Sunstein's Going to Extremes: How Like Minds Unite and Divide.
Mercier poses (and attempts to prove) an interesting thesis: that people are not gullible but rather err on the side of caution. We reject more information than we accept. We could and should be more open, but vigilantly. The problem is that over the past couple of years, too many things have happened to accept Mercier's thesis at face value. The book was written pre-corona. It does discuss anti-vaxxers, but his conclusions on the subject do not hold in the face of the sheer amount of covid anti-vax sentiment and the accompanying conspiracy theories, which are more widely believed than Mercier would probably accept. Add to this that Mercier, when setting out the presuppositions he wishes to disprove, does so by giving flawed examples which can be easily refuted, thereby building a flawed opposing case to topple.
All in all, a valiant try, but Mercier comes up short.
This book is remarkably upsetting. It confronts many points of view that are considered true and, with evidence, discards them (at least a part of them). We are not as gullible as we think we are, for example. Probably what happens is that, in terms of personal convictions, we are pointing to the wrong origins (propaganda) instead of focusing on real and tangible sources (human nature and psychology). I was excited to read about how fake news does not create opinions but rather confirms them. The author insists on the challenge of introducing a very much needed alternative of rational processes in understanding human argumentation (which is currently the exception). Once I finished this book, it left me a lot to think about and an unexpected desire to read it again.
I would probably give this a 3.5 star if I could. I like the concept of taking commonly accepted views and turning them on their head. That's essentially the recipe for this book: 1) take a commonly accepted view of persuasion/gullibility/etc. (e.g., Nazi propaganda was an interesting one in the book), 2) turn that view on its head by narrating and weaving in actual scientific or expert accounts to suggest ineffectiveness or alternative explanations for the original view, 3) rinse and repeat. It is a somewhat effective approach in terms of structuring, but can get a bit stale by the end. Overall, this was solid, though.
It was a rough start, but at some point I started to enjoy the book.
The point of the book is that we aren't gullible, because we have an open-vigilance mechanism to gauge whether something is true or not. That mechanism makes us more resistant to new ideas; to believe something, we need the right cues to accept it as true.
I think the author sometimes went round and round, which made me lose track of the narrative.
This is a really interesting read. Although, to be honest, I don't fully agree with his argument. I like the idea of "open vigilance" - meaning we should be open to new ideas and test them out. But I disagree that open vigilance is essentially an evolutionary adaptation which proves people aren't as gullible as you'd think. Some of his arguments felt a bit circular to me, and I don't think the logic fully holds. But I like the concept as a personal approach.
3.5 A promising start quickly becomes dry and at times meanders into a tangent corn maze before coming back around to the matter at hand. Overall, some excellent research and application of theory to hold up the truths and dispel the old wives' tales. Given better writing and structure, perhaps a bit more editing, this could pull out a 5.
We don’t get brainwashed as much as we glom on to ideas that fit our already existing intuitions. Also, trusting risks betrayal but not trusting means you lose out repeatedly on valuable information. Overall, trusting is wiser. A well-reasoned and fun read like The Enigma of Reason. A bit long for the points made, but I’m told that such is the nature of book publishing….
An interesting and somewhat convincing argument that humans are far less gullible than we have been led to think. A timely piece in the age of fake news, and a somewhat decent reflection on how people can be misled. I didn't find it a very gripping read; it seems like yet another book praising the newly discovered "god" of rationality.