This book identifies eight key mechanisms that can transform a set of ideas into a psychological flytrap. The author suggests that, like the black holes of outer space, from which nothing, not even light, can escape, our contemporary cultural landscape contains numerous intellectual black holes—belief systems constructed in such a way that unwary passers-by can similarly find themselves drawn in. While such self-sealing bubbles of belief will most easily trap the gullible or poorly educated, even the most intelligent and educated of us are potentially vulnerable. Some of the world’s greatest thinkers have fallen in, never to escape.
This witty, insightful critique will help immunize readers against the wiles of cultists, religious and political zealots, conspiracy theorists, promoters of flaky alternative medicines, and various other nutcases by clearly setting out the tricks of the trade by which such insidious belief systems are created and maintained.
Stephen Law is a philosopher who teaches at Heythrop College in the University of London. He also edits the journal THINK, a source of philosophy aimed at the general public, affiliated with The Royal Institute of Philosophy.
A must-read for anyone interested in the pursuit of truth. The author is a former postman who went on to attain his doctorate in philosophy. He now lectures at the University of London. His interests include philosophy of religion from an atheist perspective. In this provocatively titled new work, Law examines the strategies employed by proponents of ridiculous belief systems such as homeopathy, conspiracy theories and some (but not all) religions. These strategies include “Playing the Mystery Card”, “Piling Up the Anecdotes” and what he calls “Pseudoprofundity”. The best parts of the book include discussion of the “Going Nuclear” strategy, in which the bullsh*tter employs radical scepticism to attack all beliefs (after all, how do we know we aren’t all brains in vats?) in order to deflect legitimate criticism of their crazy beliefs. Law also skewers Young Earth Creationists (YECs) who artfully attempt to immunize their beliefs from falsification. In his chapter “‘But It Fits!’ and The Blunderbuss”, he describes how YECs answer the question: How did Noah feed all his animals while they were at sea? Answer: they hibernated. How did polar bears and possums make it to Noah’s Ark? Answer: there were no separate continents, and the force of the Flood broke them apart. Law continues by explaining how the same ad hoc manoeuvres employed by YECs could be applied by someone who insists that dogs are actually spies from Venus. Faced with the objection that dogs can’t speak, the believer could suggest that dogs just choose to hide their ability to speak from humans. When presented with the fact that no life can be observed or sustained on Venus, the believer can respond by suggesting that the dogs live in deep underground bunkers ... and so on and so on.
Believing BS: How Not to Get Sucked into an Intellectual Black Hole by Stephen Law
"Believing BS" is an informative book that identifies eight key mechanisms that can lead ideas into an intellectual abyss. Philosopher, educator and accomplished author Stephen Law provides an interesting book that will help immunize readers against the follies of poor thinking. It's an exposé of the popular rhetorical tricks used to defend BS belief systems. The author provides many practical examples and shows us quite clearly how to avoid being sucked into these intellectual black holes. This helpful 271-page book includes the following eight chapters (mechanisms): 1. Playing the Mystery Card, 2. "But It Fits!" and The Blunderbuss, 3. Going Nuclear, 4. Moving the Semantic Goalposts, 5. "I Just Know!", 6. Pseudoprofundity, 7. Piling Up the Anecdotes, and 8. Pressing Your Buttons.
Positives: 1. A well-researched and accessible book. The author has a pleasant, engaging style. 2. Despite the provocative title, I found the book to be fair, reasonable and even-handed. 3. Succeeds in achieving its main goal of providing readers with intellectual tools to defend against intellectual black holes ("belief systems constructed in such a way that unwary passers-by can similarly find themselves drawn in"). 4. Explains eight key strategies in detail, providing illustrations that clearly show how they are applied and what's wrong with them. 5. Includes many religious examples such as Young Earth Creationism and Christian Science. Young Earth Creationism is debunked with just the following: "What of the seasonal layers of ice found at the poles, the drilled-out cores of which reveal a seasonal history dating back hundreds of thousands of years?" 6. Provides plausible explanations of why we are predisposed to believe in invisible agents, such as the Hypersensitive Agent Detection Device (H.A.D.D.): "Thus evolution will select for an inheritable tendency to not just detect--but overdetect--agency." 7. The problem of evil strikes again. "Even if God had to allow some evil for the sake of certain greater goods, surely he could have no reason to allow quite so much." "In any case, what about the countless generations of humans that suffered before the Bible was written?" 8. Many interesting philosophical questions: "Is it true that beliefs about supernatural agents, gods, powers and other phenomena are essentially immune to scientific refutation?" Find out. 9. The scientific method, always a worthwhile discussion, along with the value of other approaches like philosophy in making reasonable refutations. "What a scientific theory requires if it is to be credible is not merely consistency with the evidence but confirmation by the evidence--the stronger the confirmation, the better." 10. Good quotes always add value to a book: "It is important to stress that what we are looking at here is not a mere absence of evidence for the claim that crystals have such effects, but rather that it is some positive evidence of the absence of any such effects." 11. I like the concept of genuine confirmation of a scientific theory. "The theory must make predictions that are: 1) clear and precise, 2) surprising, and 3) true." 12. Going Nuclear as a last-ditch strategy to avoid defeat that lays waste to every position. The two main variants of Going Nuclear: skeptical and relativist. Many good examples. 13. Effing the ineffable. "What I'm objecting to is the unjustified and partisan use of this suggestion to immunize Theism against powerful counterarguments, while at the same time allowing a degree of effability whenever, say, there appears to be something positive to be said in its favor." 14. Puts believing something in perspective. "Perhaps the most obvious way in which you might be justified in believing something is if you have good evidence that what you believe is true." 15. Pseudoprofundity exposed. "Mockery may be both useful and legitimate if we can show that it is deserved." 16. Using anecdotes instead of significant evidence to support a supernatural claim. "What would be more impressive is if, say, after being prayed for, someone's amputated leg grew back." Agreed. 17. The power of suggestion. "Expectation strongly shapes perception." Many great examples. 18. Belief-shaping mechanisms...brainwashing. The five core beliefs behind it. 19. A very good summary of the eight mechanisms and the main nine examples. 20. Notes included and linked.
Negatives: 1. The book is overall a bit uneven. That is, some topics get the royal treatment while others get glossed over. For example, using mockery as a tool: I was hoping for a little more depth on a tool I believe is underrated in its effectiveness. 2. I didn't really care for "The Tapescrew Letters"; sure, it brings everything together, but it didn't do much for me. 3. No formal bibliography.
In summary, I really enjoyed reading this book. Law succeeds in providing the public with an accessible tool to defend against intellectual black holes. He defines new terms well and provides many examples that clearly illustrate belief-shaping mechanisms in practice. There were perhaps a couple of missed opportunities; the power of ridicule seems to be a very effective tool that received little ink. That being said, this turned out to be an informative and helpful book. I recommend it!
Further recommendations: "The God Argument: The Case against Religion and for Humanism" by A.C. Grayling, "Sense and Goodness Without God: A Defense of Metaphysical Naturalism" by Richard Carrier, "The Science of Good and Evil: Why People Cheat, Gossip, Care, Share, and Follow the Golden Rule" and "The Believing Brain: From Ghosts and Gods to Politics and Conspiracies---How We Construct Beliefs and Reinforce Them as Truths" by Michael Shermer, "A Rulebook for Arguments" by Anthony Weston, "The Philosophy of Science" by Samir Okasha, "42 Fallacies" by Michael C. LaBossiere, "50 Popular Beliefs That People Think Are True" by Guy P. Harrison, "Evolution vs. Creationism: An Introduction" by Eugenie C. Scott, "The Rocks Don't Lie: A Geologist Investigates Noah's Flood" by David R. Montgomery, "God's Problem: How the Bible Fails to Answer Our Most Important Question--Why We Suffer" by Bart D. Ehrman, and "The Fallacy of Fine-Tuning: Why the Universe Is Not Designed for Us" by Victor Stenger.
We want to be rational, says Stephen Law. We also find ourselves drawn, for whatever reason, toward Intellectual Black Holes, such as believing in supernatural beings or medicines that aren’t scientifically proven to work. To deal with the cognitive dissonance of our self-understanding, we find strategies to help ourselves believe that we “are not being nearly as irrational as reason might otherwise suggest”. (p. 19)
He outlines eight strategies. I am using one of these strategies, which I paraphrase below, if I… (1) …appeal to the claim that “God works in mysterious ways.” I would make such a statement to claim that my inability to explain what I’m talking about should not be counted as a strike against my theory. (2) …launch an overzealous attempt to force evidence to fit my ridiculous theory. (3) …backpedal and try to escape upon noticing that I am losing an argument, and if I try to make this escape specifically by pointing out that no one can really know anything (radical skepticism) or that different people have different truths and we can’t judge or critique each other’s beliefs, after all (what is commonly referred to, although I dislike this term, as a type of “relativism”). (4) …change the meanings of my established terms in the middle of an argument. For example, if I say, “Everyone should believe in God,” do I intend to demonstrate that a godlike being exists or do I intend to advocate for the adoption of religious behaviors? There is possible ambiguity in what it means to “believe in” something or someone as well as in what is meant by “God.” (5) …attribute my beliefs to some special intuition that I do not identify. (6) …cloak my beliefs in aphoristic statements that seem profound but, once unpacked, are revealed to be nonsense. (7) …employ a list of anecdotes that seem to support my position instead of a logical argument or rigorous data to prove my point. (8) …use brainwashing techniques (“isolation, control, uncertainty, repetition, and emotional manipulation,” as the author quoted from Kathleen Taylor’s analysis).
While I basically agreed with the author’s ideas, I found this presentation a little incomplete for the following reasons.
The author seemed to have multiple focuses. Sometimes he spoke about the psychological mechanisms that enable people to maintain their own foolish beliefs. Other times, he spoke about the rhetorical strategies that are used in (un)persuasive arguments of the caliber one finds in a college dormitory. These internal psychological mechanisms and external rhetorical strategies must be related, as the arguments we have had recited to us and that we continue to recite to ourselves mentally will be similar to the arguments we choose to recite to others to propagate the worldview. This is my own observation. I don’t think it was in the book. I think an acknowledgement something like this could have more strongly informed the book’s structure. Of the eight strategies given above, each seems to have an internal and external meaning. It would have been useful to break down when each was being talked about.
Furthermore, some of the eight strategies seem overlapping, and they were not assigned any particular order. I suggest it may be possible to impose an order such as “stages of self-awareness” or “stages of sophistication in persuasive ability.” For example, when just starting out, I might say that I “just know” that God exists (5); when someone challenges me about the world not appearing to be designed or maintained by a benevolent God, I might say that God’s reasons aren’t the same as human reasons and can’t be known, which means I am denying the validity of the objection rather than answering it (1); and then I might walk away, saying that, since no one can know anything about God anyway (contradicting my own original point!), atheism has no special advantage over theism, so there’s no reason to argue (3). Realizing I performed very badly in that debate and preparing to make a more studied effort, I might go home and spend time developing more extensive arguments which unfortunately remain bad (2, 7) and resort to an alternative definition of God whenever I find myself unable to prove my original claim (4). If I elevate my bad arguments to a leadership level, I might pre-package some aphorisms for future use (6) or study how to brainwash my followers (8). When presented in this order, I start to see one potential narrative of how false beliefs receive layers of justification and "hardening," but this is my own interpretative work, not a sequence that was provided in the book.
As a side note to clarify the choice of title, Stephen Law is aware of Harry Frankfurt's definition of "bullshit" as something one says without caring whether it is true. Law challenges that definition, saying that "bullshit artists" often do have sincerely held beliefs but in that case they have simply managed to fool even themselves. I think the difference is that the former type could call themselves a "bullshit artist" because they are aware they are making up stories while the latter type would never use this as a term of self-reference because they believe themselves to be making factual statements.
I was interested in this based on seeing it somewhere else. Originally I thought it would be more about what people believe and why they believe it and how those "bullshit" beliefs could be countered. Instead it was Stephen Law's personal ideas on why people believe nonsense and how that can be combated generally.
The writing is meandering and repetitive. The whole thing could be summed up in 20 pages instead of 200. It was an easy read, but it also wasn't very interesting. The author started by giving reasons for not believing in a Christian god, but then used that to argue against belief in "God" generally, without ever mentioning any other religion, as if Christianity is bullshit but Islam, Hinduism, etc. are not. Now, I'm not saying any of those beliefs are or are not silly, but it seemed to needlessly attack one set of beliefs while ignoring all others.
There are citations, but generally, the whole book just seems very simple yet written by someone who seems to find his own writing profound. Not very interesting.
This book attempts to present a number of logical fallacies, and other shady tactics, in an accessible manner. It fails to do so. While it tries to cover concepts which can be hard to wrap your brain around, I think the biggest failing is the use of poor phrasing.
Another problem is that the author relies too heavily on shooting down religious ideas, such as Creationism. I am actually an atheist myself, but I felt that he could have easily found more examples in other areas to convey his points, and should have spent more time trying to express himself clearly.
The book does contain some good information, but I would suggest looking for other books about critical thinking instead of this one.
I wanted to like this book a lot more than I did. The material is useful, but has more padding than I needed. The book could also use some closer editing.
While I found the parallel argument for an evil god (mimicking the stereotypical argument for a good god) thought-provoking, I found the parallel rewriting of C.S. Lewis's _Screwtape Letters_ just a little embarrassing.
A fascinating book that should be taught to everyone, especially in an era when we are so concerned about fake news and pseudoscience.
If you are religious, though, the book is probably not for you, because it is mainly dismantling religious beliefs. Hence the negative reviews - people have to distinguish between anecdotal evidence and the analogies the author uses to simplify his points.
The book's simple approach can be applied to any bullshit ideas - astrology, ideas spread by public figures on various topics (gurus), politicians, populists and others.
The best exercise is to take the chapters and apply them to any ideology or religious belief (postmodernism, for example, is used in the book), or indeed any kind of belief. My favourite is the last one, about how beliefs are shaped, so let's apply it to a fashionable modern belief - how about some post-modern agenda (social justice, for example)? The leaders of such movements use all the tricks mentioned in the book: 1) Isolation, even when living in a social setting - the important thing is to separate potential supporters (young people), surround them with the ideas you need, and label any dissent as fascist or right-wing (in the case of religion it would be heresy). 2) Control over the information your victims are exposed to - stifle and censor anything that could be dangerous to the ideology you are promoting; label anything you dislike as "hate speech". 3) Spread uncertainty about your opponents' beliefs - it is best to even demonise them. 4) Repetition - a well-known trick - repeat the same words - patriarchy, racism, fascists and so on - as often as possible, as mantras or slogans (similar to prayers a couple of times per day). 5) Emotion - the messages spread have to be heartbreaking, involving children and awful crimes, and it is perfect if graphic images of dead children or cities destroyed by war can be used. And there you go - you have a successful ideology that has no rational or scientific backing.
In conclusion - instead of censoring Facebook or Youtube, let's teach people to identify bullshit and easily reveal it. It can be done by beginning to read this book.
I read this book not to win any religious debate, but as a facilitator and dialogue mapper, to be able to better understand patterns of conversation and how my capture of stakeholder rationale can be improved.
I found the book pretty fun and educational. I liked the writing style, with its subtle and not-so-subtle use of sarcasm and occasional tutorials as to how you too can become the next "guru". The conclusion chapter is definitely the best one of the book, and could have done with greater expansion.
Some chapters offered practical ways to put your newly formed bullshit radar to use, but others just taught you to recognise bullshit for what it is. I think there is at times a missed opportunity for readers to further develop their bullshit-defeating kung-fu tools.
On balance it was an easy read but some bits get a little trippy (as tends to happen with philosophical arguments). Since it focuses on religious arguments, it is likely to put many people off. This is a pity, because I feel what is written here has much greater applicability...
Believing bullshit = believing in any religion whatsoever. That is the terse synopsis of this pedantic and extremely boring pamphlet. The approach of this book immediately reveals a betrayal of the reader: it does not deal with the most typical arguments the author has isolated and classified from today's rich landscape of charlatans and swindlers ready to sell even their own mother for a profit. In fact, all of Law's effort is channelled in a single, repeated direction: everything promulgated by religious texts (with particular ferocity toward Catholic/Christian ones) is bullshit. It seems incredible to me that, after a few millennia of evolution and a few centuries of the scientific method's success, we are still stuck at Law's point. Who cares if religious texts recount events and try to explain happenings that are utterly impossible to prove scientifically! It seems obvious to me that an atheist will never be interested in religious arguments, because he recognizes a very different god (science), while no believer will see any certainty of his faith collapse simply because the scientific method does not vouch for its reliability. How tedious. Clearly scientism and religion travel on different, incompatible tracks. Science aims to explain as many things as possible through the application of a rigorous method that should be infallible and bomb-proof; religion tends to explain the "mysteries" of life (to which science certainly cannot give answers) by speaking in metaphors, striking the imagination of an audience that was, more often than not, without higher learning (at least in the past), and thereby imparting important life lessons.
If one then wants to dissect the Old Testament and, faced with statements such as "the Earth is flat and the sun revolves around it", or "pairs of every land animal entered Noah's Ark", declare with self-satisfaction that they are all bullshit, well, one can certainly do so, and from a literal point of view one would even be right. But in my opinion the biggest bullshit of all is spending time and words on such banal and obvious analyses, precisely because one refused to understand from the start that a religious text is not a scientific text. That simple observation would have been enough for the author to reach, before sending the work to press, the epiphany that the real bullshit is this book's existence. Assuming, that is, that the author was acting in good faith.
I don't agree fundamentally with Law's secular values and atheistic free-thought. But I think this is a witty and thoughtful book. It discusses a series of dishonest argument tactics employed by people to shut down honest debate. Playing the Mystery Card, for instance, gives one an ability to endlessly evade objections, by just shrugging one's shoulders and claiming the issue is mysterious. Piling Up the Anecdotes means that one keeps telling favorable stories that do not amount to statistical significance. Law closes the book with a Screwtape Letters type spoof in which he pretends to be someone trying to induce someone to join a religion. I took issue with this because it made evangelization somehow seem dishonest. I guess Law prefers people to lack any sense of meaning or objective value in their life.
TL;DR Philosopher dissects why smart people believe stupid things through the arguments they make for it
Law creates a framework to analyze different methods through which intellectual black holes are defended. He seeks to describe the logical fallacies associated with convincing other people of unlikely things - the existence of a benevolent God versus an evil God, the impending alien invasion, the effectiveness of Homeopathy.
I didn't enjoy the book as its conclusions and its theory felt pretty obvious. It is not a psychological study of people and their experiences, but rather a well reasoned philosophical outline of how the logical slippages work.
Books like this are really empowering. They help inoculate you against the illusory "reasonableness" of the ideas of con men, liars and cultists, and show you how they trick even smart, mostly rational people into believing irrational things. A must-read for those on the quest to find the truth of reality.
A great guide to identifying and avoiding intellectual black holes. Presents a "relatively" unbiased view of various intellectual black holes while explaining how they have managed to avoid being taken down by scrupulous rationalists (and leaving a tad bit of doubt about my own understanding of rationality). An interesting read.
This is supposedly a book on critical thinking. It is, but it largely reflects the author's opinion. For me it rambled on too much. The very title also reflects one of the propagandist techniques taught in most critical thinking courses: using "Bullshit" is an excellent example of name-calling.
Okay for what it was, but misleading. It's more direct logic than psychology, sociology, or an exploration of how false beliefs work in a broader sense. Also far more focused on religion than the descriptions suggested.
In this book, Stephen Law proposes eight ways or strategies people use to pull you into an intellectual black hole. These mechanisms are as follows:
1. Playing the Mystery Card: which involves immunizing your beliefs against refutation by making unjustified appeals to mystery. For example, deal with any scientific evidence against your paranormal beliefs by insisting, without justification, that what you believe is “beyond the ability of science to decide.”
2. “But It Fits!”: which involves coming up with ways of making evidence and theory “fit” after all. Any theory, no matter how absurd, can be made consistent with the evidence (even a theory that says dogs are Venusian spies).
3. Going Nuclear: which involves exploding a skeptical or relativist philosophical argument that appears to bring all beliefs down to the same level, rationally speaking. You can thereby force a draw in any debate. Then, once the threat has receded, the skepticism/relativism may be conveniently forgotten.
4. Moving the Semantic Goalposts: which involves dodging possible refutations by switching back and forth between meanings (between effable and ineffable meanings, for example).
5. “I Just Know!”: which involves suggesting that the truth of your belief has somehow been revealed to you, by, for example, some sort of a psychic or god-sensing faculty (this suggestion is unreasonable if you are aware of grounds for supposing that at least a large proportion of these supposedly revelatory experiences are, in fact, delusional).
6. Pseudoprofundity: which is the art of making the trite, false, or nonsensical appear both true and deep. Various linguistic recipes are able to create the illusion that you have achieved some sort of profound insight into the human condition.
7. Piling Up the Anecdotes: Anecdotes are in most cases almost entirely worthless as evidence, particularly in support of supernatural claims. But they can be highly persuasive, especially when collected together.
8. Pressing Your Buttons: which involves reliance on certain kinds of non-truth-sensitive techniques for shaping belief, such as isolation, control, uncertainty, repetition, and emotional manipulation. These techniques are a mainstay of the “educational” programs of many cults and totalitarian regimes. Applied in a consistent and systematic way, they amount to brainwashing.
For example, those who believe in the efficacy of homeopathy typically rely heavily on anecdotal evidence to create the impression that what they believe is reasonable. They also tend to Play the Mystery Card. That's because there's a puzzle about how homeopathy could possibly work, given that it involves diluting substances to such a degree that there's not even a molecule of the supposedly active ingredient left.
Those who claim they have been abducted by aliens normally play the "I Just Know!" card. Just like religious experiences, abduction experiences tend to halt at certain culturally significant national borders. Those abducted in the US report a different story than those who live elsewhere, for instance.
Self-help gurus rely on pseudoprofundity to peddle their bullshit. There's often some value in what these self-styled experts say. However, as often as not, what the self-help “experts” are selling are little more than truisms padded with Pseudoprofundity. The impression that they possess THE SECRET is often bolstered by means of numerous testimonials from people who say that they followed the advice and now they are millionaires or happily married or whatever. That is to say, sellers of THE SECRET tend to rely heavily on Piling Up the Anecdotes to generate the impression that there's good evidence to back up their claims.
If we want to immunize the next generation against such bullshit, let's at least make sure they understand the warning signs. This book is a good start.
Why do people believe in alien abduction, 9/11 trutherism, or that the earth is a mere 6,000 years old? It’s due to eight persuasive techniques that Stephen Law analyzes, showing the flaws in each. An English philosopher who formerly lectured at Heythrop College, University of London, Law now edits the philosophical journal Think. He writes for the layman, however, as his book title suggests.
Playing the mystery card is one of the techniques used to persuade people of things for which solid evidence is clearly lacking. When they don’t have a good answer, persuaders may say it’s a mystery that mere humans can’t understand.
More than four in ten Americans subscribe to the view that the earth is between 6,000 and 10,000 years old. Saying “but it fits” is a technique that young earth creationists prefer. They select bits of evidence and adjust their theory to fit them. The fossil record, for example, is a huge stumbling block. But inventive creationists claim the record actually supports their theory, which includes the assertion that dinosaurs co-existed with early humans. Ken Ham, who operates the Creation Museum and the Ark Encounter, says that “all the evidence supports exactly what the Bible says.”
Piling up anecdotes is a popular method of persuasion. For instance, lots of people say they feel better when using crystals. Experiments show that the crystals themselves have no healing powers. The power of suggestion, however, is potent.
Christian Scientists say that tens of thousands of anecdotes of healing by prayer demonstrate the efficacy of their faith, which eschews modern medicine. Though they keep track of supposed healings, they ignore cases where their patients remained sick or died. Anecdotes also don’t account for the fact that the body heals some maladies with time. There are also the inconvenient findings of five double-blind studies conducted between 1999 and 2006 testing whether patients who were prayed for did better than a control group of patients who weren’t. All five failed to find improvement in the first group compared to the second.
Other dubious persuasive techniques are going nuclear, moving the semantic goalposts, “I just know,” pressing your buttons, and pseudoprofundity.
One error the author makes is when he cites the Biblical chapter and verse where Noah was told how big to make the ark. Law lists it as Genesis 16:2, but it is actually Genesis 6:15.
This book helps readers identify the methods used by cults, purveyors of worldwide conspiracy theories, and believers in the flat earth and the Loch Ness monster, among other things. I found it readable and useful.