Kindle Notes & Highlights
by David Robson
Read between April 1 – July 27, 2019
Jack is looking at Anne but Anne is looking at George. Jack is married but George is not. Is a married person looking at an unmarried person? Yes, No, or Cannot Be Determined?
Cognitive scientists such as Keith Stanovich describe two classes of rationality. Instrumental rationality is defined as ‘the optimisation of someone’s goal fulfilment’, or, less technically, as ‘behaving so that you get exactly what you want, given the resources available to you’. Epistemic rationality, meanwhile, concerns ‘how well your beliefs map onto the actual structure of the world’. By falling for fraudulent mediums, Conan Doyle was clearly lacking in the latter.
One of their most striking experiments asked participants to spin a ‘wheel of fortune’, which landed on a number between 1 and 100, before considering general knowledge questions – such as estimating the number of African countries that are represented in the UN. The wheel of fortune should, of course, have had no influence on their answers – but the effect was quite profound. The lower the number on the wheel, the smaller their estimate – the arbitrary value had planted a figure in their mind, ‘anchoring’ their judgement.6
A truly rational thinker should consider both sides of the argument, but Perkins found that more intelligent students were no more likely to consider any alternative points of view.
Even student lawyers, who are explicitly trained to consider the other side of a legal dispute, performed very poorly.
Scientists today use the term ‘motivated reasoning’ to describe this kind of emotionally charged, self-protective use of our minds.
smart people do not apply their superior intelligence fairly, but instead use it ‘opportunistically’ to promote their own interests and protect the beliefs that are most important to their identities. Intelligence can be a tool for propaganda rather than truth-seeking.34
The same polarisation can be seen for people’s views on vaccination,37 fracking38 and evolution.39 In each case, greater education and intelligence simply helps people to justify the beliefs that match their political, social or religious identity. (To be absolutely clear, overwhelming evidence shows that vaccines are safe and effective, carbon emissions are changing the climate, and evolution is true.) There is even some evidence that, thanks to motivated reasoning, exposure to the opposite point of view may actually backfire; not only do people reject the counter-arguments, but their own …
When the statement was accompanied by a stock photo of the singer, they were more likely to believe that the statement was true, compared to the participants who saw only the plain text.
Once again, the photos added no further evidence, but significantly increased the participants’ acceptance of the statement.
detailed verbal descriptions (such as of the celebrities’
Perhaps the most powerful strategy to boost a statement’s truthiness is simple repetition.
But they were almost as convinced by the argument when it came from a single person, multiple times.
To make matters worse, the more we see someone, the more familiar they become, and this makes them appear to be more trustworthy.14 A liar can become an ‘expert’; a lone voice begins to sound like a chorus, just through repeated exposure.
This tactic was regularly employed by tobacco industry lobbyists in the 1960s and 70s. The vice president of the Tobacco Institute,
With repetition, their message begins to sound more trustworthy – even though it is only the same small minority repeating the same message.
To make matters worse, attempts to debunk these claims often backfire, accidentally spreading the myth. In one experiment, Schwarz showed
By repeating the claim – even to debunk it – you are inadvertently boosting its truthiness.
This may help to explain why more educated people seem particularly susceptible to medical misinformation: it seems that fears about healthcare, in general, are more common among wealthier, more middle-class people, who may also be more likely to have degrees. Conspiracies about doctors – and beliefs in alternative medicine – may naturally fit into that belief system.
Despite a sixteen-month, $10 million campaign retracting the statements, the adverts were only marginally effective.19
‘information deficit model’
They are then repeated again, as bold headlines, underneath. According to the latest cognitive science, this kind of approach places too much emphasis on the misinformation itself: the presentation means it is processed more fluently than the facts, and the multiple repetitions simply increase its familiarity.
If possible, you should avoid repeating the myth entirely.
It’s better to headline your article ‘Flu vaccines are safe and effective’ than ‘Myth: Vaccines can give you the flu’.
Instead, they argue that it is best to be selective in the evidence you present: sometimes two facts are more powerful than ten.
If you are trying to discuss the need for companies to pay for the fossil fuels they consume, for example, you are more likely to win over conservative voters by calling it a ‘carbon offset’ rather than a ‘tax’, which is a more loaded term and triggers their political identity.
The World Health Organisation announced that they had now adopted these guidelines to deal with the misinformation spread by ‘anti-vaccination’ campaigners.22
students at Ivy League colleges – answer, on average, only between one and two of the three questions correctly.
Keith Stanovich’s rationality quotient.
Gordon Pennycook
Pennycook’s research would seem to imply that we could protect ourselves from misinformation by trying to think more reflectively – and a few recent studies demonstrate that even subtle suggestions can have an effect.
One tantalising experiment has even revealed that a single meditation session can improve scores on the Cognitive Reflection Test – a promising result, if it can be borne out by future research that specifically examines the effect on the way we process misinformation.
Pennycook has, incidentally, shown that reflective thinking is negatively correlated with smartphone use – the more you check Facebook, Twitter and Google, the less well you score on the CRT. He emphasises that we don’t know if there is a causal link – or which direction that link would go – but it’s possible that technology has made us lazy thinkers. ‘It might make you more intuitive because you are less used to reflecting – compared to if you are not looking things up, and thinking about things more.’
Could it be that on social media you have to comment quickly for your opinion to gain visibility, leaving no time for proper reasoning? A quick-witted comment fares better than an analytical one.
As a consequence, your well-meaning attempts to protect yourself from bad thinking may fall into the trap of motivated reasoning.
Patrick Croskerry’s
‘inoculation’
the approach can be very powerful.
Even more importantly, the inoculation had neutralised the effect of the misinformation across the political spectrum; the motivated reasoning that so often causes us to accept a lie, and reject the truth, was no longer playing a role.32
inoculation works despite your political background,’
‘Regardless of ideology, no one wants to be misled by logical fallacies – and that is an encour...
inoculation theory shows that we need to be taught about it explicitly, using real-life examples that demonstrate the kinds of arguments that normally fool us.34
Even more importantly, these courses also seem to improve measures of critical thinking more generally – such as the ability to interpret statistics, identify logical fallacies, consider alternative explanations and recognise when additional information will be necessary to come to a conclusion.35
these measures of critical thinking don’t correlate very strongly with general intelligence, and they predict real-life outcomes better than standard intelligence tests.
The first step is to learn to ask the right questions:
Who is making the claim? What are their credentials? And what might be their motives to make me think this?
What are the premises of the claim? And how might they be flawed?
What are my own initial assumptions? And how might they be flawed?
What are the alternative explanations for their claim?
What is the evidence? And how does it compare to the alternative explanation?
What further information do you need before you can make a judgement?
Do they actually add any further proof to the claim – or do they just give the illusion of evidence?
Is the same person simply repeating the same point – or are you really hearing different voices who have converged on the same view?
Are the anecdotes offering useful information and are they backed up with hard data? Or do they just increase the fluency of the story?
And do you trust someone simply because their accent feels familiar and is easy to understand?
overwhelming evidence shows that many people pass through university without learning to apply these skills to their daily life.37 And the over-confidence bias shows that it’s the people who think they are already immune who are probably most at risk.
start out by looking at relatively uncontroversial issues (like the flesh-eating bananas) to learn the basics of sceptical thought, before moving on to more deeply embedded beliefs (like climate change) that may be harder for you to question.