Tim Harford's Blog, page 43
June 1, 2021
What magic teaches us about misinformation
“The things right in front of us are often the hardest to see,” declares Apollo Robbins, the world’s most famous theatrical pickpocket. “The things you look at every day, that you’re blinded to.”
As he says these words, he’s standing on stage at a TED conference in 2013. He invites the audience to close their eyes, then to try to recall what he’s wearing. It’s not easy. We imagine that we would have filed all those details away, after a couple of minutes of looking at him speaking. And indeed we could have done. But we didn’t. When we open our eyes we see he’s wearing a dark waistcoat and jacket, a striped tie and a dark-purple shirt.
Robbins ambles into the audience, finding a volunteer — Joe — and leading him on stage. For the next three minutes, Robbins proceeds to bewilder Joe. He announces that he’s trying to steal Joe’s watch, but then asks Joe to check his pockets. In that instant of distraction, the watch is gone. It reappears a moment later on Robbins’s wrist. Robbins’s larcenous skills are legendary — he once stole actress Jennifer Garner’s engagement ring, and the badges of Jimmy Carter’s secret service bodyguards. Poor Joe didn’t stand a chance.
But it is the final flourish of this talk that is most intriguing. After sending Joe back to the audience, Robbins asks everyone, this time keeping their eyes open, what he is wearing. He has been in plain view of a thousand people the whole time — quite literally in the spotlight. And yet somehow the shirt is now pale and checked, not plain and dark. The tie and waistcoat have gone. As he says: often the hardest things to see are right in front of us.
It’s difficult for any of us not to be fascinated by Robbins’s skill and particularly by that final act of stagecraft. But for me, after more than a decade dabbling in the field of fact-checking and fighting misinformation, there was an important truth in the disappearance of the waistcoat: we pay less attention than we think.
Why do people — and by “people” I mean “you and I” — accept and spread misinformation? The two obvious explanations are both disheartening. The first is that we are incapable of telling the difference between truth and lies. In this view, politicians and other opinion-formers are such skilled deceivers that we are helpless, or the issues are so complex that they defy understanding, or we lack basic numeracy and critical-thinking skills. The second explanation is that we know the difference and we don’t care. In order to stick close to our political tribe, we reach the conclusions we want to reach.
There is truth in both these explanations. But is there a third account of how we think about the claims we see in the news and on social media — an account that, ironically, has received far too little attention? That account centres on attention itself: it suggests that we fail to distinguish truth from lies not because we can’t and not because we won’t, but because — as with Robbins’s waistcoat — we are simply not giving the matter our focus.
What makes the problem worse is our intuitive overconfidence that we will notice what matters, even if we don’t focus closely. If so, the most insidious and underrated problem in our information ecosystem is that we do not give the right kind of attention to the right things at the right time. We are not paying enough attention to what holds our attention.
The art of stage magic allows us to approach this idea from an unusual angle: Gustav Kuhn’s recent book, Experiencing the Impossible, discusses the psychology of magic tricks.
“All magic can be explained through misdirection alone,” writes Kuhn, a psychologist who runs the Magic Lab at Goldsmiths, University of London. Such a strong claim is debatable, but what is beyond debate is that the control and manipulation of attention are central to stage magic. They are also central to understanding misinformation. The Venn diagram of misinformation, misdirection and magic has overlaps with which to conjure.
Consider the following headline, a false claim that circulated online in 2018: “President Trump Readies Deportation of Melania After Huge Fight At White House”.
It was among 36 headlines which were shown to a thousand experimental participants in a study conducted by psychologists Gordon Pennycook, Ziv Epstein, Mohsen Mosleh and others, and published recently in the scientific journal Nature. Half of the headlines were true and half false, some favouring narratives from the political right and some from the left. Some participants were asked which headlines they would consider sharing on social media.
Others were asked instead which headlines were accurate, unbiased descriptions of real events. Recall the two leading explanations of why people spread misinformation: first, that they aren’t capable of distinguishing between truth and lies; second, that for partisan reasons they don’t want to.
In this experiment, most people had no trouble distinguishing truth from lies: false headlines were usually spotted and the true headlines were very likely to be identified as such, even when they clashed with a participant’s political preconceptions. A committed Democrat might savour the idea that Donald Trump was about to deport his own wife, but nevertheless both Republicans and Democrats had no trouble figuring out that the headline was implausible.
When asked to sift truth from lies, participants did just that. But when asked instead which headlines they would consider sharing, people suddenly seemed blind to the difference between truth and lies: they happily shared the headlines that fit with their political sympathies, with false headlines scarcely penalised relative to the truth.
Does this mean that people knowingly spread false information? No doubt some do, but Pennycook and his colleagues think this is not typical. When the participants were asked what they valued “when deciding whether to share a piece of content on social media”, the most popular answer was overwhelmingly clear. Not surprise, not political alignment, not humour. It was that whatever they shared, it should be true.
A puzzle, then: people share material based on political tribalism rather than truth, despite being able to distinguish truth from lies; yet people say that they value accuracy above all else when deciding whether to share. What explains the apparent contradiction?
“People are lazy,” says Pennycook, a psychologist at the University of Regina. “People don’t engage.”
Is it that simple? We don’t like to think of ourselves as lazy and disengaged. But Pennycook’s research yields clues pointing in that direction.
“If you force people to give intuitive responses, responses that they aren’t permitted to really think that much about, it makes people worse at recognising false content,” explains Pennycook. A study he published jointly with David Rand of MIT, titled “Lazy, not biased”, found that the ability to pick out fake news headlines from real ones was correlated with performance on a “cognitive reflection test”, which measures people’s tendency to stop and think, suppressing a knee-jerk response.
This suggests that we share fake news not because of malice or ineptitude, but because of impulse and inattention. It is not so different from Robbins’s disappearing waistcoat and tie. Can people spot the difference between a man in formal attire and one with an untucked, open-necked shirt? Of course we can, just as we can spot the difference between real news and fake news. But only if we pay attention.
In their 1999 book Magic in Theory, Peter Lamont and Richard Wiseman explore the links between psychology and magic. Wiseman, a professor of psychology at the University of Hertfordshire, warns against drawing too close a link between stage magic and the everyday misdirection we experience in the media and social media ecosystem. For him, a truly effective stage illusion requires a combination of specialised methods. One cannot simply rely on a broad psychological tendency.
Wiseman is fascinated by “change blindness”, which includes our tendency to overlook the disappearance of Robbins’s waistcoat and tie. (Wiseman pioneered his own version of the stunt.) But change blindness only goes so far; not everyone will overlook the change.
“If you are a magician and half the audience notices what you’re up to,” says Wiseman, “then you’re having a bad day.”
Yet if your aim is to get people to remember a political talking point, or to share a video on social media, then you merely need to fool some of the people some of the time. Crude misdirection can work, and the approach has a name in political communications: the “dead cat strategy”. If a dinner party conversation turns awkward, simply toss a dead cat on to the table. People will be outraged but you will succeed in changing the subject. Trump had an unrivalled gift for producing a dead cat whenever he wanted to.
As Boris Johnson faces damaging accusations of accepting large undeclared donations to pay for a lavish refurbishment of his Downing Street flat, one cannot help but wonder about recent leaks claiming that the prime minister had said “let the bodies pile high in their thousands”. Another dead cat on the dinner table?
Magicians have a rather more pleasing approach. Lamont and Wiseman note in Magic in Theory that “a moment of strong natural misdirection occurs when a dove is produced and is allowed to fly upwards. All eyes naturally follow the flight of the dove.”
“At that point,” one magician told Lamont and Wiseman, “you can do anything you want.” Dead cat or white dove, either attracts our attention. And when we are focused on the distraction, the real tricks can begin.
Watching Robbins at work, one is struck by his shamelessness: he announces that he is a pickpocket, and then proceeds to invade the personal space of his chosen victim, fiddling with lapels, touching shoulders and wrists, and patting pockets. It’s clear that he’s up to something, but wherever you look, the larceny is occurring somewhere else.
Among those who study misinformation, these tactics have a parallel: the “firehose of falsehood”. The firehose strategy is simple: barrage ordinary citizens with a stream of lies, inducing a state of learnt helplessness where people shrug and assume nothing is true. The lies don’t need to make sense. What matters is the volume — enough to overwhelm the capabilities of fact-checkers, enough to consume the oxygen of the news cycle. People know you’re lying, but there are so many eye-catching lies that it feels pointless to try to sift for the truth.
The firehose of falsehood was perfected by 21st-century Russian propagandists, but also seemed to characterise the behaviour of the Trump administration, which would lie about anything, no matter how inconsequential or easily disproved — from the size of the crowd at Trump’s inauguration (underwhelming, but who cares?) to whether he won the popular vote in 2016 (no, although in the US electoral system the answer is irrelevant) to whether the 2020 election apparatus in Georgia was run by Democrats (anyone can verify that the secretary of state Brad Raffensperger is a life-long Republican).
I cannot help but be reminded of Robbins. He isn’t trying to escape suspicion: instead, he overwhelms your senses with so many questionable pokes and pinches that you simply cannot see the moment he lifts your watch and straps it on his own wrist.
The silent half of Penn and Teller is not so silent when it comes to the theory of magic. In a piece for Smithsonian magazine, Teller explained the power of letting people leap to their own false conclusions. For example, the early 20th-century magician David P Abbott used to make a golden ball float around his parlour for guests, using an unseen thread to support the ball.
The real magic came when Abbott would wander off to fix drinks, leaving the ball behind. Guests would scurry across to examine it, and discover to their astonishment that the ball was much heavier than it looked. The real trick was not only to plausibly disguise the thread; it was to swap the lightweight ball for the hefty duplicate.
“When a magician lets you notice something on your own,” writes Teller, “his lie becomes impenetrable.”
I have often seen the same tendency in the way we interpret information and misinformation based on data definitions that seem intuitive but aren’t. We observe a statistical trend or a policy pledge, and then we leap to conclusions that turn out to be quite mistaken. Why? Because the trend or the pledge is based on an underlying definition we had misunderstood. In my book, How To Make The World Add Up, I call this ill-fated leap “premature enumeration”.
For example, early in 2020, the UK’s home secretary, Priti Patel, defended her plans to restrict “unskilled immigration” by saying that instead UK employers would be able to recruit “economically inactive” UK residents. That all sounds rather progressive, until you realise that “economically inactive” is a definition that includes students and people who are chronically sick — and “unskilled immigration” typically means “paid less than £25,600 a year”, a category that happens to include early-career radiographers, physiotherapists and paramedics. We approve of reducing unskilled immigration and employing economically inactive people, as long as we never realise that means banning the immigration of medics and hoping students will step up to do the job instead. Or, to quote Teller, “Nothing fools you better than the lie you tell yourself.”
There is hope. Where our actions are based on reflex, a nudge towards making an active choice can make a difference. Alice Pailhès studies the psychology of stage magic with Kuhn at Goldsmiths. One of her experiments examines a “positional force”, in which the magician lays four cards in a line on the table and invites the subject to pick a card. It’s well-known that people tend to gravitate to the third card if right-handed, and the second card if left-handed, plausibly because these are simply the most convenient options. In the experiment, Pailhès sometimes says, “Push a card towards me”, and sometimes, “Choose a card and then push it towards me”, more explicitly framing it as a decision.
That subtle distinction makes a big difference. The first instruction induces 60 per cent of people to pick the expected target out of the four cards. The second instruction, with the faintest hint of encouragement to actively decide, causes the forcing technique to collapse: only 36 per cent of people choose the target card.
Could a similarly subtle reframing work to combat misinformation? Consider the subjects studied by the team including Pennycook, Epstein and Mosleh. Remember that those subjects displayed a puzzling contradiction: they were well able to distinguish fake news from true headlines, they said that they valued truth above everything else when considering what to share, and yet they were nearly as likely to share lies as true claims.
It does not take much to change this. In one follow-up study, the researchers primed people’s attention by asking them to rate the truth of a headline. After this priming question, people were substantially less likely to share false headlines than a control group shown the same headlines. People care about the truth, and they can discern the difference between truth and lies — but they do not always think about the truth. Invite them to focus on truth, just for a moment, and they start living up to their professed beliefs. They start paying attention to what is true.
Just as with Pailhès’s subtle prompt to make an active choice of card, when Pennycook and colleagues subtly prompted people to focus on truth, the difference was stark. That difference is observable not just in a survey about hypothetical behaviour but in the wilds of social media. Pennycook’s team sent a direct message to more than 5,000 Twitter users who had recently shared information from highly partisan websites. The message simply showed people a non-political headline and asked for their opinion as to whether the headline was true or not. This message primed people to think about accuracy. Different users received the message on different days, but in the 24 hours after receiving the message, users were more likely to share headlines from serious news sources such as The New York Times and less likely to share content from Breitbart and The Daily Caller. They were also more likely to practise what is sometimes called “engaged sharing” — adding comments rather than simply pressing a button to retweet.
A lone prompt to ponder whether a single headline was true then influenced what people shared all day. It is a striking demonstration that sometimes what we need is not more facts, more numeracy and less partisanship, desirable though all that might be. Sometimes what we need is to pay more attention to the truth.
Paying attention is not so hard, but we first need to realise that there is a problem. And the overarching lesson of the psychology of misdirection is this: we are blind to our own blindness. The psychologists Lars Hall and Petter Johansson of Lund University and their research team collaborated with professional magicians to devise an intriguing experiment. Hall and Johansson would repeatedly show research subjects a pair of portrait photographs and ask them which of the two faces they found more attractive. They would then hand over the photograph and ask the subjects to explain why. Participants gazed again at the photographs and had no trouble justifying their choices:
“I like his smile.”
“I’m a photographer, I like the way she’s lit.”
“I don’t know, looks a little bit like a Hobbit.”
All plausible reasons, but Johansson would often use sleight of hand to swap the photographs. Experimental subjects did not detect the swapping method, and rarely noticed that they were now gazing at the very face they had rejected just seconds previously. The justifications they used were indistinguishable from those they used when no swap had taken place.
Hall and Johansson repeated the trick with policy questions: they quizzed people about their voting intentions and asked them to place crosses indicating their positions on 12 different policy questions from “strongly opposed” to “strongly in favour”. Using an elegant bit of trickery, the researchers flipped the responses, showing people the exact reverse of their choices. While occasionally nonplussed, respondents usually did not notice that anything was amiss and produced plausible justifications for whatever they had “chosen”. (This technique turns out to be quite effective at shifting voting intentions, too.)
It is a remarkable finding: we will argue fluently in favour of a policy position that we did not hold simply because a conjuring trick has persuaded us that we did hold it. And as with Robbins’s waistcoat, the surprise is not just that we do not notice. It is that we are so certain that we would.
We retweet misinformation because we don’t think for long enough to see that it is misinformation. We obsess over bold lies, not realising that their entire purpose is to obsess us. We see one thing and assume it is another, even though we are only deceiving ourselves. We will argue in favour of policies that we opposed seconds ago, as long as we can be distracted long enough to flip our political identities in a mirror.
And behind all this is the grand meta-error: we have no intuitive sense that our minds work like this. We fondly imagine ourselves to be sharper, more attentive and more consistent than we truly are. Our own brains conspire in the illusion, filling the vast blind spots with plausible images.
It all seems relentlessly depressing, but there is plenty of hope in this account of why we fall for misinformation. It turns out that we can tell the difference between truth and lies, and that our political opinions are less stubbornly tribal than one might think. But we need to slow down and pay attention.
If Teller decides to slip a lemon under a cup during a cups and balls routine, or Robbins decides to remove your watch, you don’t have much of a chance. These professional performers are too skilled; their methods are too well-honed.
But if you decide to think carefully about the headlines, or the data visualisations that adorn news websites, or the eye-catching statistics that circulate on social media, you may be surprised: statistics aren’t actually stage magic. Many of them are telling us important truths about the world, and those that are lies are usually lies that we can spot without too much trouble.
Pay attention; get some context; ask questions; stop and think. Misinformation doesn’t thrive because we can’t spot the tricks. It thrives because, all too often, we don’t try. We don’t try, because we are confident that we already did.
Written for and first published in the Financial Times Magazine on 8/9 May 2021.
The paperback of “How To Make The World Add Up” is now out. US title: “The Data Detective”.
“Nobody makes the statistics of everyday life more fascinating and enjoyable than Tim Harford.” – Bill Bryson
“This entertaining, engrossing book about the power of numbers, logic and genuine curiosity” – Maria Konnikova
I’ve set up a storefront on Bookshop in the United States and the United Kingdom – have a look and see all my recommendations; Bookshop is set up to support local independent retailers.
May 30, 2021
AnthroVision, more bestselleration, and blockchain scenarios
AnthroVision I’m interviewing Gillian Tett about her excellent new book AnthroVision – details of the event here. Self-recommending for anyone who knows Gillian’s work – she sets out a persuasive case for the value of anthropology in solving modern-day problems of business and beyond.
Chain Reaction My super-smart friend and former colleague Paul Domjan is one of the authors of a new book, Chain Reaction, exploring different scenarios for blockchain technologies in the developing world. You couldn’t ask for a better guide, and the scenario method is perfect for exploring this sort of field, where the range of possibilities is very wide.
Bestsellery I was chuffed a couple of weeks ago to see How To Make The World Add Up on the Sunday Times bestseller list; it turns out it is also on the Times bestseller list, which is different (I understand it relies on data from Waterstones and does not include sales through Amazon).
I am delighted, in any case – and if you are looking for signed copies they’re available at Blackwell’s in Oxford (who may be persuaded to sell a signed copy by mail order) or from MathsGear.
May 27, 2021
Cautionary Tales – Do NOT pass GO!
(Self promotion: the paperback of How To Make the World Add Up is now out worldwide (except North America). Please consider an early order, which is disproportionately helpful in winning interest and support for the book. Thank you!)
The woman who was key to the creation of Monopoly was denied her share of the credit and the profits.
Lizzie J. Magie (played by Helena Bonham Carter) should be celebrated as the inventor of what would become Monopoly – but her role in creating the smash hit board game was cynically ignored, even though she had a patent.
Discrimination has marred the careers of many inventors and shut others out from the innovation economy entirely. Could crediting forgotten figures such as Lizzie Magie help address continuing disparities in the patenting of new inventions?
Cautionary Tales is written by me, Tim Harford, with Andrew Wright. It is produced by Ryan Dilley and Marilyn Rust.
The sound design and original music is the work of Pascal Wyse. Julia Barton edited the scripts.
Thanks to the team at Pushkin Industries, Mia Lobel, Jacob Weisberg, Heather Fain, Jon Schnaars, Carly Migliori, Eric Sandler, Emily Rostek, Maggie Taylor, Daniella Lakhan and Maya Koenig.
Further reading and listening
The essential book on the secret history of Monopoly is Mary Pilon’s The Monopolists, supplemented by Christopher Ketcham’s article “Monopoly is Theft” and Mary Pilon’s feature in Smithsonian magazine.
Lisa Cook’s articles include “Unequal Opportunity: The Innovation Gap in Pink and Black”, in Wisnioski, Hintz, and Stettler Kleine, eds., Does America Need More Innovators?, MIT Press, 2019; “Overcoming Discrimination by Consumers during the Age of Segregation: The Example of Garrett Morgan”, The Business History Review, vol. 86, no. 2, 2012; and “Violence and economic activity: evidence from African American patents, 1870–1940”, Journal of Economic Growth, vol. 19(2), pages 221–257, June 2014.
For more on Raj Chetty and the team at Opportunity Insights, see Alex Bell, Raj Chetty, Xavier Jaravel, Neviana Petkova and John van Reenen, “Who Becomes an Inventor in America? The Importance of Exposure to Innovation”, and “American Inventors”, The Economist, 4 December 2017.
Point of order! The use and abuse of debating
Back in 1992, my school friend Daniel Saxby and I stood in London in front of an audience of worthies and a panel of celebrity judges including health minister Virginia Bottomley and Tory grandee Quintin Hogg, Lord Hailsham. We battled an opposing team, debating the motion, “This house would make Scotland independent”.
By the end of the evening, we had emerged triumphant — we were the best young debating pair in the UK, proud winners of the Observer Schools’ Mace. Why have I never boasted of this before? The truth is that I fell out of love with debating.
That is despite the fact that there is much to admire about competitive debating, once it is separated from the culture of upper-class schoolboys and distinguished from the mud-wrestling of presidential “debates”.
In a formal debate, both sides of the argument get equal time. Each speaker has their say. Regardless of age, race, gender or status, they are protected by the chairperson and by the rules.
It was only when I left school, and the debating scene, that I realised how quaint and unusual this level playing field really was. In my first corporate job, I noticed with naive dismay how middle-aged men would routinely talk over young women in meetings. In contrast, in a proper debate, nobody is allowed to talk over anyone. I took for granted the assumption that everyone gets equal time to speak.
There is an obvious appeal to not letting bullies dominate the conversation, but debates uphold another important principle of equality: both sides of the argument are heard. It has become rare to see opposing views presented seriously and in their own words. Instead we pick who to follow on social media, which websites to visit and which TV channels to watch. If we are not careful, all we will see of the opposing viewpoint are quotes carefully selected for mockery.
The advantage of a format in which both sides and all speakers have equal time is not lightly to be dismissed. But debating is a package deal, and along with those merits come some serious downsides.
Competitive debating largely ignores the meta-debate of what motion should be debated. In 2016, the UK electorate was asked whether we wished to leave or remain in the EU. Prime Minister David Cameron argued that the only sensible answer was Remain, but simply asking the question implied that either answer was reasonable.
Or consider climate change. We could debate the motion “This house believes veganism is necessary to meet the threat of climate change”, or “This house believes a carbon tax is sufficient to meet the threat of climate change”, or “This house believes there is no threat from climate change”. Which motion gets attention may be more important than any debate that follows.
Debating also feeds some of our less admirable urges. It sometimes pretends to be a search for the truth, but the real goal is not truth but victory.
Several good books have been published recently on the topic of disagreement. Julia Galef’s The Scout Mindset argues that we can deploy our mental faculties as “soldiers” or as “scouts”. Soldiers aim to defend a position; scouts aim to map out the landscape and discover things. Galef believes, plausibly, that we should spend more time reasoning like scouts, but the very structure of debating requires that we reason like soldiers.
Meanwhile, Ian Leslie’s Conflicted celebrates the virtues of productive disagreement, while admitting that plenty of disagreement is not productive at all. Leslie notes that “disagreements become toxic when they become status battles”. True enough, and a competitive debate is a pure status battle. In an ideal world, our goal is the truth and disagreement is merely a means to that end. In the reality of competitive debating, the truth is irrelevant. Disagreement itself is what matters: someone must lose and someone else must win. That is why so much competitive debating turns on flourishes of rhetorical skill, quick-witted put-downs and playing to the gallery.
That night that Daniel and I were crowned national schools debating champions is still a proud moment. I am less proud to admit that I cannot remember whether we were for or against Scottish independence. The topic hardly mattered, did it? What mattered was that we showed skill and panache, and we won.
The social psychologist Charlan Nemeth, author of No! The Power of Disagreement in a World that Wants to Get Along, argues that authentic dissent punctures groupthink and exposes organisational blind spots. But she suggests that there is little benefit in “playing Devil’s advocate” if everyone knows that the disagreement is just for show.
Authentic dissent, says Nemeth, requires someone to stick their neck out and risk opprobrium. Authentic dissent requires courage, or at least stubbornness. The group benefits from the challenge to its thinking, but the dissenter often pays a price.
So perhaps I should be prouder of an earlier debating performance. I was 10 years old and argued against unilateral nuclear disarmament. My entire primary school class voted against me. So did the teacher. So did my own teammate. As a debater, I failed. But I said what I thought was true, regardless of how unpopular it made me. Not many debaters can claim as much.
Written for and first published in the Financial Times on 30 April 2021.
May 25, 2021
Daniel Kahneman has lunch with the FT
As I wave my plate of paella in front of the webcam, Daniel Kahneman drops the bombshell.
“I have had my lunch.”
Awkward.
A lunch over Zoom was never an especially appetising prospect, and perhaps it was too much to expect Kahneman to play along. He is, after all, 87 years old, a winner of the Nobel Memorial Prize in economics — despite being a psychologist — and, thanks to the success of his 2011 book Thinking, Fast and Slow, vastly more famous than most of his fellow laureates. For the sake of form I ask him to describe the lunch.
“Well, I had sashimi salad and shumai from a restaurant, and to be absolutely complete and precise, I had a baked apple which I baked myself.”
He raises his chin in defiance, then smiles impishly. “And that was my lunch. It was fine. Not exceptional, but it was fine.”
I set my paella to one side; I am somewhat relieved. A webcam lunch promised to be as cognitively taxing as the time the FT asked me to interview the Freakonomics co-author Steven Levitt while losing to him at poker, or teach Kahneman’s biographer Michael Lewis a board game while interviewing him about The Big Short.
Kahneman turns the tables on me and asks if I enjoy the challenge of juggling such tasks with an interview. Suddenly there is a slight hint of the psychotherapist about him, reinforced by his charmingly gentle Israeli accent, which is distinct even after decades living in North America.
Kahneman’s life and career defy summary. He was born while his mother was visiting Tel Aviv in 1934; his family lived in Paris and, as Jews, spent the war on the run and survived several close brushes with the Third Reich. His father died in 1944, of natural causes, and the family moved to Jerusalem in 1946.
Kahneman trained as a psychologist and became, in Lewis’s words, “a spectacularly original connoisseur of human error”. He formed an intense, productive and tempestuous intellectual partnership with Amos Tversky. The two men worked on judgment, decision-making and risk, laying the foundations for what became known as behavioural economics — and for Kahneman’s Nobel in 2002, after Tversky’s early death.
His fame has only grown since then, partly as behavioural economics has become fashionable, partly because his own work moved into another popular field, the psychology of wellbeing, and largely as the result of the blockbuster success of Thinking, Fast and Slow.
I tell him that, courtesy of that book, I have already had a memorably wonderful meal in his company. A few years ago, while travelling on business, I found myself a table for one in a Munich beer hall. I ate too many sausages and too much potato salad, drank a couple of excellent beers, and all while reading Thinking, Fast and Slow. It was one of those moments when you find yourself making a mental note that you are having a wonderful time. Kahneman looks delighted. “I take that as a compliment, that I added to the food.”
That book described the mind’s “System 1” and “System 2”. System 1 is intuitive, effortless, while System 2 requires conscious, deliberate and effortful calculation. I ask why the book was such a publishing phenomenon. Kahneman gives credit to his editor and also to the way the argument of the book was framed early on, with the two types of thinking described as though they were tiny decision-making agents inside each person’s brain.
“This is appealing in two ways. It corresponds to an experience people have, that some thoughts happen to them and other thoughts they produce. And the idea of presenting it as agents.”
Kahneman admits that this presentation violates the traditions of academic psychology: you’re not supposed to appeal to homunculi inside the brain. It is just a metaphor, but some professional psychologists hate it. “And the people who don’t like it feel free to despise it.”
But mostly Kahneman credits chance for the success of Thinking, Fast and Slow. Sometimes a book catches on and its popularity becomes a self-reinforcing loop. Things could easily have turned out differently. It is a modest claim, but it does neatly lead us to his new book, Noise, written with Olivier Sibony, a business school professor and former McKinsey partner, and Cass Sunstein, who is a law professor, co-author of Nudge and has been an official in both the Obama and Biden administrations.
Kahneman has spent much of his life studying bias in decision-making, but noise is the other source of error. If you imagine firing arrows at a paper target, bias would be a systematic tendency for the arrows to land (say) below the bullseye. Noise would be a tendency for the arrows to err in any direction, purely at random. In some ways, noise is easier to detect. You can measure it from the back of the target, without knowing where the bullseye is. And yet noise is often overlooked.
From the viewpoint of a social scientist, this oversight is understandable. Bias feels like the thing to observe, while noise is the fog obscuring the view. Experimental methods are designed to remove noise to allow bias to be measured more clearly. But noise is not merely an obstacle to scientific inquiry: it has real-world effects too.
Kahneman and his colleagues point to insurance underwriters, judges, child-custody case managers, recruiters, patent examiners and forensic scientists, all of whom act in a way that varies from one professional to another, and between different situations, effectively at random. It is not a problem to be assumed away. So why do we pay so little attention to noise and so much attention to bias?
The problem, says Kahneman, is that we think causally, about individual cases. You can observe bias in an individual case, but to observe noise one must measure — or at least imagine — multiple cases playing out in different ways. Can we reduce noise, I ask? Certainly.
“There is a medication for noise. I mean, if you average many judgments, noise will go down . . . if the judgments are independent, that’s guaranteed.” But getting a second and a third opinion is expensive. And we often unconsciously suppress evidence of noise. “People prefer their sources of information to be highly correlated. Then all the messages you get are consistent with each other and you’re comfortable.”
For example, when professors mark student exam booklets, they will often let the student’s performance on the first essay influence their judgment of the second and the third. Kahneman advises a different procedure.
“Read one question across all booklets, and you write the grade at the back of the booklet so you will not see it when you read the second question,” he explains. “The very striking thing is how less happy you are when you do it right.” To assess each essay separately gives a fairer view of the student’s overall performance — but the inconsistencies in the marks are disconcerting.
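Kahneman’s “guaranteed” is simply the arithmetic of independent errors: averaging leaves any shared bias untouched, but it shrinks the random scatter roughly in proportion to the square root of the number of judgments pooled. Here is a minimal Python sketch (my own illustration, with arbitrary numbers, not anything from Noise) of that point:

import random
import statistics

def judgment(true_value, bias, noise_sd):
    # One professional's estimate: the truth, shifted by a shared bias,
    # plus independent random noise.
    return true_value + bias + random.gauss(0, noise_sd)

def pooled_judgment(true_value, bias, noise_sd, n):
    # Average n independent judgments of the same case.
    return statistics.mean(judgment(true_value, bias, noise_sd) for _ in range(n))

random.seed(0)
TRUE_VALUE, BIAS, NOISE_SD = 100.0, 5.0, 20.0

for n in (1, 4, 16):
    errors = [pooled_judgment(TRUE_VALUE, BIAS, NOISE_SD, n) - TRUE_VALUE
              for _ in range(10_000)]
    # Bias is the average error; noise is the spread of errors around it.
    print(f"n={n:2d}  bias={statistics.mean(errors):6.2f}  noise={statistics.stdev(errors):6.2f}")

The bias column stays near 5 however many judgments are averaged, while the noise column falls from about 20 towards 5. And, true to the archery metaphor, the noise figure could equally be computed from the judgments alone, without knowing the true value at all.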
I had heard tales of how the process of writing Thinking, Fast and Slow had been agonising and interminable, so it amused me to think of Kahneman in collaboration with the absurdly prolific Sunstein, who has written nearly 50 books.
“I have a vision of the tortoise and the hare being handcuffed together,” I tell Kahneman. How did it work out? “My first thought was that Cass would just write the book,” he says. He glances sideways and smiles broadly. “But then it turned out that I couldn’t let Cass write the book, because I’m too slow. We worked out another way of collaborating.”
“Between the three of us there is a lot of sympathy,” he says. “And so, we had fun together. It was fun to work on that book, whereas writing Thinking, Fast and Slow was a very lonely experience.”
Kahneman was widowed when the psychologist Anne Treisman died three years ago, but he does not now seem to be a lonely man. It is hard to tell over Zoom, over the 3,000 miles from Oxford to New York, but the character in front of me on the screen seems very different from the man described in Lewis’s The Undoing Project, who was depressive, insecure and needy. Lewis portrays Kahneman’s celebrated collaboration with Tversky as a true intellectual love affair, full of tantrums, envy and passionate reconciliations. The stormy relationship grew distant, then abruptly ended with Tversky’s death from cancer in 1996.
Tversky is the Lennon to Kahneman’s McCartney: the relationship between them often seemed bigger than either man alone. But Tversky was not Kahneman’s only academic partner. “Everything I’ve done, actually, has been collaborative.”
It’s true. I’m most familiar with his work adjacent to economics, including with Richard Thaler and Angus Deaton; each subsequently won a Nobel Memorial Prize. I tell him he has good taste in economists. “Alan Krueger was among the best of them.”
I was going to ask him about Krueger. He was a hugely respected economist who worked with Kahneman as part of what Kahneman calls “the dream team”, studying wellbeing, happiness, misery and pain. He killed himself two years ago, a loss that shook the profession. It is hard not to wonder how Krueger’s interest in happiness and misery connects with the tragedy.
“I have no idea,” says Kahneman, shaking his head. “I don’t think being intellectually interested in wellbeing has much impact on personal wellbeing one way or another.”
Still, Kahneman does now seem happy, and full of energy. I met him a decade ago, in London, and he was in considerable discomfort with a bad back. If anything he seems younger now. Covid-19 has not depressed his spirits. “I was never afraid. It just was not a worry. And then working on the book turned out to be more pleasant. So I had a good experience, I had a good year.”
Sibony and Kahneman had been flying between New York and Paris to meet each other for days of focused work on the book. “But when Covid hit we switched to Zoom for an hour or two a day and that was immensely more productive. We might not have finished the book if it hadn’t been for the virus.”
I ask him for Zoom productivity tips, but he bats that away: he has no Zoom advice to offer.
I press him instead for a view on vaccines and the risk of blood clots, as someone who has spent a lifetime thinking about how humans respond to small risks. Do vaccine side effects loom too large in our thinking?
“It’s a standard example, I think, of a very general feature of how people think. This is a distinction between what is normal and natural, and what is artificial and human-made. The asymmetries are enormous.” Another example, he says, is the self-driving car, which will have to be vastly safer than human drivers. “This is almost without limit. They have to be so close to perfection.”
As do vaccines, I offer.
“Yes. This idea of somebody dying from a vaccine is really almost intolerable. The idea of somebody dying from a disease . . . ” he shrugs, and raises a single eyebrow. “That’s natural. That’s the world.”
Thinking, Fast and Slow and Noise have been packaged to appear as siblings, but there is an obvious difference: Thinking, Fast and Slow was based on a lifetime of research and Noise is not. I had been wondering how to raise this, but do not need to.
“You know, the book Noise is premature,” offers Kahneman. “If I had been 20 years earlier, that’s not what I would have done. Having identified the problem of noise I would have started the research programme on noise, and given talks about it, and thought about it and written articles about it. But I started this thing very late. I started it in my eighties and you just don’t have time. This, in a serious sense, is a book that came too early. And it shows.”
He delivers this downbeat assessment in an upbeat manner. He seems cheerful at the prospect that better books on the subject will be written in the future, happy to have made a contribution while he could, and accepting of the fact that, at 87, his story cannot continue for ever.
I admit that Kahneman now feels like something of a guru to me, or more precisely a dispenser of practical wisdom, someone to whom I can come for enlightenment. Just the other day, I was walking in the park with my daughter, trying to help her think through a difficult decision, and I found myself asking what Kahneman would say about the problem. He screws up his face at the idea.
“It’s not how I feel. I really don’t feel like a guru.” But I press him on the matter. I once interviewed Gary Becker, another Nobel laureate and a hugely influential economist. But I don’t carry Becker around in my head to help me think through tough decisions.
“I find this very flattering and quite surprising, because subjectively many of those things that I say look pretty obvious and trite, and I hope I’m not boring people. I’m flattered, thank you. And embarrassed.”
We’ve been talking for nearly an hour and a half. The afternoon is well advanced in Manhattan; in Oxford, the sun is going down. But the conversation has no natural end. There is no espresso, no mints, no bill. (I press Kahneman to send us the receipt, but I know full well that he will not.) So we look forward to a time when we might meet face to face once again.
Kahneman once wrote: “the optimism bias may well be the most significant of the cognitive biases.” Still: despite the pandemic, the distance, his age and his busy schedule, sharing a proper lunch with him is easy to envisage. But then, my System 1 always was an optimistic fellow.
Written for and first published in the Financial Times on 7 May 2021.
May 23, 2021
More or Less returns, dataviz, more prizes, economic growth and a talk
More or Less returns on Radio 4 on Wednesday at 9am BST – do tune in, or subscribe to the podcast. If you have questions or comments drop us an email at moreorless@bbc.co.uk.
Still time to buy the Sunday Times bestseller How To Make The World Add Up – and a reminder that in the US and Canada, the same book is called The Data Detective.

The Big Picture. If you’re looking to understand how to create effective data visualisations, I recommend Steve Wexler’s The Big Picture.
More Prizes! After last week’s award from the Voice of the Listener and Viewer, I was doubly honoured by the Wincott Foundation: I was nominated for Data Journalist of the Year (an award deservedly won by Ed Conway, with the wonderful FT dataviz team also commended), and I was floored when the Foundation named me Journalist of the Year.
Speech at How The Light Gets In. I’m speaking on Saturday morning, virtually – details here.
Growth. A lovely and thoughtful essay about economic growth and its measurement, by Max Roser of Our World in Data.
May 20, 2021
Cautionary Tales – Wrong Tools Cost Lives
(Self promotion: the paperback of How To Make the World Add Up is now out worldwide (except North America). Please consider an early order, which is disproportionately helpful in winning interest and support for the book. Thank you!)
Microsoft Excel is great for business accounts… but maybe don’t use it to track a deadly disease.
The British Government promised to create a “world-beating” system to track deadly Covid-19 infections – but it included an outdated version of the off-the-shelf spreadsheet software Microsoft Excel. The result was disastrous.
When under pressure or lacking in expertise we can all be tempted to use a tool unsuitable for the job in hand. But whether fitting shelves or trying to halt a pandemic, we need to accept that cutting corners comes at a cost.
Cautionary Tales is written by me, Tim Harford, with Andrew Wright. It is produced by Ryan Dilley and Marilyn Rust.
The sound design and original music is the work of Pascal Wyse. Julia Barton edited the scripts.
Thanks to the team at Pushkin Industries, Mia Lobel, Jacob Weisberg, Heather Fain, Jon Schnaars, Carly Migliori, Eric Sandler, Emily Rostek, Maggie Taylor, Daniella Lakhan and Maya Koenig.
Further reading and listening
On the history of accounting, our sources include Iris Origo’s The Merchant of Prato, Jane Gleeson-White’s Double Entry, and my own Fifty Things That Made The Modern Economy.
On digital spreadsheets, listen to Planet Money and read Steven Levy’s “A Spreadsheet Way of Knowledge”.
On spreadsheet errors, Matt Parker’s “When Spreadsheets Attack”, Felienne Hermans’ “Enron’s Spreadsheets and Related Emails”, and “Spreadsheets May Be For You”, voiced by Gemma Arrowsmith.
On Smallpox, key sources include Bill Foege’s House on Fire and Charles Kenny’s The Plague Cycle.
On the Public Health England error, my thanks to members of the European Spreadsheet Risks Group (EUSPRIG). Useful contemporary reports by Leo Kelion of the BBC and Ed Conway of Sky News are worth reading.
Thiemo Fetzer and Thomas Graeber, “Does Contact Tracing Work? Quasi-Experimental Evidence from an Excel Error in England”.
Bill Gates was speaking to me in an interview for How To Vaccinate The World (BBC Radio).
Why bad times call for good data
Watching the Ever Given wedge itself across the Suez Canal, it would have taken a heart of stone not to laugh. But it was yet another unpleasant reminder that the unseen gears in our global economy can all too easily grind or stick. From the shutdown of Texas’s plastic polymer manufacturing to a threat to vaccine production from a shortage of giant plastic bags, we keep finding out the hard way that modern life relies on weak links in surprising places.
So where else is infrastructure fragile and taken for granted? I worry about statistical infrastructure — the standards and systems we rely on to collect, store and analyse our data. Statistical infrastructure sounds less important than a bridge or a power line, but it can mean the difference between life and death for millions.
Consider Recovery (Randomised Evaluations of Covid-19 Therapy). Set up in a matter of days by two Oxford academics, Martin Landray and Peter Horby, over the past year Recovery has enlisted hospitals across the UK to run randomised trials of treatments such as the antimalarial drug hydroxychloroquine and the cheap steroid dexamethasone. With minimal expense and paperwork, it turned the guesses of physicians into simple but rigorous clinical trials. The project quickly found that dexamethasone was highly effective as a treatment for severe Covid-19, thereby saving a million lives.
Recovery relied on data accumulated as hospitals treated patients and updated their records. It wasn’t always easy to reconcile the different sources — some patients were dead according to one database and alive on another. But such data problems are solvable and were solved.
A modest amount of forethought about collecting the right data in the right way has produced enormous benefits. In the geek community, statistical infrastructure is cool: the latest World Development Report from the World Bank describes the huge potential for data to do good and laments the lost opportunities that result from weak statistical infrastructure in poor countries.
But it isn’t just poor countries that have suffered. In the US, data about Covid-19 testing was collected haphazardly by states. This left the federal government flying blind, unable to see where and how quickly the virus was spreading. Eventually volunteers, led by the journalists Robinson Meyer and Alexis Madrigal of the Covid Tracking Project, put together a serviceable data dashboard.
“We have come to see the government’s initial failure here as the fault on which the entire catastrophe pivots,” wrote Meyer and Madrigal in The Atlantic. They are right.
What is more striking is that the weakness was there in plain sight. Madrigal recently told me that the government’s plan for dealing with a pandemic assumed that good data would be available — but did not build the systems to create them. It is hard to imagine a starker example of taking good statistical infrastructure for granted.
Instead, as with the Ever Given, we only notice the problems afterwards. Back in October, almost 16,000 positive Covid-19 cases disappeared somewhere between testing labs and England’s contact-tracing system. An outdated Excel file format had been used which simply didn’t have enough rows.
The effects of this absurd Excel lapse have been fatal. One estimate, by the economists Thiemo Fetzer and Thomas Graeber, is that it led to 125,000 further cases and 1,500 deaths.
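The limit itself is mundane: a legacy .xls worksheet holds at most 65,536 rows, and rows beyond that point were simply left off. A short, hypothetical Python guard of the kind that would have flagged the truncation (a sketch of my own, not a description of PHE’s actual pipeline) looks like this:

XLS_MAX_ROWS = 65_536  # hard row limit of a legacy .xls worksheet

def check_fits_in_xls(row_count, header_rows=1):
    # Refuse to export if the data would be silently truncated.
    needed = row_count + header_rows
    if needed > XLS_MAX_ROWS:
        raise ValueError(
            f"{needed} rows needed but .xls holds only {XLS_MAX_ROWS}; "
            f"{needed - XLS_MAX_ROWS} rows would be lost. Use .csv or .xlsx instead."
        )

# Illustrative numbers only: if each case spans several rows of lab data,
# the effective cap on cases per file is far below 65,536.
rows_per_case = 4
cases = 20_000
try:
    check_fits_in_xls(cases * rows_per_case)
except ValueError as err:
    print(err)

A guard like that is the spreadsheet equivalent of checking that the canal is wider than the ship.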
Happily, there are some inspiring examples of good practice. OpenSAFELY is a project jointly led by Ben Goldacre, another Oxford academic. The OpenSAFELY tools allow researchers to pose statistical queries of an immensely detailed data set — the medical records of millions of NHS patients in the UK. Researchers using OpenSAFELY never gain direct access to the patient records themselves and, by design, they automatically share their statistical code for others to examine and adapt. Vital medical questions can be answered in a transparent, collaborative fashion without compromising patient privacy.
Such statistical sorcery is possible when the data infrastructure is thoughtfully designed from the beginning to gather that data and make sense of it. Once the rules and systems are in place, the data will follow.
Every previous crisis has provoked a realisation that we lacked the data we needed. The Great Depression prompted governments to gather data about unemployment and national income. The banking crisis of 2007-08 showed regulators that they had far too little information about stresses and vulnerabilities in the financial system. The pandemic should prompt us to improve the data we gather on public health. Governments routinely use labour surveys to understand the economic health of households; they should now do the same with literal health. We could assemble a representative panel of volunteers who agreed to medical check-ups every three months. This would provide invaluable data and, in times of crisis, the volunteers could be approached more frequently, for example, for regular swabs to track the spread of a new virus.
Hindsight is a wonderful thing, of course: the next crisis will no doubt demand timely information about something new. But statistical infrastructure can be built to adapt — and it is a great deal cheaper than digging a second canal from the Red Sea to the Mediterranean.
Written for and first published in the Financial Times on 23 April 2021.
May 17, 2021
Prizes, mindfulness and some very strange dice
Bestsellers. If you bought a copy of How To Make The World Add Up in paperback publication week – thank you, especially since the book was announced as a Sunday Times bestseller yesterday. I am so delighted and so grateful for the support; the Sunday Times bestseller list is often packed with self-help books and celebrity biographies: for a serious book about ideas, it is harder to reach than you might imagine. Thank you again.

Another shout out for Extra Life: I’m interviewing Steven Johnson later today about his new book “Extra Life”. The book is excellent; do drop in (5pm ET Monday 17 May at Politics & Prose, online).
The Data Detective Challenge: The “superforecasting” folks at the Good Judgement Project have launched The Data Detective Challenge, inspired by the US edition of my book. If you fancy yourself a forecaster, take a look and see how you get on. (There’s a video there with my forecasting tips, for what they are worth.)
Mindfulness. This week I picked up the classic “Mindfulness” by the academic psychologist Ellen Langer. It’s SO INTERESTING. Full of counterintuitions and eye-catching experimental results. (To pick just one example: people who disliked football or opera enjoyed the experience more when asked to watch it and note down one, three or six new things they noticed – the more things they noted, the more they enjoyed it.) I also heard about “the Coolidge effect” for the first time… interesting!
Noise. My conversation with Danny Kahneman was a smash hit for the FT Weekend last week. I’m not surprised; Danny was so charming and fascinating. His new book, with Olivier Sibony and Cass Sunstein, is “Noise”. It’s out tomorrow, and it’s full of only-obvious-in-hindsight ideas.
Prizes! I was honoured to be named “Best Radio Contributor 2020” by the Voice of the Listener and Viewer. When you look at previous winners the honour multiplies. I’m very grateful and (although I know everyone says this sort of thing, it’s really true) More or Less is, and How To Vaccinate The World was, very much a team effort. My producers (Sandra Kanthal, Kate Lamble, Charlotte McDonald, Richard Vadon and many others) devote a lot of time and skill to making me look wiser than I have any right to look.
I’ve also been shortlisted for the Wincott Awards both as Journalist of the Year and for Data Journalism of the year. The winners are announced later this week – wish me luck! Given the quality of the competition I’m not optimistic about my chances but I am chuffed to bits to be on the shortlist.
Dice. Who could not love these dice? I particularly liked the d12 that is more like a d10, with a small zero face and a small infinity face.
May 13, 2021
Cautionary Tales – Fritterin’ Away Genius
(Self promotion: the paperback of How To Make the World Add Up is now out worldwide (except North America). Please consider an early order, which is disproportionately helpful in winning interest and support for the book. Thank you!)
Claude Shannon was brilliant. He was the Einstein of computer science… only he loved “fritterin’ away” his time building machines to play chess, solve Rubik’s cubes and beat the house at roulette.
If Shannon had worked more diligently – instead of juggling, riding a unicycle and abandoning project after project – would he have made an even greater contribution to human knowledge? Maybe… and maybe not. Are restlessness and “fritterin'” important parts of a rich and creative life?
Cautionary Tales is written by me, Tim Harford, with Andrew Wright. It is produced by Ryan Dilley and Marilyn Rust.
The sound design and original music is the work of Pascal Wyse. Julia Barton edited the scripts.
Thanks to the team at Pushkin Industries, Mia Lobel, Jacob Weisberg, Heather Fain, Jon Schnaars, Carly Migliori, Eric Sandler, Emily Rostek, Maggie Taylor, Daniella Lakhan and Maya Koenig.
Further reading and listening
Soni and Goodman’s biography of Claude Shannon, A Mind At Play, was an essential source, as was Ed Thorp’s autobiography A Man For All Markets. Other excellent books that touch on the topics and events discussed are Jon Gertner’s The Idea Factory, James Gleick’s The Information, James Owen Weatherall’s The Physics of Wall Street, and Brian Christian’s The Most Human Human.
Bernice Eiduson’s research project is written up as Robert S. Root‐Bernstein, Maurine Bernstein and Helen Garnier, “Identification of Scientists Making Long‐Term, High‐Impact Contributions, with Notes on their Methods of Working”, Creativity Research Journal, 1993.
Nobel prize winners with serious hobbies: Root-Bernstein, R., Allen, L., et al. “Arts foster scientific success: Avocations of Nobel, National Academy, Royal Society, and Sigma Xi members”. Journal of Psychology of Science and Technology, 2008
Doctors with completion bias: KC, Diwas S., Bradley R. Staats, Maryam Kouchaki, and Francesca Gino. “Task Selection and Workload: A Focus on Completing Easy Tasks Hurts Long-Term Performance.” Harvard Business School Working Paper, No. 17-112, June 2017. https://dash.harvard.edu/bitstream/handle/1/33110105/17-112.pdf?sequence=1


