Tim Harford's Blog, page 31

July 28, 2022

Cautionary Tales – South Pole Race: “Mummy, is Amundsen a good man?”

Roald Amundsen beat Captain Scott to the South Pole. The Norwegian – using dog sleds and skis – made it look easy… fun, even. He was heading home to safety, while the British party – hauling sleds by hand – struggled out on the ice. 

In this case, to the victor went a spoiled reputation. The British grumbled that Amundsen had somehow cheated, or had at least behaved in an underhand manner. These stinging accusations would haunt the adventurer until the day he died in the polar wastes.

Cautionary Tales is written by me, Tim Harford, with Andrew Wright. It is produced by Ryan Dilley, with support from Courtney Guarino and Emily Vaughn.

The sound design and original music are the work of Pascal Wyse. Julia Barton edited the scripts.

Thanks to the team at Pushkin Industries, including Mia Lobel, Jacob Weisberg, Heather Fain, Jon Schnaars, Carly Migliori, Eric Sandler, Emily Rostek, Royston Beserve, Maggie Taylor, Nicole Morano, Daniella Lakhan and Maya Koenig.

[Apple] [Spotify] [Stitcher]

Further reading and listening

Stephen Bown The Last Viking

David Crane Scott of the Antarctic

Apsley Cherry-Garrard The Worst Journey in the World

Malcolm Gladwell David and Goliath

Ranulph Fiennes Captain Scott

Roland Huntford The Last Place on Earth

Edward Larson An Empire of Ice

Diana Preston A First Rate Tragedy

Robert Scott The Voyage of the Discovery

“Captain Roald Amundsen and the Society” The Geographical Journal Dec 1927 https://www.jstor.org/stable/1782920

How to save more lives and avoid a privacy apocalypse

In the mid-1990s, the Massachusetts Group Insurance Commission, an insurer of state employees, released to researchers healthcare data describing millions of interactions between patients and the healthcare system. Such records could easily reveal highly sensitive information — psychiatric consultations, sexually transmitted infections, addiction to painkillers, bed-wetting — not to mention the exact timing of each treatment. So, naturally, the GIC removed names, addresses and social security details from the records. Safely anonymised, these could then be used to answer life-saving questions about which treatments worked best and at what cost.

That is not how Latanya Sweeney saw it. Then a graduate student and now a professor at Harvard University, Sweeney noticed that most combinations of gender and date of birth (there are about 60,000 of them) were unique within each broad ZIP code of 25,000 people. The vast majority of people could therefore be uniquely identified by cross-referencing voter records with the anonymised health records. Only one medical record, for example, had the same birth date, gender and ZIP code as the then governor of Massachusetts, William Weld. Sweeney made her point unmistakable by mailing Weld a copy of his own supposedly anonymous medical records.
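
The arithmetic behind Sweeney's observation is easy to check with a rough simulation. The sketch below is my own illustration rather than her actual calculation, and it assumes, for simplicity, that all 60,000 gender-and-birthdate combinations are equally likely:

```python
import random
from collections import Counter

# One broad ZIP code of 25,000 residents, each assigned one of roughly
# 60,000 possible (gender, date-of-birth) combinations. How many
# combinations point to exactly one person?
random.seed(0)
COMBOS = 60_000      # ~2 genders x 366 days x ~82 birth years
RESIDENTS = 25_000   # people sharing one broad ZIP code

people = [random.randrange(COMBOS) for _ in range(RESIDENTS)]
tally = Counter(people)
singletons = sum(1 for n in tally.values() if n == 1)

print(f"{singletons / len(tally):.0%} of occupied combinations are unique")
print(f"{singletons / RESIDENTS:.0%} of residents are uniquely identifiable")
```

Even under this crude uniform assumption, around four-fifths of the combinations that occur at all belong to exactly one person, and roughly two-thirds of residents can be picked out by gender and birth date alone. Add a cross-reference such as the voter roll, and the “anonymous” records quickly acquire names.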

In nerd circles, there are many such stories. Large data sets can be de-anonymised with ease; this fact is as screamingly obvious to data-science professionals as it is surprising to the layman. The more detailed the data, the easier and more consequential de-anonymisation becomes.

But this particular problem has an equal and opposite opportunity: the better the data, the more useful it is for saving lives. Good data can be used to evaluate new treatments, to spot emerging problems in provision, to improve quality and to assess who is most at risk of side effects. Yet seizing this opportunity without unleashing a privacy apocalypse — and a justified backlash from patients — seems impossible.

Not so, says Professor Ben Goldacre, director of Oxford University’s Bennett Institute for Applied Data Science. Goldacre recently led a review into the use of UK healthcare data for research, which proposed a solution. “It’s almost unique,” he told me. “A genuine opportunity to have your cake and eat it.” The British government loves such cakeism, and seems to have embraced Goldacre’s recommendations with gusto.

At the moment, we have the worst of both worlds: researchers struggle to access data because the people who have patient records (rightly) hesitate to share them. Yet leaks are almost inevitable because there is patchy oversight over who has what data, when.

What does the Goldacre review propose? Instead of emailing millions of patient records to anyone who promises to be good, the records would be stored in a secure data warehouse. An approved research team that wanted to understand, say, the severity of a new Covid variant in vaccinated, unvaccinated and previously infected individuals would write the analytical code and test it on dummy data until it ran successfully. When ready, the code would be submitted to the data warehouse, and the results would be returned. The researchers would never see the underlying data. Meanwhile, the entire research community could see that the code had been deployed and could check, share, reuse and adapt it.
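
To make that workflow concrete, here is a toy sketch of the pattern in Python. Everything in it (the function, the column names, the dummy table) is invented for illustration; real platforms such as OpenSAFELY define their own interfaces:

```python
import pandas as pd

def analysis(records: pd.DataFrame) -> pd.DataFrame:
    """Count patients and hospitalisations by vaccination status."""
    return (records
            .groupby("vaccination_status")["hospitalised"]
            .agg(patients="count", hospitalisations="sum"))

# Researchers iterate locally against dummy data that mimics the schema...
dummy = pd.DataFrame({
    "vaccination_status": ["vaccinated", "unvaccinated", "vaccinated"],
    "hospitalised": [0, 1, 0],
})
print(analysis(dummy))

# ...then submit the code to the warehouse, which runs the same function
# on the real records and returns only the aggregate table. The row-level
# data never leaves the secure environment.
```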

This approach is called a “trusted research environment” or TRE. The concept is not new, says Ed Chalstrey, a research data scientist at The Alan Turing Institute. The Office for National Statistics has a TRE called the Secure Research Service to enable researchers to analyse data from the census safely. Goldacre and his colleagues have developed another, called OpenSAFELY. What is new, says Chalstrey, are the huge data sets now becoming available, including genomic data. Anonymisation is just hopeless in such cases, while the opportunity they present is golden. So the time seems ripe for TREs to be used more widely.

The Goldacre review recommends that the UK should build more trusted research environments, with the fourfold aim of earning the justified confidence of patients; letting researchers analyse data without waiting years for permission; making the checking and sharing of analytical tools something that happens by design; and nurturing a community of data scientists.

The NHS has an enviably comprehensive collection of patient records. But could it build TRE platforms? Or would the government just hand the project wholesale to some tech giant? Top-to-bottom outsourcing would do little for patient confidence or the open-source sharing of academic tools. The Goldacre review declares “there is no single contract that can pass over responsibility to some external machine. Building great platforms must be regarded as a core activity in its own right.”

Inspiring stuff, even if the history of government data projects is not wholly reassuring. But the opportunity is clear enough: a new kind of data infrastructure that would protect patients, turbo-charge research and help build a community of healthcare data scientists that could be the envy of the world. If it works, people will be sending the health secretary notes of appreciation, rather than his own medical records.

Written for and first published in the Financial Times on 1 July 2022.

The paperback of “The Next 50 Things That Made The Modern Economy” is now out in the UK.

“Endlessly insightful and full of surprises — exactly what you would expect from Tim Harford.” – Bill Bryson

“Witty, informative and endlessly entertaining, this is popular economics at its most engaging.” – The Daily Mail

I’ve set up a storefront on Bookshop in the United States and the United Kingdom – have a look and see all my recommendations; Bookshop is set up to support local independent retailers. Links to Bookshop and Amazon may generate referral fees.

July 21, 2022

Your phone’s notification settings and the meaning of life

Switching to a new phone is easy enough these days. The wheezing older model formed a huddle with the shiny oversized new thing, and within a few minutes had effected a near-complete digital handover. One exception was the notification settings. As they reset to the default, my new phone began to beep and buzz incessantly, like the strange offspring of R2-D2 and a cheap vibrator.

A photo app started trying to sell me a print album. A train ticket app prodded me not to forget my upcoming journeys. The Financial Times app urged me to read the latest headlines. More disturbing, Google News installed itself and did the same thing, except for news sources I don’t follow and don’t want to. Most absurd of all, every single incoming email announced itself with a beep and a teasing extract on my home screen. Fortunately, I don’t have social media on my smartphone; I could only imagine the cacophony if I did.

This was all simple enough to fix. Calendar, text messages and phone calls are now the only apps allowed to interrupt me. Still, it was annoying. I wondered: surely everyone switches off most notifications, right? Right?

Perhaps not. I stumbled upon an essay by Guardian columnist Coco Khan marvelling at how much calmer she felt after turning off notifications. She described this peace as completely unexpected, “an unintended consequence of a tiny tweak”. She went on to explain that WhatsApp alone had sent her more than 100 notifications a day and that she had only muted the apps because she’d been on holiday in Bali, and the phone was buzzing all night. As well it might, given that social media notifications were still on. She felt calmer when this stopped. Who could have predicted that?

On the face of it, it is absurd that she was surprised. But it is always easier to be wise about other people. I read Khan’s account as a cautionary tale for all of us. We humans can adapt to a lot; it’s easy to sleepwalk into a state of chronic stress and distraction without ever reflecting that things could be different.

Khan’s experience seems common. One of the most robust findings in behavioural science is that default settings wield an outsize influence over our choices, even when it is trivial to change those defaults. It is no wonder that many apps pester us endlessly, by default. App makers clearly believe we’ll put up with it, and they may be right.

One study, published in 2015 by researchers at the Technical University of Berlin, found that on average six out of seven smartphone apps were left in their default notification settings. Given how many notifications are clearly valueless, this suggests that in the face of endless notifications, many smartphone users have learnt helplessness.

Of course we sometimes want to know immediately when something has happened. As I am fond of saying, a doorbell is more convenient than going to the door every 90 seconds to see if anyone is there, although that trade-off would change if the doorbell itself sounded every few minutes, day and night. But most of us have too many notifications enabled.

“Notification” is a dishonest euphemism, anyway. The correct word is “interruption”, because it prompts the right question: how often do I want my phone to interrupt me?

A 2017 study by Martin Pielot of Telefónica Research and Luz Rello of the Human-Computer Interaction Institute investigated how people felt when their phones were entirely silenced. Pielot and Rello stumbled, revealingly, right at the start. They tried to recruit volunteers to mute everything for a week, but gave up because so few people were willing to do so, and those who were willing would be such outliers as to provide no insight about the rest of us.

So the researchers tried again, with a 24-hour “Do Not Disturb” challenge. All interruptions were blocked, even incoming phone calls. The results were intriguing: people felt less distracted and more productive, but they also felt cut off and worried about being unresponsive. There was no sign that they were less stressed or more relaxed, but perhaps that is not a surprise. It is not completely restful to know that your boss may be infuriated because you are not picking up your phone.

Not many of us can adopt Kraftwerk’s approach: the great electronic band silenced the telephone in their studio. If you wanted to call them, fine. They would answer, but only by prior arrangement and at precisely the agreed time.

There is a happy medium here, I am sure, and it will vary from person to person. But I suspect Kraftwerk are closer to the optimal compromise than are my smartphone defaults. Oliver Burkeman puts it best in his book Four Thousand Weeks: our attention is not just a scarce resource; it is life itself. “At the end of your life, looking back, whatever compelled your attention from moment to moment is simply what your life will have been.” Glance at yet another notification, and you are quite literally paying with your life.

Written for and first published in the Financial Times on 24 June 2022.

The paperback of The Data Detective was published on 1 February in the US and Canada. Title elsewhere: How To Make The World Add Up.

I’ve set up a storefront on Bookshop in the United States and the United Kingdom. Links to Bookshop and Amazon may generate referral fees.

July 14, 2022

Cautionary Tales – South Pole Race: David and Goliath on Ice

1910: Two men are racing one another to be the first to reach the South Pole. Captain Robert Falcon Scott heads a well-financed, technologically advanced expedition – aiming to reach the pole in the “proper” and heroic way… on foot. Roald Amundsen’s effort is more modest, relying on cheap sled dogs to carry him to victory.

Scott – for all his money, for all his fancy equipment, for all his institutional backing – is doomed to failure in the ice wastes of Antarctica. Why?

Cautionary Tales is written by me, Tim Harford, with Andrew Wright. It is produced by Ryan Dilley, with support from Courtney Guarino and Emily Vaughn.

The sound design and original music are the work of Pascal Wyse. Julia Barton edited the scripts.

Thanks to the team at Pushkin Industries, including Mia Lobel, Jacob Weisberg, Heather Fain, Jon Schnaars, Carly Migliori, Eric Sandler, Emily Rostek, Royston Beserve, Maggie Taylor, Nicole Morano, Daniella Lakhan and Maya Koenig.

[Apple] [Spotify] [Stitcher]

Further reading and listening

Stephen Bown The Last Viking

David Crane Scott of the Antarctic

Apsley Cherry-Garrard The Worst Journey in the World

Malcolm Gladwell David and Goliath

Ranulph Fiennes Captain Scott

Roland Huntford The Last Place on Earth

Edward Larson An Empire of Ice

Diana Preston A First Rate Tragedy

Robert Scott The Voyage of the Discovery

Jonathan Karpoff “Public versus Private Initiative in Arctic Exploration” Journal of Political Economy 2001 vol 109

The high price we pay for social media

Sitting exams is unpleasant at the best of times, but my daughter believes she has extra cause to complain. Two of her A-level papers are scheduled for the same time, so she must take a break between them with only an invigilator for company. “I can’t even have my phone,” she protests.

Because I am the worst parent in the world, I opine that it would be very good for her mental health to be without her phone for a couple of hours. She could challenge me to prove it, but more sensibly, she rolls her eyes and walks away.

Ernest Hemingway once declared that “what is moral is what you feel good after and what is immoral is what you feel bad after”. I’m not sure if that stands up to philosophical scrutiny, but I do think it’s worth asking ourselves how often we feel bad after spending time on social media. I usually feel disheartened and a little self-loathing after doomscrolling on Twitter in a way that I never feel after reading a book or a decent magazine.

That’s the experience of a middle-aged man on Twitter. What about the experience of a teenage girl on Instagram? A few months ago the psychologist Jonathan Haidt published an essay in The Atlantic arguing that Instagram was toxic to the mental health of adolescent girls. It is, after all, “a platform that girls use to post photographs of themselves and await the public judgments of others”.

That echoes research by Facebook, which owns Instagram. An internal presentation, leaked last year by Frances Haugen, said: “Thirty-two per cent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse.” In the UK between 2003 and 2018, there was a sharp increase in anxiety, depression and self-harm, and a more modest increase in eating disorders, in people under the age of 21. In absolute terms, all four problems were more common in girls than in boys.

Similar trends can be found in the US and elsewhere in the English-speaking world. And a team of psychologists including Haidt and Jean Twenge has found increases in loneliness reported by 15 and 16-year-olds in most parts of the world. The data often seem to show these problems taking a turn for the worse after 2010.

There are other explanations for an increase in teen anxiety (the 2008 banking crisis; Covid-19 and lockdowns; school shootings; climate change; Donald Trump) but none of them quite fits the broad pattern we observe, in which life started to get worse for teenagers around 2010 in many parts of the world. What does fit the pattern is the widening availability of smartphones.

This sort of broad correlational data is suggestive of a problem, but hardly conclusive. And a large and detailed study by Amy Orben and Andrew Przybylski of the University of Oxford found very little correlation between the amount of time spent on screens and the wellbeing of adolescents. This study seems to me more robust and rigorous than most, with one major weakness: it lumps together all forms of screen time — from Disney+ to Minecraft, TikTok to Wikipedia.

Three recent pieces of analysis approach the subject quite differently. One from Luca Braghieri and two fellow economists looks at the campus-by-campus rollout of Facebook across US colleges between early 2004, when it was launched at Harvard, and late 2006, when it was made available to the general public. Because this rollout is sharply staggered, it creates a quasi-randomised trial, which is a better source of data than broad correlations.
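
To see why the staggered timing helps, here is a toy version of the design, with synthetic data and a made-up effect size rather than anything from the Braghieri study: campus and semester fixed effects absorb campus quirks and shared trends, so the estimated Facebook effect comes only from the moment access switches on.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for college in range(40):
    rollout = rng.integers(1, 8)        # semester when Facebook arrives
    campus = rng.normal(0, 1)           # fixed campus-level differences
    for period in range(8):
        treated = int(period >= rollout)
        score = campus + 0.2 * period - 0.5 * treated + rng.normal(0, 1)
        rows.append({"college": college, "period": period,
                     "treated": treated, "mental_health": score})

df = pd.DataFrame(rows)
fit = smf.ols("mental_health ~ treated + C(college) + C(period)", df).fit()
print(fit.params["treated"])  # recovers roughly the true effect of -0.5
```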

The researchers find a large negative effect of the launch of Facebook on mental health — somewhere between one-fifth and one-quarter as bad as the effect of losing one’s job. The Facebook of around 2005 is not the same as the social media of today: it was probably less addictive and less intrusive, and was not available on smartphones. If it was bad then, one wonders about the impact of social media now.

The other two studies were charmingly simple: they asked experimental participants, chosen at random, to switch off social media for a while, as a control group continued as before. The larger study by Hunt Allcott, Braghieri and others asked people to quit Facebook for four weeks during the 2018 midterm US elections. A smaller but more recent study by researchers at the University of Bath had people eschewing all social media for a week.

The results in both cases were striking, with clear improvements in a variety of measures of happiness, wellbeing, anxiety and depression. It seems that a break from social media is good for your soul.

Intriguingly, the largest effect of all in the Allcott and Braghieri study is that people who had temporarily left Facebook for the experiment were much less likely to use it afterwards.

I don’t know whether a two-hour break from her phone really would be good for my daughter’s mental health. Nor do I think the wellbeing case against social media is proven beyond doubt. But that should not be a surprise. It took time to demonstrate that cigarettes caused lung cancer. If social media causes depression and anxiety, it will take time to demonstrate that, too. But at this stage, one has to wonder.

Written for and first published in the Financial Times on 17 June 2022.

The paperback of “The Next 50 Things That Made The Modern Economy” is now out in the UK.

“Endlessly insightful and full of surprises — exactly what you would expect from Tim Harford.” – Bill Bryson

“Witty, informative and endlessly entertaining, this is popular economics at its most engaging.” – The Daily Mail

I’ve set up a storefront on Bookshop in the United States and the United Kingdom – have a look and see all my recommendations; Bookshop is set up to support local independent retailers. Links to Bookshop and Amazon may generate referral fees.

July 7, 2022

Learning to think well involves hearts as well as minds

What does it mean to “learn how to think”? Is it a matter of learning some intellectual skills such as fluent reading, logic and clear expression? Does it require familiarity with some canonical texts or historical facts? Perhaps it’s all about correcting certain biases that cloud our judgment? I recently read a thought-provoking essay by the psychologist Barry Schwartz, best known for his book The Paradox of Choice.

Writing a few years ago in The Chronicle of Higher Education, Schwartz argued that one of the goals of a university education, especially a liberal arts education, is to teach students how to think. The trouble is, said Schwartz, “nobody really knows what that means”.

Schwartz proposes his own ideas. He is less interested in cognitive skills than in intellectual virtues.

“All the traits I will discuss have a fundamental moral dimension,” he says, before setting out the case for nine virtues: love of truth; honesty about one’s own failings; fair-mindedness; humility and a willingness to seek help; perseverance; courage; good listening; perspective-taking and empathy; and, finally, wisdom — the word Schwartz uses to describe not taking any of these other virtues to excess.

One only has to flip the list to see Schwartz’s point. Imagine a person who is hugely knowledgeable and brilliantly rational, yet who falls short on these virtues, being indifferent to truth, in denial about their own errors, prejudiced, arrogant, easily discouraged, cowardly, dismissive, narcissistic and prone to every kind of excess. Could such a person really be described as knowing how to think? They would certainly not be the kind of person you’d want to put in charge of anything.

“My list was meant to start the conversation, not end it,” Schwartz told me. So I sent his list to some people I respect, both in and adjacent to academia, to see what they made of it. The reaction was much the same as mine: almost everyone liked the idea of intellectual virtues, and almost everyone had their own ideas about what was missing.

The Cambridge statistician Sir David Spiegelhalter raised the idea of intellectual variety, since working on disparate projects was often a source of insight. Hetan Shah, chief executive of the British Academy, suggested that this variety, and in particular the ability to see the connection between different parts of a system, was the most important intellectual virtue. He also argued for a sense of humour: if we can’t play with ideas, even dangerous ideas, we are missing something.

Dame Frances Cairncross has chaired several notable academic institutions. She suggested that if one accepted the premise that intellectual virtues were also moral virtues, a greater one was “humanity . . . a sympathy for the human condition and a recognition of human weakness”. She also suggested the virtue of “getting stuff done”, noting the line from the Book of Common Prayer, “we have left undone those things which we ought to have done.” True enough. What would be the value of having all these intellectual virtues if we did not exercise them, and instead spent our days munching popcorn and watching TV?

Tom Chatfield, author of How To Think, mentioned persuasiveness. What is the point of thinking clearly if you cannot help anyone else to do likewise? This is fair, although persuasiveness is perhaps the intellectual virtue that most tempts us into the vices of arrogance, partisanship and an unbalanced treatment of the facts.

Almost everyone raised the omission that was much on my mind: curiosity. Curiosity was not on Schwartz’s list, except perhaps by implication. But curiosity is one of the central intellectual virtues. Curiosity implies some humility, since it is an acknowledgment that there is something one doesn’t yet understand. Curiosity implies open-mindedness and a quest to enlarge oneself. It is protective against partisanship. If we are curious, many other intellectual problems take care of themselves. As Orson Welles put it about the film-going audience: “Once they are interested, they understand anything in the world.”

Very good. Range, systemic thinking, humanity, humour, getting things done, persuasiveness, curiosity. Other plausible virtues were suggested, too; alas, this columnist must also display the virtue of brevity.

But one of my correspondents had a sharply different response to Schwartz’s emphasis on explicitly moral intellectual virtues — tellingly, the one most actively involved in teaching. Marion Turner, professor of English literature at Oxford University, put it frankly: “I’m not trained to teach students how to be good people, and that’s not my job.”

It’s a fair point. It is very pleasant to make a list of intellectual virtues, but why should we believe that academics can teach students courage, humility or any other virtue? Yet if not academics, then who? Parents? Primary schoolteachers? Newspaper columnists? Perhaps we should just hope that people acquire these virtues for themselves? I am really not sure.

Barry Schwartz is on to something, that is clear. Facts, logic, quantitative tools and analytical clarity are all very well, but the art of thinking well requires virtues as well as skills. And if we don’t know who will teach those virtues, or how to teach them, that explains a lot about the world in which we now live.

Written for and first published in the Financial Times on 10 June 2022.

The paperback of The Data Detective was published on 1 February in the US and Canada. Title elsewhere: How To Make The World Add Up.

I’ve set up a storefront on Bookshop in the United States and the United Kingdom. Links to Bookshop and Amazon may generate referral fees.

June 30, 2022

Cautionary Tales – Chicago when it sizzles

July 1995. A deadly heat wave gripped Chicago – bridges buckled, the power grid failed and the morgue ran out of space – but some neighbourhoods saw more deaths than others. Of course, richer and leafier districts suffered less, but poor places where social interaction was difficult and loneliness a problem were hit hardest of all.

Does the Chicago heat wave teach us that in dealing with climate change we need to consider not just physical infrastructure, but social infrastructure too?  

Cautionary Tales is written by me, Tim Harford, with Andrew Wright. It is produced by Ryan Dilley, with support from Courtney Guarino and Emily Vaughn.

The sound design and original music are the work of Pascal Wyse. Julia Barton edited the scripts.

Thanks to the team at Pushkin Industries, including Mia Lobel, Jacob Weisberg, Heather Fain, Jon Schnaars, Carly Migliori, Eric Sandler, Emily Rostek, Royston Beserve, Maggie Taylor, Nicole Morano, Daniella Lakhan and Maya Koenig.

[Apple] [Spotify] [Stitcher]

Further reading and listening

Eric Klinenberg Heat Wave

Mike Thomas “Chicago’s Deadly 1995 Heat Wave: An Oral History.” Chicago Magazine

Judy Pasternak “Heat Wave’s Final Chapter is Cold, Lonely” LA Times, 26 August 1995

Jane Jacobs Dark Age Ahead

Mike Royko “KILLER HEAT WAVE OR A MEDIA EVENT?” Chicago Tribune 18 July 1995

Malcolm Gladwell “Political Heat” The New Yorker 4 August 2002

Kopp, Buzan and Huber “The Deadly Combination of Heat and Humidity” The New York Times 7 June 2015

Alexandra Witze “Racism is magnifying the deadly impact of rising city heat” Nature 14 July 2021

Even when you do succeed, sometimes it pays to try again

If at first you don’t succeed, goes the old saying, try, try again. Good advice, up to a point. But let me offer a modification: even when you do succeed, try, try again. Tempting as it is to declare victory and move on, in many endeavours there is much to be said for rethinking an apparently satisfactory formula.

Consider the advice for job interviewers in Talent, a new book by economist Tyler Cowen and venture capitalist Daniel Gross. They suggest asking a routine question, such as “give me an example of when you resolved a difficult challenge at work”. Then ask for another example. And another.

The pat answers will be exhausted quickly, and the candidate will have to start improvising, digging deep — or perhaps admit to being stumped. “If the candidate really does have 17 significant different work triumphs,” write Cowen and Gross, “maybe you do want to hear about what number 17 looks like.”

One way to describe this tactic is that the interviewer is asking for answers in parallel rather than answers in series. Instead of stringing together a logical sequence of 17 questions, the interviewer is asking for 17 different answers to the same question.

It is counterintuitive advice, but the logic is clear enough and it seems simple to execute. So why don’t we do it? First, we feel uncomfortable. Second, demanding 17 different answers to the same question may seem silly or fruitless.

While the approach is unconventional in job interviews, it is common practice among designers. They will often produce several distinct attempts to meet a given brief, rather than immediately focusing on what seems to be the best idea. In doing so, the designers force themselves to explore the full range of possibilities, to avoid the risk of committing too early to a concept that seems attractive but which may eventually be a dead end.

Researchers Steven Dow, Alana Glassco and others at Stanford University explored this idea by asking experiment participants to use simple software to design a web advert for a magazine. Half of the participants worked in series: they sketched five prototype adverts, receiving feedback after each one. The other half worked in parallel: they sketched three prototypes, then received feedback on all three and then sketched two more before receiving feedback.

Dow and his colleagues asked experts to rate the quality of the final adverts and tested them on the internet, measuring click-through rates. They rated the diversity of the adverts and they also asked the participants for their confidence after finishing the process. In every respect, the parallel process ads were superior: the final designs looked better and earned more clicks; the initial sketches covered a greater range of ideas; and the budding amateur designers gained in confidence as a result of prototyping in parallel.

A striking example of parallel design is the creation of the Windows 95 start-up sound. Microsoft was looking for an opportunity to show off the burgeoning audio capabilities of the computers of the day, so somewhat implausibly it commissioned Brian Eno, whose previous collaborators included David Bowie, Talking Heads, U2, Devo and Roxy Music.

Eno recalls receiving a brief, requesting music that was “inspirational, sexy, driving, provocative, nostalgic, sentimental . . . there were about 150 adjectives. And then at the bottom it said, ‘and not more than 3.8 seconds long’.”

Eno describes himself as being “completely bereft of ideas” at the time. He found the brief both hilarious and inspiring. In the end he composed more than 80 tiny pieces of music. The final result was a musical signature that has stood the test of time, and a liberated Eno. “It really broke a logjam in my own work,” he told the San Francisco Chronicle.

No doubt that was partly a response to the tight constraint of a 3.8-second piece, but surely it was also the creative response to trying to produce the 83rd composition. When it would have been easy to put his nose to the mixing desk, obsessing over the tiniest variations of tone and timing, Eno forced himself to explore the possibilities.

Bill Burnett and Dave Evans, in their delightful book Designing Your Life, suggest an exercise in which you sketch out a vision for the next five years of your life. What will you be doing? Where will you live and with whom? Are you hoping to run a marathon? Start a business? Write a novel? Get married?

This is often a straightforward act of imagination, but what makes the exercise excruciating is what comes next: Burnett and Evans want you to do it again, only this time, it’s different — the idea at the heart of the plan is completely forbidden. Back to the drawing board. And then a third time.

I’ve tried this myself and seen others try it. People squirm. They protest. Sometimes they cry. And then, sooner or later, the ideas start pouring out. We contain multitudes, all of us. But we don’t always let them see the light of day. Perhaps we should try producing answers in parallel more often. Even when you do succeed, try, try again.

Written for and first published in the Financial Times on 3 June 2022.

The paperback of The Data Detective was published on 1 February in the US and Canada. Title elsewhere: How To Make The World Add Up.

I’ve set up a storefront on Bookshop in the United States and the United Kingdom. Links to Bookshop and Amazon may generate referral fees.

June 23, 2022

Why does it feel good to do good?

“It is not from the benevolence of the butcher, the brewer, or the baker that we expect our dinner,” wrote Adam Smith, famously, in The Wealth of Nations, “but from their regard to their own interest. We address ourselves, not to their humanity but to their self-love.”

True enough. And yet my recent experience is that there is much to be said for addressing ourselves not to people’s self-love but to their humanity. I recently posted a Twitter thread telling people what was on my mind. I explained that my father Adrian had died. I posted photographs and described his life: his curiosity, his intelligence, his shy modesty. I told how my father had devoted himself to the care of my dying mother in the 1990s, and had somehow held down his job, kept his children attending school and made sure there was food on the table. And I described the sensitive care my father and mother had both received at the Florence Nightingale hospice in Aylesbury. And, finally, I asked people to consider giving money to the hospice.

People are kind, so I wasn’t surprised to get a warm response. What I did not expect was to receive anonymous donations of three or even four figures. It seemed a lot of money to give incognito to a local charity in a place you might never visit, in memory of a man you probably never met.

Economists have a number of theories to explain why anyone gives to a charitable cause. The most cynical — true sometimes, clearly false in this case — is that people are ostentatiously demonstrating their generosity and their riches.

At the other end of the spectrum is “pure altruism”. Just as rational consumers maximise their gains as savvy shoppers, picking up the best products at the cheapest possible price, pure altruists also seek the biggest impact for their spending. The difference is merely that pure altruists are aiming to maximise the utility of other people. That doesn’t quite seem to cover it either. There is a community of “effective altruists” out there, but they tend to prefer hard evidence, not memorial threads on Twitter.

The economists Dean Karlan and Daniel Wood have shown there is a tension between evidence and emotion. They tested out fundraising mailshots with a tear-jerking story about a named beneficiary: “She’s known nothing but abject poverty her entire life.” Others got the same emotive tale alongside a paragraph attesting to the “rigorous scientific methodologies” that demonstrated the charity’s impact. Karlan and Wood found that some people who’d previously given big donations came back and gave even more, impressed by the evidence of effectiveness. But smaller donors gave less. Apparently, the scientific evidence turned them off.

Perhaps they were giving because of what the economist James Andreoni calls the “warm glow”, and John List, another economist, terms “impure altruism”. Warm-glow giving is motivated by altruism of a fuzzier kind. Rather than calculating the most effective target for our donations, instead we give because it feels good to believe we’re doing good.

Because warm-glow giving is emotional rather than rational, it raises the question of how to persuade people to get themselves in the mood to donate. Nobody was better at this game than Charles Sumner Ward, who in the late 19th and early 20th centuries went on a hot streak raising money for the YMCA, the Boy Scouts, Masonic Temples and other employers of his formidable talents.

Ward deployed tactics that now seem very modern, including artificial deadlines, large donors who pledged funds only if they were matched by smaller donations, publicity stunts, a campaign clock showing progress towards an often-arbitrary goal and little wearable flags that donors could display. Some of these ideas are now proven to increase donations, but social scientists continue to ask what makes people give.

Cynthia Cryder and George Loewenstein have found that tangibility matters. People give more generously if they have first been asked to pick a charity from a list than if they’re shown the list and asked first to choose a donation amount, then to pick the charity to receive that donation. They also donate more if given specific examples of projects the charity does, rather than a more generic description. Being able to clearly picture how the money would be spent induced people to open their wallets.

Perhaps this explains why people were so generous. I was very specific about my father’s life, my parents’ deaths and the way this particular hospice had helped them. Rather than donating to an abstract ideal, people were giving money to something they could picture clearly.

Dean Karlan prompted me to consider one other thing: that people who regularly read my column or listen to my podcast have a relationship with me, and my thread on Twitter created an opportunity for them to mark that relationship with compassion and generosity.

Whatever the reason, I am grateful. And if this column prompts a warm glow, indulge yourself. Find a charity that means something to you, and give something in memory of someone who mattered to you. The altruism may be “impure”, but to do good feels good.

Written for and first published in the Financial Times on 27 May 2022.

June 16, 2022

Cautionary Tales – The French Knight’s Guide to Corporate Culture

France 1346: The army of King Philip VI is Europe’s pre-eminent killing machine. It is used to crushing any force stupid enough to oppose it, and now fully expects to annihilate a motley band of English invaders on a field near the village of Crecy.

Except as night falls, it is Philip’s army that lies broken and bleeding in the mud. What went wrong? The French knights, it seems, had failed to update their corporate culture.

Cautionary Tales is written by me, Tim Harford, with Andrew Wright. It is produced by Ryan Dilley, with support from Courtney Guarino and Emily Vaughn.

The sound design and original music are the work of Pascal Wyse. Julia Barton edited the scripts.

Thanks to the team at Pushkin Industries, including Mia Lobel, Jacob Weisberg, Heather Fain, Jon Schnaars, Carly Migliori, Eric Sandler, Emily Rostek, Royston Beserve, Maggie Taylor, Nicole Morano, Daniella Lakhan and Maya Koenig.

[Apple] [Spotify] [Stitcher]

Further reading and listening

Thomas Schelling Micromotives and Macrobehavior

Warren Ellis Crecy

Lynn White Jr. Medieval Technology and Social Change

Andrew Ayton and Philip Preston The Battle of Crecy

Saul David Military Blunders

Julian Humphrys “The Battle of Crecy” History Extra

Boris Groysberg, Jeremiah Lee, Jesse Price and J. Yo-Jud Cheng “The Leader’s Guide to Corporate Culture” Harvard Business Review

Marty Baron speech at the Reuters Institute
