Kindle Notes & Highlights
Are the underlying data accessible? Has the performance of the algorithm been assessed rigorously – for example, by running a randomised trial to see if people make better decisions with or without algorithmic advice? Have independent experts been given a chance to evaluate the algorithm? What have they concluded? We should not simply trust that algorithms are doing a better job than humans, and nor should we assume that if the algorithms are flawed, the humans would be flawless.
As a candidate for US President in 2016, Donald Trump faced a problem. His campaign wanted to claim that the American economy was broken, but official statistics showed that the unemployment rate was very low – below 5 per cent and falling. There could have been a thoughtful response to that – for example, that the unemployment rate doesn’t measure the quality, security or earning power of jobs. But Mr Trump took the simpler path of repeatedly dismissing unemployment figures as ‘phony’ and ‘total fiction’ and claiming that the true rate was 35 per cent. Simply inventing your own numbers is a …
In 2009, the shock of the global financial crisis was followed by the realisation that Greece had been underplaying its borrowing for years. Nobody believed its debts could be repaid. The EU and IMF stepped in with the customary mix of a bailout and some brutal austerity, and the Greek economy collapsed.
As we saw in the previous chapter, public scrutiny is vital. It’s what distinguishes science from alchemy. If statistics are published and designed to be accessible to all, they can be analysed and examined by academics, policy wonks and indeed anybody with a bit of time and access to a computer. Errors can be identified and corrected.
The UK’s statistical system, now reformed, has spent a quarter of a century trying to recover its reputation. That has taken time and hard work, because trust is easy to throw away and hard to regain. Still, the UK’s Office for National Statistics is more trusted than comparable organisations such as the Bank of England, the courts, the police and the civil service – and vastly more trusted than politicians or the media.46
19 – the number of words in the preceding sentence. I suppose this brightens up a page loaded with text, but it’s hardly an insightful use of ink. Also, the correct number is twenty-one. Never let zippy design distract you from the possibility that the underlying numbers simply might be wrong.
Say it with Charts, the bible of management consultants, makes this process very clear. First, says author Gene Zelazny, decide what you want to say with a graph. Once you’ve decided what you want to say, that suggests a particular kind of comparison. That, in turn, suggests a particular choice of graph – such as a scatter plot, a line graph, a stacked bar chart or a pie chart.* Finally, underline your message by sticking it in the graph title. Don’t just write ‘Number of contracts, January-August’. Write something like ‘The number of contracts has increased’ or perhaps ‘The number of …
In the 1870s, Parliament passed several public health acts. Death rates in the UK began to fall, and life expectancy to rise. What makes Florence Nightingale’s story so striking is that she was able to see that statistics could be tools and weapons at the same time. She appreciated the importance of solid foundations such as the tedious tasks of standardising definitions and getting everyone to fill in the right forms, and of producing ‘the dryest of all’ analyses, impervious to attack from the critics.
Florence Nightingale was on the right side of history, but many of the people who misuse catchy graphics are not. For those of us on the receiving end of beautiful visualisations, everything we’ve learned so far in this book applies. First – and most important, since the visual sense can be so visceral – check your emotional response. Pause for a moment to notice how the graph makes you feel: triumphant, defensive, angry, celebratory? Take that feeling into account. Second, check that you understand the basics behind the graph. What do the axes actually mean? Do you understand what is being …
One of the reasons facts don’t always change our minds is that we are keen to avoid uncomfortable truths. These days, of course, we don’t need to mess around with a static-reducing button. On social media we can choose who to follow and who to block. A vast range of cable channels, podcasts and streaming video lets us decide what to watch and what to ignore. We have more such choices than ever before, and you can bet that we’ll use them.
So one might have hoped for accuracy. But no – the subjects flattered themselves hopelessly. If they put some event at a 25 per cent likelihood, and then it happened, they might then remember they’d called it as a 50/50 proposition. If a subject had put a 60 per cent probability on an event which later failed to happen, she might later recall that she’d forecast a 30 per cent probability. The Fischhoff-Beyth paper was titled ‘I knew it would happen’.
Tetlock published his conclusions in 2005, in a subtle and scholarly book, Expert Political Judgment. He found that his experts were terrible forecasters. This was true in both the simple sense that the forecasts failed to materialise and in the deeper sense that the experts had little idea of how confident they should be in making forecasts in different contexts.
The instinctive starting point is to think about the couple. It’s always hard to imagine divorce in the middle of the romance of a wedding day (although sharing a whisky with the bride’s ex-boyfriend may shake you out of that rosy glow) but you’d naturally ponder questions such as: ‘Do they seem happy and committed to each other?’; ‘Have I ever seen them argue?’; and ‘Have they split up and got back together three times already?’ In other words, we make a forecast with the facts that are in front of our nose.
Which points to the fourth and perhaps most crucial element: superforecasting is a matter of having an open-minded personality. The superforecasters are what psychologists call ‘actively open-minded thinkers’ – people who don’t cling too tightly to a single approach, are comfortable abandoning an old view in the light of fresh evidence or new arguments, and embrace disagreements with others as an opportunity to learn.
Forget what the economy is doing; just find well-managed companies, buy some shares, and don’t try to be too clever. And if that approach sounds familiar, it’s most famously associated with Warren Buffett, the world’s richest investor – and a man who loves to quote John Maynard Keynes.
This book has argued that it is possible to gather and to analyse numbers in ways that help us understand the world. But it has also argued that very often we make mistakes not because the data aren’t available, but because we refuse to accept what they are telling us. For Irving Fisher, and for many others, the refusal to accept the data was rooted in a refusal to acknowledge that the world had changed.
I’ve laid down ten statistical commandments in this book. First, we should learn to stop and notice our emotional reaction to a claim, rather than accepting or rejecting it because of how it makes us feel. Second, we should look for ways to combine the ‘bird’s eye’ statistical perspective with the ‘worm’s eye’ view from personal experience. Third, we should look at the labels on the data we’re being given, and ask if we understand what’s really being described. Fourth, we should look for comparisons and context, putting any claim into perspective. Fifth, we should look behind the statistics at …