Kindle Notes & Highlights
The counterintuitive result is that presenting people with a detailed and balanced account of both sides of the argument may actually push people away from the centre rather than pull them in. If we already have strong opinions, then we’ll seize upon welcome evidence, but we’ll find opposing data or arguments irritating. This ‘biased assimilation’ of new evidence means that the more we know, the more partisan we’re able to be on a fraught issue.
This isn’t just about Kickstarter, of course. Such bias is everywhere. Most of the books people read are bestsellers – but most books are not bestsellers, and most book projects never become books at all. There’s a similar tale to tell about music, films and business ventures. Even cases of Covid-19 are subject to selective attention: people who feel terrible go to hospital and are tested for the disease; people who feel fine stay at home. As a result, the disease looks even more dangerous than it really is.
Scientists sometimes call this practice ‘HARKing’ – HARK is an acronym for Hypothesising After Results Known. To be clear, there’s nothing wrong with gathering data, poking around to find the patterns and then constructing a hypothesis. That’s all part of science. But you then have to get new data to test the hypothesis. Testing a hypothesis using the numbers that helped form the hypothesis in the first place is not OK.15
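The trap described above can be simulated directly. The sketch below (all data here is made-up noise, generated for illustration) hunts through dozens of random variables for the one most correlated with an equally random outcome, then re-tests that "finding" on fresh data, which is what an honest test of the hypothesis requires:

```python
# A toy demonstration of why HARKing misleads: scan pure-noise data for the
# strongest pattern, then check whether the "discovery" survives new data.
import random

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def hark_once(n_vars=40, n=100, seed=0):
    rng = random.Random(seed)
    # The outcome and every candidate variable are independent noise:
    # by construction there is NO real relationship to find.
    outcome = [rng.gauss(0, 1) for _ in range(n)]
    variables = [[rng.gauss(0, 1) for _ in range(n)] for _ in range(n_vars)]
    # HARKing: rummage through the data for the most impressive correlation...
    best = max(variables, key=lambda v: abs(pearson(v, outcome)))
    found = abs(pearson(best, outcome))
    # ...then do it properly: gather fresh data and re-test the hypothesis.
    new_outcome = [rng.gauss(0, 1) for _ in range(n)]
    new_sample = [rng.gauss(0, 1) for _ in range(n)]
    replication = abs(pearson(new_sample, new_outcome))
    return found, replication
```

Averaged over many runs, the correlation "found" by searching is far larger than what the honest re-test shows, even though every number involved is noise. Picking the best of forty coincidences guarantees an impressive-looking coincidence.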
Reading her book was less fun, because the incompetence and injustice she described were so depressing – from the makers of protective vests for police officers who forgot that some officers have breasts, to the coders of a ‘comprehensive’ Apple health app who overlooked that some iPhone users menstruate.5 Her book argues that all too often, the people responsible for the products and policies that shape our lives implicitly view the default customer – or citizen – as male. Women are an afterthought.
Algorithms trained largely on pale faces and male voices, for example, may be confused when they later try to interpret the speech of women or the appearance of darker complexions. This is believed to help explain why Google photo software confused photographs of people with dark skin with photographs of gorillas; Hewlett Packard webcams struggled to activate when pointing at people with dark skin tones; and Nikon cameras, programmed to retake photographs if they thought someone had blinked during the shot, kept retaking shots of people from China, Japan or Korea, mistaking the distinctive shape of their eyes for a blink.
The ‘winter detector’ problem is common in big data analysis. A literal example, via computer scientist Sameer Singh, is the pattern-recognising algorithm that was shown many photos of wolves in the wild, and many photos of pet husky dogs. The algorithm seemed to be really good at distinguishing the two rather similar canines; it turned out that it was simply labelling any picture with snow as containing a wolf.
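The wolf-and-snow story can be boiled down to a few lines of code. This is a deliberately crude sketch with invented data: a one-rule "classifier" that picks whichever single feature best separates the training labels, and so latches onto the background rather than the animal:

```python
# A toy version of the wolf/husky mix-up: the learner picks the one feature
# that best separates the training photos -- and that feature is the snow.
# All feature names and "photos" here are made up for illustration.

def train_one_rule(photos):
    """Return the single feature that best predicts 'wolf' in training data."""
    features = photos[0].keys() - {"label"}
    def accuracy(feature):
        hits = sum(p[feature] == (p["label"] == "wolf") for p in photos)
        return hits / len(photos)
    return max(features, key=accuracy)

# Training set: every wolf photo happens to contain snow; no husky photo does.
# Both animals have pointed ears, so ears carry no signal in this sample.
train = [
    {"label": "wolf",  "snow": True,  "pointed_ears": True},
    {"label": "wolf",  "snow": True,  "pointed_ears": True},
    {"label": "husky", "snow": False, "pointed_ears": True},
    {"label": "husky", "snow": False, "pointed_ears": True},
]

rule = train_one_rule(train)          # learns to look at the snow
husky_in_snow = {"snow": True, "pointed_ears": True}
prediction = "wolf" if husky_in_snow[rule] else "husky"
```

On its own training photos the rule scores 100 per cent; shown a husky standing in snow, it confidently answers "wolf". The accuracy was real, but it measured the weather, not the dog.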
When the woman contacted Tesco to complain, the company representatives concluded that it wasn’t their job to break the bad news that her husband was cheating on her, and went for the tactful white lie. ‘Indeed, madam? A computer error? You’re quite right, that must be the reason. We are so sorry for the inconvenience.’ Fry tells me that this is now the rule of thumb at Tesco: apologise, and blame the computer.
Remember the ‘Math is Racist’ headline? I’m fairly confident that maths isn’t racist. Neither is it misogynistic, or homophobic, or biased in other ways. But I’m just as confident that some humans are. And computers trained on our own historical biases will repeat those biases at the very moment we’re trying to leave them behind us.
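How a computer "repeats" a historical bias can be shown with a minimal sketch. The data and the nearest-match "model" below are both hypothetical, but the mechanism is the general one: a model that imitates past decisions inherits whatever pattern those decisions contained:

```python
# A minimal, hypothetical sketch of bias inherited from training data.
# The "model" simply copies the past decision for the most similar past case.

def train_mimic(history):
    """Return a predictor that imitates the closest-matching historical case."""
    def predict(candidate):
        def distance(row):
            return sum(row[k] != candidate[k] for k in candidate)
        return min(history, key=distance)["hired"]
    return predict

# Invented historical decisions: equally qualified candidates,
# but group B was never hired in the past.
history = [
    {"qualified": True,  "group": "A", "hired": True},
    {"qualified": True,  "group": "B", "hired": False},
    {"qualified": False, "group": "A", "hired": False},
    {"qualified": False, "group": "B", "hired": False},
]

model = train_mimic(history)
# The model was never told to discriminate -- it just learned the old pattern.
hired_a = model({"qualified": True, "group": "A"})
hired_b = model({"qualified": True, "group": "B"})
```

The code contains no rule about groups at all; the bias lives entirely in the historical outcomes it was asked to reproduce, which is exactly why such systems repeat old prejudices at the moment we are trying to leave them behind.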
The state of Illinois introduced just such an algorithm, called Rapid Safety Feedback. It analysed data on each report, compared them to the outcomes of previous cases, and produced a percentage prediction of the child’s risk of death or serious harm. The results were not impressive. The Chicago Tribune reported that the algorithm gave 369 children a 100 per cent chance of serious injury or death. No matter how dire the home environment, that degree of certitude seems unduly pessimistic. It could also have grave implications: a false allegation of child neglect or abuse could have terrible consequences for the family involved.