Prejudice By Numbers

In an excerpt from The Formula, Luke Dormehl raises concerns about law enforcement’s increased use of algorithms:

As slashed budgets lead to increased staff cuts, automated systems have moved from simple administrative tools to become primary decision-makers …

The central problem once again comes down to the spectral promise of algorithmic objectivity. “We are all so scared of human bias and inconsistency,” says Danielle Citron, professor of law at the University of Maryland. “At the same time, we are overconfident about what it is that computers can do.” The mistake, Citron suggests, is that we “trust algorithms, because we think of them as objective, whereas the reality is that humans craft those algorithms and can embed in them all sorts of biases and perspectives.”

To put it another way, a computer algorithm might be unbiased in its execution, but, as noted, that does not mean no bias is encoded within it. Implicit or explicit biases might be the work of one or two human programmers, or else come down to technological difficulties. For example, algorithms used in facial recognition technology have in the past shown higher identification rates for men than for women, and for individuals of non-white origin than for whites. An algorithm might not target an African-American male out of overt prejudice, but if it is more likely to target him than a white female, the end result is no different.
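
To make that last point concrete, here is a minimal sketch in Python. The false-match rates and group labels below are invented for illustration and do not come from Dormehl’s book; the point is only that a procedure applied identically to everyone can still produce skewed outcomes when differing error rates are baked into it.

import random

# Hypothetical false-match rates for a face-recognition watchlist check.
# These numbers are invented for illustration, loosely echoing the excerpt's
# claim of higher identification rates for some groups than for others.
FALSE_MATCH_RATE = {
    "non-white male": 0.08,
    "white female": 0.02,
}

def watchlist_check(group: str, rng: random.Random) -> bool:
    """The same procedure runs for every person: flag them as a 'match'
    with whatever false-match probability the system has for their group."""
    return rng.random() < FALSE_MATCH_RATE[group]

def flag_rate(group: str, n: int = 100_000, seed: int = 0) -> float:
    """Fraction of n innocent people from `group` who get wrongly flagged."""
    rng = random.Random(seed)
    return sum(watchlist_check(group, rng) for _ in range(n)) / n

if __name__ == "__main__":
    for group in FALSE_MATCH_RATE:
        print(f"{group}: {flag_rate(group):.1%} wrongly flagged")

With these made-up numbers, both groups pass through exactly the same code path, yet one is wrongly flagged roughly four times as often. That is the gap between an unbiased procedure and an unbiased outcome.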