Kindle Notes & Highlights
Read November 10, 2017
Only then will they ask the right questions—like, “Is our training data representative of a range of skin tones?” and “Does our product fail more often for certain kinds of images?”—and, critically, figure out how to adjust the system as a result.
That’s what concerns researchers from Boston University and Microsoft Research about artificial intelligence based on word embeddings. In a paper titled “Man Is to Computer Programmer as Woman Is to Homemaker? Debiasing Word Embeddings,” they argue that because word embeddings frequently underpin a range of other machine-learning systems, they “not only reflect such stereotypes but can also amplify them” 29—effectively bringing the bias of the original data set to new products and new data sets. So much for machines being neutral.
They demonstrate a method for algorithmically debiasing word embeddings, ensuring that gender-neutral words, like “nurse,” are not embedded closer to women than to men—without breaking the appropriate gender connection between words like “man” and “father.” They also argue that the same could be done with other types of stereotypes, such as racial bias.
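The approach the paper describes is often called "hard debiasing." Below is a minimal sketch of its neutralize step in Python with NumPy, under loose assumptions: the tiny four-dimensional vectors are invented purely for illustration (a real system would use trained embeddings such as word2vec or GloVe), and the function names are mine, not the paper's.

import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

# Hypothetical toy embeddings, invented for illustration only.
embeddings = {
    "he":    normalize(np.array([ 0.9, 0.1, 0.3, 0.2])),
    "she":   normalize(np.array([-0.9, 0.1, 0.3, 0.2])),
    "man":   normalize(np.array([ 0.8, 0.2, 0.1, 0.4])),
    "woman": normalize(np.array([-0.8, 0.2, 0.1, 0.4])),
    "nurse": normalize(np.array([-0.5, 0.6, 0.2, 0.3])),  # leans toward "she"
}

# 1. Estimate a gender direction from a definitional pair.
gender_direction = normalize(embeddings["he"] - embeddings["she"])

def neutralize(vector, direction):
    # Remove the component of the vector that lies along the gender
    # direction, so the word sits equidistant from both ends of that axis.
    projection = np.dot(vector, direction) * direction
    return normalize(vector - projection)

# 2. Neutralize a gender-neutral occupation word.
debiased_nurse = neutralize(embeddings["nurse"], gender_direction)

print("nurse vs. gender axis, before:",
      round(float(np.dot(embeddings["nurse"], gender_direction)), 3))
print("nurse vs. gender axis, after: ",
      round(float(np.dot(debiased_nurse, gender_direction)), 3))

The point of the sketch is that only the component of "nurse" along the estimated gender axis is removed; definitionally gendered words like "man" and "woman" are left untouched, so their relationship to words like "father" is preserved.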
Biased algorithms. Alienating online forms. Harassment-friendly platforms. All kinds of problems plague digital products, from tiny design details to massively flawed features. But they share a common foundation: a tech culture that’s built on white, male values—while insisting it’s brilliant enough to serve all of us. Or, as they call it in Silicon Valley, “meritocracy.”
Across the board, diverse groups were more careful with details than were homogeneous groups, and more open to conversation. When white participants were in diverse groups rather than homogeneous ones, they were more likely to cite facts (rather than opinions), and they made fewer errors, the study found.
Just look at messaging app Slack, a darling of the startup world with an office motto that’s refreshingly healthy: “Work hard and go home.”

