That’s what concerns researchers from Boston University and Microsoft Research about artificial intelligence based on word embeddings. In a paper titled “Man Is to Computer Programmer as Woman Is to Homemaker? Debiasing Word Embeddings,” they argue that because word embeddings frequently underpin a range of other machine-learning systems, they “not only reflect such stereotypes but can also amplify them,” effectively bringing the bias of the original data set to new products and new data sets. So much for machines being neutral.
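The stereotyped analogies the paper's title refers to come from simple vector arithmetic on embeddings: to answer "a is to b as c is to ?", you take vec(b) − vec(a) + vec(c) and find the nearest word vector. The sketch below illustrates that mechanism with tiny invented 3-dimensional vectors (real embeddings have hundreds of dimensions and are learned from text); the second coordinate is deliberately constructed to correlate with gender, standing in for the kind of biased direction the researchers found in real trained embeddings.

```python
# Toy sketch of the analogy arithmetic used to surface bias in embeddings.
# The 3-d vectors below are invented for illustration, not real embeddings;
# the second coordinate is hand-crafted to correlate with gender.
import math

embeddings = {
    "man":        [0.9, 0.1, 0.3],
    "woman":      [0.9, 0.9, 0.3],
    "programmer": [0.2, 0.1, 0.9],
    "homemaker":  [0.2, 0.9, 0.9],
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def analogy(a, b, c):
    """Answer 'a is to b as c is to ?' by finding the vocabulary word
    whose vector is closest to vec(b) - vec(a) + vec(c)."""
    target = [vb - va + vc for va, vb, vc in
              zip(embeddings[a], embeddings[b], embeddings[c])]
    candidates = [w for w in embeddings if w not in (a, b, c)]
    return max(candidates, key=lambda w: cosine(embeddings[w], target))

print(analogy("man", "programmer", "woman"))  # → homemaker
```

Because the "gender" coordinate is baked into the toy vectors, the arithmetic faithfully reproduces the stereotyped completion; with real embeddings trained on web text, the same computation produced the paper's title analogy.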
Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech