Once the Amazon engineers discovered the bias in their resume-screening tool, they tried to remove it by deleting the female-associated terms from the algorithm's vocabulary. Their job was made even harder by the fact that the algorithm had also learned to favor words that appear most often on male resumes, words like executed and captured. The algorithm turned out to be great at telling male resumes from female ones but otherwise terrible at recommending candidates, returning results essentially at random.
You Look Like a Thing and I Love You: How Artificial Intelligence Works and Why It's Making the World a Weirder Place