AttackGirl

profiles, while expanding the risk for errors. Recently, Google processed images of a trio of happy young African Americans and its automatic photo-tagging service labeled them as gorillas. The company apologized profusely, but in systems like Google’s, errors are inevitable. It was most likely faulty machine learning (and probably not a racist running loose in the Googleplex) that led the computer to confuse Homo
AttackGirl
The AI database is looking at the bigger picture of everything, versus just one picture. Now white women with black males... is the purpose social manipulation or AI adjustments? What about white males?
Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy