The data the COMPAS algorithm learned from is the result of hundreds of years of systematic racial bias in the US justice system. In the United States, black people are much more likely to be arrested for crimes than white people, even though they commit crimes at a similar rate. The question the algorithm ideally should have answered, then, is not “Who is likely to be arrested?” but “Who is most likely to commit a crime?” Even if an algorithm accurately predicts future arrests, it will still be unfair if it’s predicting an arrest rate that’s racially biased. How did it even manage to label black …
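The point about a biased proxy label can be illustrated with a toy simulation (all numbers here are hypothetical, chosen only to make the effect visible): two groups offend at the same rate, but one is arrested more often per offense. A model that predicts arrests perfectly still flags the two groups at very different rates.

```python
import random

random.seed(0)

def simulate(group_size, offense_rate, arrest_rate_given_offense):
    """Return (offense_count, arrest_count) for one simulated group."""
    offenses = arrests = 0
    for _ in range(group_size):
        offended = random.random() < offense_rate
        offenses += offended
        # Arrest is a biased proxy: same offending, different arrest odds.
        arrests += offended and (random.random() < arrest_rate_given_offense)
    return offenses, arrests

N = 100_000
# Both groups offend at the same (hypothetical) 30% rate, but group B
# is arrested twice as often per offense -- a biased training label.
off_a, arr_a = simulate(N, 0.30, 0.25)
off_b, arr_b = simulate(N, 0.30, 0.50)

# A "perfect" predictor of arrests flags exactly the arrested people,
# so it inherits the bias even at 100% accuracy on its own target.
print(f"offense rate  A: {off_a/N:.2f}  B: {off_b/N:.2f}")
print(f"flagged rate  A: {arr_a/N:.2f}  B: {arr_b/N:.2f}")
```

The model is never "wrong" about its target, yet it labels group B as higher-risk roughly twice as often, because the target itself encodes the bias.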
You Look Like a Thing and I Love You: How Artificial Intelligence Works and Why It's Making the World a Weirder Place