According to a 2016 investigation by ProPublica, COMPAS's results are typical: only about six out of ten defendants whom COMPAS predicts will commit a future crime actually go on to do so. That figure is roughly the same across races. But the way the software is wrong is telling: Black defendants are twice as likely to be falsely flagged as high-risk, while white defendants are nearly twice as likely to be falsely flagged as low-risk.
Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech