Even when we do have representation, a training set can still be biased and produce invalid results. A 2016 study found that more than 117 million American adults are in a law enforcement facial recognition database.
Robert
Here's a better example, I think: "Algorithms Should’ve Made Courts More Fair. What Went Wrong?" https://www.wired.com/story/algorithms-shouldve-made-courts-more-fair-what-went-wrong/: "A 2011 Kentucky law requires judges to consult an algorithm when deciding whether defendants must post cash bail. More whites were allowed to go home, but not blacks."