And that’s why you’ll get algorithms that learn that racial and gender discrimination are handy ways to imitate the humans in their datasets. They don’t know that imitating the bias is wrong. They just know that this is a pattern that helps them achieve their goal. It’s up to the programmer to supply the ethics and the common sense.
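The mechanism is easy to see in a toy sketch. Everything below is hypothetical (made-up data, made-up groups, a deliberately naive model): a pattern-matcher that simply imitates past human hiring decisions will reproduce any bias in those decisions, because the bias *is* a pattern that helps it match its training data.

```python
from collections import Counter, defaultdict

# Hypothetical historical decisions by biased human reviewers.
# Each record: (qualified, group, hired_by_human)
history = [
    (True,  "A", True),  (True,  "A", True),  (False, "A", False),
    (True,  "B", False), (True,  "B", False), (False, "B", False),
]

# "Training": for each (qualified, group) profile, memorize the majority
# human decision. No ethics, no common sense -- just pattern matching.
votes = defaultdict(Counter)
for qualified, group, hired in history:
    votes[(qualified, group)][hired] += 1

def predict(qualified, group):
    return votes[(qualified, group)].most_common(1)[0][0]

# Two equally qualified applicants from different groups:
print(predict(True, "A"))  # True  -- hired, like past group-A applicants
print(predict(True, "B"))  # False -- the learned "pattern" is the bias
```

The model never sees the word "discrimination"; it just finds that group membership predicts the human decisions it was told to imitate, which is exactly the failure mode described above.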