Faithful to the data on which it was trained, the model filtered out résumés indicating that a candidate was a woman, a by-product of prior human hiring decisions that had favored men. Amazon ultimately declined to adopt the system after its engineers proved unable to remove the gender bias. Choosing to stop is a viable, and at times necessary, option.