Noise: A Flaw in Human Judgment
Read between July 31 - August 28, 2024
71%: might encourage opportunistic behavior,
71%: might be necessary to prevent wrongdoing.
71%: A system might tolerate noise as a way of producing extra deterrence.
71%: people do not want to be treated as if they are mere things,
71%: With these points in mind, our general conclusion is that even when the objections are given their due, noise reduction remains a worthy and even an urgent goal.
71%: too expensive.
71%: There is a legitimate concern here, but it is easily overstated, and it is often just an excuse.
71%: perhaps the educator could make sure to read each essay at the same time of day, so as to reduce occasion noise.
71%: We could easily imagine a limit on how much to invest in noise reduction.
71%: In the case of divergent medical diagnoses, efforts to reduce noise have particular appeal; they might save lives.
71%: an institution might think that elaborate corrective steps are not worth the effort. Sometimes that conclusion is shortsighted, self-serving, and wrong, even catastrophically so.
71%: the belief that it is too expensive to reduce noise is not always wrong.
71%: some noise-reduction efforts might themselves produce unacceptably high levels of error.
71%: some efforts at noise reduction might even increase bias.
72%: false positives are a directional error—a bias.
72%: but some cures are worse than the disease.
72%: three common objections to reform efforts.
72%: aggravate the very problem they are inte...
72%: might be ...
72%: put other important values ...
72%: Quoting Václav Havel, they insisted, “We have to abandon the arrogant belief that the world is merely a puzzle to be solved, a machine with instructions for use waiting to be discovered, a body of information to be fed into a computer in the hope that, sooner or later, it will spit out a universal solution.”
72%: Eliminating noise is the central point of the three-strikes legislation.
72%: the price of this success is too high.
72%: the price of that noise-reduction strategy is too high.
72%: Supreme Court, a serious constitutional shortcoming of the mandatory death sentence is that it “treats all persons convicted of a designated offense not as uniquely individual human beings, but as members of a faceless, undifferentiated mass to be subjected to the blind infliction of the penalty of death.”
72%: all these people might make mistakes if they apply overly rigid, noise-reducing rules.
72%: noise-free scoring system that fails to take significant variables into account might be worse than reliance on (noisy) individual judgments.
72%: Some noise-reduction strategies ensure too many mistakes.
72%: help reduce noise without creating intolerably high costs (or bias).
72%: introduce other forms of decision hygiene,
72%: growing objections to “algorithmic bias.”
72%: much of this book might be taken as an argument for greater reliance on algorithms, simply because they are noiseless.
72%: In Weapons of Math Destruction, mathematician Cathy O’Neil urges that reliance on big data and decision by algorithm can embed prejudice, increase inequality, and threaten democracy itself.
72%: these and other cases, algorithms could eliminate unwanted variability in judgment but also embed unacceptable bias.
73%: algorithm could discriminate and, in that sense, turn out to be biased, even when it does not overtly use race and gender as predictors.
73%: predictors that are highly correlated with race or gender.
73%: discrimination could also come from the source data.
73%: bias in the training data, it is quite possible to design, intentionally or unintentionally, an algorithm that encodes discrimination.
73%: algorithms could be worse: since they eliminate noise, they could be more reliably biased than human judges.
73%: Exactly how to test for disparate impact, and how to decide what constitutes discrimination, bias, or fairness for an algorithm, are surprisingly complex topics, well beyond the scope of this book.
73%: same kind of scrutiny; people sometimes discriminate unconsciously and in ways that outside observers, including the legal system, cannot easily see.
73%: criteria that matter: accuracy and noise reduction, and nondiscrimination and fairness.
73%: of criteria we select. (Note that we said can and not will.)
73%: producing less racial discrimination than human beings do.
73%: predictive algorithm in an uncertain world is unlikely to be perfect, it can be far less imperfect than noisy and often-biased human judgment.
73%: broader conclusions are simple and extend well beyond the topic of algorithms.
73%: In that case we have a serious problem, but the solution is not to abandon noise-reduction efforts; it is to come up with better ones.
73%: should ask is, can we design algorithms that are both noise-free and less biased?”
73%: And if one effort to reduce noise is too crude—if we end up with guidelines or rules that