In medicine, between-person noise, or interrater reliability, is usually measured by the kappa statistic. The higher the kappa, the less noise. A kappa value of 1 reflects perfect agreement; a value of 0 reflects exactly as much agreement as you would expect between monkeys throwing darts onto a list of possible diagnoses.
Noise: A Flaw in Human Judgment
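The kappa statistic mentioned in the quote (Cohen's kappa for two raters) can be sketched in a few lines. This is a minimal illustration, not from the book: the function and the example diagnosis labels are hypothetical, and it assumes each rater supplies one label per case.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters, corrected for chance."""
    n = len(rater_a)
    # Observed agreement: fraction of cases where the raters match.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: expected matches given each rater's label frequencies.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum(freq_a[label] * freq_b[label] for label in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Perfect agreement -> kappa = 1.
print(cohens_kappa(["flu", "cold", "flu"], ["flu", "cold", "flu"]))  # 1.0
# Agreement no better than chance -> kappa = 0 (the dart-throwing monkeys).
print(cohens_kappa(["flu", "cold"], ["flu", "flu"]))  # 0.0
```

A kappa of 1 thus means the raters always agree, while 0 means their agreement rate is exactly what their individual label frequencies would produce by chance.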