Kindle Notes & Highlights
Read between December 17, 2016 - September 4, 2017
the Oakland front office discovered what amounted to new baseball knowledge.
They found value in players who had been discarded or overlooked, and folly in much of what passed for baseball wisdom.
a lot of people have taken the Oakland A’s as their role model and set out to use better data, and better analysis of that data, to find market inefficiencies.
But the enthusiasm for replacing old-school expertise with new-school data analysis was often shallow.
this hunger for an expert who knows things with certainty, even when certainty is not possible—has a talent for hanging around.
The ways in which some baseball expert might misjudge baseball players—the ways in which any expert’s judgments might be warped by the expert’s own mind—had been described, years ago, by a pair of Israeli psychologists, Daniel Kahneman and Amos Tversky.
His job was to replace one form of decision making, which relied upon the intuition of basketball experts, with another, which relied mainly on the analysis of data.
had no way of knowing that people with a gift for using numbers to predict things would overrun professional sports management and everyplace else high-stakes decisions were being made—or
up. He simply suspected that the established experts maybe didn’t know as much as everyone thought they did.
Maybe the experts don’t know what they’re talking about.”
From his stint as a consultant he learned something valuable, however. It seemed to him that a big part of a consultant’s job was to feign total certainty about uncertain things.
with McKinsey, they told him that he was not certain enough in his opinions. “And I said it was because I wasn’t certain.
A lot of what people did and said when they “predicted” things, Morey now realized, was phony: pretending to know things rather than actually knowing things.
A perfect answer didn’t exist, but statistics could get you to some answer that was at least a bit better than simply guessing.
The more the players got paid, the more costly to him the sloppy decisions. He thought that Morey’s analytical approach might give him an edge in the market for high-priced talent, and he was sufficiently indifferent to public opinion to give it a whirl.
The model was also a tool for the acquisition of basketball knowledge. “Knowledge is literally prediction,” said Morey. “Knowledge is anything that increases your ability to predict the outcome.
A model allowed you to explore the attributes in an amateur basketball player that led to professional success, and determine how much weight should be given to each. Once you had a database of thousands of former players, you could search for more general correlations between their performance in college and their professional careers.
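As a rough illustration of the exercise the highlight describes (not Morey's actual model or data), here is a minimal sketch: the attributes, outcomes, and weights below are invented, and the code simply correlates hypothetical college stats with a hypothetical measure of professional success, then fits weights to say how much each attribute should count.

```python
# Hypothetical sketch: the stats, outcomes, and weights are invented for
# illustration; this is not Morey's actual model or database.
import numpy as np

# Each row is a former player: [college scoring rate, rebound rate, steal rate]
college_stats = np.array([
    [24.1, 9.5, 1.2],
    [18.3, 4.2, 2.1],
    [15.7, 11.0, 0.8],
    [21.5, 6.3, 1.5],
])
# Observed professional success for the same players (e.g., career win shares).
pro_outcome = np.array([38.0, 12.5, 29.0, 22.0])

# Correlate each college attribute with the professional outcome...
for i, name in enumerate(["scoring", "rebounding", "steals"]):
    r = np.corrcoef(college_stats[:, i], pro_outcome)[0, 1]
    print(f"{name}: correlation with pro success = {r:.2f}")

# ...and fit least-squares weights: how much each attribute should count.
X = np.column_stack([college_stats, np.ones(len(pro_outcome))])
weights, *_ = np.linalg.lstsq(X, pro_outcome, rcond=None)
print("attribute weights:", weights[:-1])
```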
“Just having the model, without any human opinion at all, forces you to ask the right questions,”
He didn’t think of his model as “the right answer” so much as “a better answer.”
The model obviously needed to be checked and watched—mainly because there was information that the model wouldn’t be privy to.
no one else was using a model to judge basketball players—no one had bothered to acquire the information needed by any model.
Any theory about basketball players had to be tested on a database of players.
There was much information Morey’s model needed that simply was not available. The Rockets began to gather their own original data by measuring things on a basketball court that had previously gone unmeasured.
“What I missed were the limitations of the model.”
there was a limit to the usefulness of even the objective, measurable information.
You needed experts. The limits of any model invited human judgment back into the decision-making process—whether it helped or not.
The trick wasn’t just to build a better model. It was to listen both to it and to the scouts at the same time.
“You have to figure out what the model is good and bad at, and what humans are good and bad at,”
Humans sometimes had access to information that the model did...
Morey leaned on his staff to pay attention to the workouts but not allow whatever they saw to replace what they knew to be true. Still, a lot of people found it very hard to ignore the evidence of their own eyes.
A scout would settle on an opinion about a player and then arrange the evidence to support that opinion.
Maybe the mind’s best trick of all was to lead its owner to a feeling of certainty about inherently uncertain things.
when they were judging other people, saw what they expected to see and were slow to see what they hadn’t seen before.
“The owners often made their money from disrupting fields where most of the conventional wisdom is bullshit,”
These people tended to be keenly aware of the value of even slight informational advantages, and open to the idea of using data to gain those advantages.
It was strange that when people bothered to measure what happened on the court, they had measured the wrong things so happily for so long.
There was a new awareness of the sorts of systematic errors people might make—and so entire markets might make—if their judgments were left unchecked.
he thought of science as a conversation.
interesting things happened to people who could weave them into interesting stories.
“All your economic models are premised on people being smart and rational, and yet all the people you know are idiots.”
If you went to a doctor in the seventeenth century, you were worse off for having gone. By the end of the nineteenth century, going to the doctor was a break-even proposition: You were as likely to come away from the visit better off as you were to be worse off.
Edwards went on to lay out a problem: Economic theory, the design of markets, public policy making, and a lot more depended on theories about how people made decisions.
“Is this behavior irrational?” he wrote. “We tend to doubt it. . . . When faced with complex multidimensional alternatives, such as job offers, gambles or [political] candidates, it is extremely difficult to utilize properly all the available information.”
When people make decisions, they are also making judgments about similarity, between some object in the real world and what they ideally want. They make these judgments by, in effect, counting up the features they notice. And as the noticeability of features can be manipulated by the way they are highlighted, the sense of how similar two things are might also be manipulated.
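A minimal sketch of the feature-counting idea in the passage above, using invented job-offer features and equal weights on shared and distinctive features (a simplified version of Tversky's contrast model); changing which features the framing makes noticeable changes the judged similarity.

```python
# Hypothetical sketch: features and weights are invented for illustration.
def similarity(features_a, features_b, noticed):
    """Shared features count for similarity, distinctive features count
    against it, restricted to whatever features the framing highlights."""
    a = features_a & noticed
    b = features_b & noticed
    return len(a & b) - len(a - b) - len(b - a)

offer_a = {"high salary", "big city", "long commute", "good team"}
offer_b = {"high salary", "big city", "remote work", "startup"}

# Highlight different features and the judged similarity shifts.
print(similarity(offer_a, offer_b, noticed={"high salary", "big city"}))  # 2
print(similarity(offer_a, offer_b,
                 noticed={"long commute", "good team",
                          "remote work", "startup"}))                     # -4
```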
“the similarity of objects is modified by the manner in which they are classified. Thus, similarity has two faces: causal and derivative. It serves as a basis for the classification of objects, but is also influenced by the adopted classification.”
Things are grouped together for a reason, but, once they are grouped, their grouping causes them to seem more like each other than they otherwise would. That is, the mere act of classification reinforces stereotypes. If you want to weaken some stereotype, eliminate the classification.
“Amos’s approach to doing science wasn’t incremental,” said Rich Gonzalez. “It proceeded by leaps and bounds. You find a paradigm that is out there. You find a general proposition of that paradigm. And you destroy it. He saw himself doing a negative style of science.
That’s how Amos would begin: by undoing the mistakes of others. As it turned out, other people had made some other mistakes.