Kindle Notes & Highlights
Read between July 9 and July 22, 2019
We will try to convince you that because of recent technological changes, companies need to rethink the balance between minds and machines, between products and platforms, and between the core and the crowd.
it’s exactly because incumbents are so proficient, knowledgeable, and caught up in the status quo that they are unable to see what’s coming, and the unrealized potential and likely evolution of the new technology. This phenomenon has been described as the “curse of knowledge” and “status quo bias,” and it can affect even successful and well-managed companies. Existing processes, customers and suppliers, pools of expertise, and more general mind-sets can all blind incumbents to things that should be obvious, such as the possibilities of new technologies that depart greatly from the status quo.
“Most of the checking, reconciling, waiting, monitoring, tracking—the unproductive work . . .—is eliminated by reengineering. . . . People working in a reengineered process are, of necessity, empowered. As process team workers they are both permitted and required to think, interact, use judgment, and make decisions.”
Erik worked with Lynn Wu, now a professor at the Wharton School, to develop a simple model that predicted housing sales and prices. They used data from Google Trends, which reports how often words like “real estate agent,” “home loan,” “home prices,” and the like were searched for each month in each of the fifty US states. They used their model to predict future housing sales and compared their forecasts to the published predictions made by experts at the National Association of Realtors. When the results came in, their model beat the experts by a whopping 23.6%, reflecting the power of
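The exact specification of the Wu/Brynjolfsson model isn't given here; as a rough sketch of the general idea — regressing monthly housing sales on the frequency of housing-related searches — something like the following would do (the Trends values, sales figures, and search terms below are invented for illustration, not data from the study):

```python
# Illustrative sketch only: a simple linear model relating monthly Google
# Trends search volumes to housing sales, in the spirit of the approach
# described above. All numbers are made up.
import numpy as np

# Rows = months for one state; columns = search intensity for
# "real estate agent", "home loan", "home prices" (0-100 Trends scale).
search_trends = np.array([
    [62, 55, 48],
    [70, 61, 52],
    [58, 50, 45],
    [75, 66, 57],
    [68, 59, 51],
])
home_sales = np.array([4100, 4700, 3900, 5100, 4500])  # hypothetical units sold

# Ordinary least squares with an intercept term.
X = np.column_stack([np.ones(len(search_trends)), search_trends])
coef, *_ = np.linalg.lstsq(X, home_sales, rcond=None)

# Forecast next month's sales from the latest search data.
next_month = np.array([1, 72, 63, 55])
print("forecast:", next_month @ coef)
```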
A much bigger blow to the notion of human superiority in judgment came from the finding that in 46% of the studies considered, the human experts actually performed significantly worse than the numbers and formulas alone. This means that people were clearly superior in only 6% of cases. And the authors concluded that in almost all of the studies where humans did better, “the clinicians received more data than the mechanical prediction.” As Paul Meehl, the legendary psychologist who began in the early 1950s to document and describe the poor track record of human expert judgment, summarized,
Buster Benson, a product manager at the software company Slack, came up with what we think is a great way to group these biases and keep in mind the problems they pose for us:#
1. Information overload sucks, so we aggressively filter. . . . [But] some of the information we filter out is actually useful and important.
2. Lack of meaning is confusing, so we fill in the gaps. . . . [But] our search for meaning can conjure illusions. We sometimes imagine details that were filled in by our assumptions, and construct meaning and stories that aren’t really there.**
3. [We] need to act fast lest we
In 2006, Avinash Kaushik and Ronny Kohavi, two data analysis professionals who were then working at Intuit and Microsoft, respectively, came up with the acronym HiPPO to summarize the dominant decision-making style at most companies. It stands for “highest-paid person’s opinion.” We love this shorthand and use it a lot, because it vividly illustrates the standard partnership. Even when the decisions are not made by the highest-paid people, they’re often—too often—based on opinions, judgments, intuition, gut, and System 1. The evidence is clear that this approach frequently doesn’t work well,
But, like most other advertising buyers, the Obama team knew that relying on demographics was terribly imprecise. They might be showing their ads mainly to hard-core Romney supporters. Or they might be showing them primarily to people who had already made up their minds to vote for Obama, which could also be wasteful. Relying on demographics meant relying on judgments and estimates so coarse that they were really little more than guesses: that eighteen- to twenty-four-year-old men were particularly up for grabs as a group during the election, or that viewers of Family Guy, or perhaps of
Most of us have a lot of faith in human intuition, judgment, and decision-making ability, especially our own (we’ve discussed this topic with a lot of audiences and have almost never heard anyone admit to having below-average intuition or judgment). But the evidence on this subject is so clear as to be overwhelming: data-driven, System 2 decisions are better than those that arise out of our brains’ blend of System 1 and System 2 in the majority of cases where both options exist. It’s not that our decisions and judgment are worthless; it’s that they can be improved on.
For us, the answer to these questions is no. Good decisions are critical to well-functioning societies: they help the right resources, from rides to jobs to health care, get to the right people in the right place at the right time. The standard partnership advocated by Hammer and Champy, in which computers do the record keeping while HiPPOs exercise their judgment and make decisions, is often not the best way to accomplish this. At this point
In a test involving more than 82,000 forecasts, Tetlock found that “humanity barely bests [a] chimp” throwing darts at the possible outcomes.
Of course, not all of these predictions are wrong. Tetlock found that some people, whom he calls “superforecasters,” really are able to consistently generate forecasts more accurately than chance would predict. They tend to take in information from many sources and, perhaps more important, show an ability to adopt multiple viewpoints when looking at a situation. Less accurate forecasters, meanwhile, tend to have one fixed perspective that they always use in their analyses (both ardent conservatives and diehard liberals, for example, tend to make lousy political predictions). Tetlock calls the
The existence of superforecasters aside, our most fundamental advice about predictions is to rely on them less. Our world is increasingly complex, often chaotic, and always fast-flowing. This makes forecasting something between tremendously difficult and actually impossible, with a strong shift toward the latter as timescales get longer.
In other cases, subjective human judgments should still be used, but in an inversion of the standard partnership: the judgments should be quantified and included in quantitative analyses. Decision-making processes should not be set up so that decision makers feel good about themselves. They should be set up to produce the best decisions, based on the right goals and clear metrics.
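One way to picture this inversion of the standard partnership: the expert’s call becomes one quantified input that the analysis weighs against the data, rather than the final word. The sketch below is purely illustrative; the loan-scoring setting, feature names, and weights are assumptions for the example, not anything from the book.

```python
# Illustrative sketch: the human judgment becomes a numeric feature that a
# simple model weighs against objective data, instead of overriding it.
# The features, record, and weights are hypothetical.
from dataclasses import dataclass

@dataclass
class LoanApplication:
    income_to_debt: float      # objective data
    years_at_job: float        # objective data
    officer_score: float       # loan officer's judgment, quantified 0-10

def approval_score(app: LoanApplication) -> float:
    # Weights would normally be fit on historical outcomes; these are made up.
    return (0.5 * app.income_to_debt
            + 0.2 * app.years_at_job
            + 0.3 * (app.officer_score / 10))

app = LoanApplication(income_to_debt=2.4, years_at_job=6, officer_score=7)
print(round(approval_score(app), 2))
```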
As technology has spread, so have opportunities to move past the standard partnership and its overreliance on human HiPPOs, and to move toward more data-driven decision making. The data show that companies that do this usually have an important advantage over those that do not. People who can look at an issue from multiple perspectives and companies that can iterate and experiment effectively are better performers.
Young children don’t need explicit lessons on rules in order to learn to speak well.‡ Most adults can’t learn without them. There’s some overlap in the two approaches, of course—many kids eventually take language classes, and adults pick up some things by ear—but they are starkly different. The brains of young children are specialized for language learning: they operate on statistical principles to discern the patterns in language§ (for example, When Mom talks about herself as the subject, she uses the word “I” and puts it at the start of the sentence. When she’s the object, she uses “me” and
“As of 2014, few commercial systems make any significant use of automated commonsense reasoning . . . nobody has yet come close to producing a satisfactory commonsense reasoner.”
“In doing commonsense reasoning, people . . . are drawing on . . . reasoning processes largely unavailable to introspection.” In other words, the cognitive work that we humans do to navigate so easily through so many thickets of rules is an ongoing demonstration of Polanyi’s Paradox, the strange phenomenon that we know more than we can tell.
The system uses this set of matched pairs to build up the associations within its neural network that enable it to transcribe new instances of recorded speech. Because both supervised and unsupervised machine learning approaches use the algorithms described by Hinton and his colleagues in their 2006 paper, they’re now commonly called “deep learning” systems.
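As a toy illustration of learning from matched pairs — standing in for the audio-and-transcript pairs described above, but with made-up two-dimensional inputs and a far simpler model than a deep network:

```python
# Illustrative sketch of supervised learning from matched (input, label) pairs.
# Real speech systems train deep neural networks on audio features; here a
# tiny logistic-regression model on synthetic data stands in for the idea.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))                 # the "recorded speech" inputs
y = (X[:, 0] + X[:, 1] > 0).astype(float)     # the "transcript": a known label per example

w, b = np.zeros(2), 0.0
for _ in range(500):                          # gradient descent on cross-entropy loss
    p = 1 / (1 + np.exp(-(X @ w + b)))
    grad_w = X.T @ (p - y) / len(y)
    grad_b = np.mean(p - y)
    w -= 0.5 * grad_w
    b -= 0.5 * grad_b

# The trained model now labels new, unseen inputs on its own.
x_new = np.array([0.8, -0.1])
print("P(label=1) =", 1 / (1 + np.exp(-(x_new @ w + b))))
```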
Jeff Dean,** who heads Google’s efforts to use the technology, notes that as recently as 2012 the company was not using it at all to improve products like Search, Gmail, YouTube, or Maps. By the third quarter of 2015, however, deep learning was being used in approximately 1,200 projects across the company, having surpassed the performance of other methods.
Yann LeCun has memorably highlighted the vast, largely untapped importance of unsupervised learning with a cake metaphor. He says, “If intelligence was a cake, unsupervised learning would be the cake, supervised learning would be the icing on the cake, and reinforcement learning would be the cherry on the cake. We know how to make the icing and the cherry, but we don’t know how to make the cake.” He thinks that developing better algorithms for unsupervised learning will be essential if we are ever to achieve AGI.
After enough technical progress, enough experimentation, and enough iteration, we believe that automated and digitally mediated processes will become quite widespread and will take the place of many that are now mediated by people. We believe, in short, that virtualization is a secular trend, where “secular” is used in the way the finance industry uses it: to denote a long-term development that will unfold over several years, rather than a short-term fluctuation.
Our conversations and investigations point to recent major developments in five parallel, interdependent, and overlapping areas: data, algorithms, networks, the cloud, and exponentially improving hardware. We remember them by using the acronym “DANCE.”
We see this pattern—machines assuming the dull, dirty, dangerous, or dear tasks—over and over at present:
In 2015, Rio Tinto became the first company to utilize a fleet of fully remote-controlled trucks to move all the iron ore at its mine in Western Australia’s Pilbara region. The driverless vehicles run twenty-four hours a day, 365 days a year and are supervised from a control center a thousand miles away. The savings from breaks, absenteeism, and shift changes enable the robotic fleet to be 12% more efficient than the human-driven one. Automated milking systems milk about one-quarter of the cows in leading dairy countries such as Denmark and the Netherlands today. Within ten years, this
Drivers of the robotic Cambrian Explosion include data, algorithms, networks, the cloud, and exponential improvements in hardware: DANCE. Robots and their kin will be increasingly used wherever work is dull, dirty, dangerous, and dear.
People are still more agile and dexterous than even the most advanced robots, and they probably will be for some time to come. These abilities, combined with our senses and problem-solving skills, mean that we’ll be working side by side with robots in many settings.
We know how he feels. The debate over whether computers are, or ever can be, truly creative might be of interest to some people, but we are much more excited by questions about how to maximize the total amount of creativity in the world. The right way to do this, we believe, is to push ahead on two fronts: continuing to work on making computers that can come up with good new ideas, and figuring out the best ways to bring them together with human creators. The best solutions here will come from minds and machines working together. Far too
But designers and other creative professionals today spend too much of their time doing mind-numbingly dull things. As former Autodesk CEO Carl Bass explained to us, Using [computer-aided design] tools, it’s like eleventh-grade geometry. You’re sitting there, you draw a line, you find the midpoint, you do this, you draw another thing, you extrude it, you put a fillet¶ on it. What’s interesting about it is, you do it prematurely to actually knowing whether your thing solves the problem. You can work on all these details for weeks and only then find out that the mechanism you’re building is
Computers are getting good at tasks like determining people’s emotional states by observing their facial expressions and vocal patterns, but this is a long, long way from doing the things we just listed. We’re confident that the ability to work effectively with people’s emotional states and social drives will remain a deeply human skill for some time to come. This implies a novel way to combine minds and machines as we move deeper into the second machine age: let the computers take the lead on making decisions (or judgments, predictions, diagnoses, and so on), then let people take the lead if
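A minimal sketch of that division of labor, with the machine deciding first and a person stepping in on low-confidence cases; the `diagnose` stub and the 0.7 threshold are assumptions for illustration only, not anything the authors specify:

```python
# Illustrative sketch of the "machines decide first, people step in" pattern.
def diagnose(symptoms: list[str]) -> tuple[str, float]:
    # Stand-in for a trained model; returns (diagnosis, confidence).
    return ("condition_x", 0.74 if "fever" in symptoms else 0.35)

def handle_case(symptoms: list[str]) -> str:
    diagnosis, confidence = diagnose(symptoms)
    if confidence >= 0.7:
        # The machine's call stands; a person communicates and coaches.
        return f"machine: {diagnosis} (clinician explains and coaches)"
    # Low confidence: a person takes the lead and may override.
    return "escalated to clinician for judgment"

print(handle_case(["fever", "cough"]))
print(handle_case(["fatigue"]))
```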
the medical office of the future might employ an artificial intelligence, a person, and a dog. The AI’s job will be to diagnose the patient, the person’s job will be to understand and communicate the diagnosis, and to coach the patient through treatment, and the dog’s job will be to bite the person if the person tries to second-guess the artificial intelligence.
Digital technologies do a poor job of satisfying most of our social drives. So, work that taps into these drives will likely continue to be done by people for some time to come. Such work includes tasks that require empathy, leadership, teamwork, and coaching.
As technology advances, high-level social skills could become even more valuable than advanced quantitative ones. And the ability to combine social with quantitative skills will often have the highest payoff of all.
Mobile telephones were an expensive novelty in the United States; in 1995 they cost roughly $1,000 and only 13% of people had bought one. The great majority of American households had a landline phone (though the term did not yet exist) connected to the national network by copper wires. The
By 2013, total US newspaper print ad revenue had declined by 70% over the previous decade, and online ads had contributed back only $3.4 billion of the $40 billion lost in annual sales. A saying emerged in the newspaper industry that “print dollars were being replaced by digital dimes.” From 2007 to 2011, 13,400 newspaper newsroom jobs were cut in the United States. Help-wanted classified ad revenue dropped by more than 90% in the decade after 2000, from $8.7 billion to $723 million. Newspaper companies including the Tucson Citizen (Arizona’s oldest continuously published daily newspaper) and
Unbundling is not the end of the story. As Jim Barksdale, former CEO of Internet browser company Netscape, observed, “There’s only two ways I know of to make money: bundling and unbundling.” As it turns out, both halves of this advice apply to music. Those same rights holders who unbundled music
The curious task of economics is to demonstrate to men how little they really know about what they imagine they can design. — Friedrich von Hayek, 1988
The platform revolution is not nearly finished, and its impact will be profound. Recent examples like Stripe, ClassPass, Postmates, and Transfix are the vanguard of a large trend: the diffusion of platforms, especially those leveraging the world’s rapidly improving digital infrastructure. This diffusion into other industries will continue because of the significant advantages that platforms have over their competitors, and because of the many advantages they bring to their participants.
“High reputation beats high similarity,” as Gebbia puts it; his company has found that its platform’s user interface and user experience “can actually help us overcome one of our most deeply rooted biases—the ‘stranger-danger’ bias.”
The crowd is not unstructured, however. Its structure is emergent, appearing over time as a result of the interactions of members. Stock markets, prediction markets, and modern search engines extract valuable information from this emergent structure.
Overcentralization fails because of Hayek’s insights and Polanyi’s Paradox: people can’t always articulate what they have, what they know, what they want, and what they can do. Large crowds can be brought together to build highly useful products like Linux. Such efforts require “geeky leadership” that follows principles of openness, noncredentialism, self-selection, verifiability, and clarity about goals and outcomes.
such as the well-documented biases toward overconfidence and confirmation (that is, only really considering information that supports what you were already thinking)—become stronger and thus lead to worse outcomes.
something. The chances of finding a good fit increase with the number of people who see the request, which explains why platforms for task matching have become so popular. These include 99designs and Behance for graphic design and other creative work, Upwork for information technology and customer service tasks, Care.com for personal services, and TaskRabbit for a wide variety of odd jobs, like officiating at a wedding, delivering ice cream cake to someone’s grandfather, or waiting in line at the Apple Store ahead of a new iPhone release.
One reason for the success of the crowd is that the core is often mismatched for the problems it’s most interested in.
The freedom of all is essential to my freedom. — Mikhail Bakunin,
In June of 2016 the Republic of Georgia announced a project in conjunction with economist Hernando de Soto to design and pilot a blockchain-based system for land title registry in the country. It is expected that moving elements of the process onto the blockchain can reduce costs for homeowners and other users, while also reducing possibilities for corruption (since the land records, like everything else on the blockchain, will be unalterable).
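The unalterability comes from chaining each record to a hash of the one before it, so that editing an old entry breaks every later link. The toy sketch below shows only that chaining idea; it is not Georgia’s system, and it omits consensus, signatures, and everything else a real registry would need:

```python
# Toy hash-chain sketch of why blockchain records are tamper-evident.
import hashlib, json

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

chain = []
prev = "0" * 64
for record in [{"parcel": 101, "owner": "A"}, {"parcel": 101, "owner": "B"}]:
    block = {"record": record, "prev_hash": prev}
    chain.append(block)
    prev = block_hash(block)

def is_valid(chain: list) -> bool:
    prev = "0" * 64
    for block in chain:
        if block["prev_hash"] != prev:
            return False
        prev = block_hash(block)
    return True

print(is_valid(chain))             # True
chain[0]["record"]["owner"] = "C"  # altering an old record...
print(is_valid(chain))             # ...invalidates the links that follow: False
```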
Bitcoin shows the potential of completely decentralized communities. By combining math (cryptography), economics, code, and networks, they can create something as fundamental and critical as money.
The blockchain might well be more important than Bitcoin. Its open, transparent, global, flexible, and immutable ledger is clearly valuable, especially if it’s combined with smart contracts and other digital innovations. The most remarkable thing about Bitcoin and the blockchain might be how they enable a global crowd of people and organizations, all acting in their own interest, to create something of immense shared value. Bitcoin and the blockchain have sparked a wave of innovation and entrepreneurship, and it’s not at all clear now what roles they’ll eventually play in economies and
But things have not worked out that way. According to the Bureau of Labor Statistics, managers represented approximately 12.3% of the US workforce in 1998, but by 2015 this figure had increased to 15.4%. And there’s strong evidence that a lot of other jobs have become substantially more management-like over time.

