Kindle Notes & Highlights
by Tim Harford
Read between November 11 and November 12, 2018
Agricultural abundance creates rulers and ruled, masters and servants, and inequality of wealth unheard of in hunter-gatherer societies.
The moldboard plow cuts a long, thick ribbon of soil and turns it upside down.8 In dry ground, that’s a counterproductive exercise, squandering precious moisture. But in the fertile wet clays of Northern Europe, the moldboard plow was vastly superior, improving drainage and killing deep-rooted weeds, turning them from competition into compost.
But the wet-clay moldboard plow required a team of eight oxen—or, better, horses—and who had that sort of wealth? It was most efficient in long, thin strips often a step or two away from someone else’s long, thin strips. As a result, farming became more of a community practice: people had to share the plow and draft animals and resolve disagreements. They gathered together in villages. The moldboard plow helped usher in the manorial system in Northern Europe.9
With their diets of rice and grain, our ancestors were starved of vitamins, iron, and protein. As societies switched from foraging to agriculture ten thousand years ago, the average height for both men and women shrank by about six inches, and there’s ample evidence of parasites, disease, and childhood malnutrition. Jared Diamond, author of Guns, Germs, and Steel, called the adoption of agriculture “the worst mistake in the history of the human race.”
The Luddites weren’t worried about being replaced by machines; they were worried about being replaced by the cheaper, less skilled workers whom the machines would empower.3 So whenever a new technology emerges, we should ask: Who will win and who will lose out as a result? The answers are often surprising, as we’re about to see.
A French publisher, Édouard-Léon Scott de Martinville, had already developed something called the “phonoautograph,” a device intended to provide a visual record of the sound of a human voice, a little like the way a seismograph records an earthquake. But it does not seem to have occurred to Monsieur Scott de Martinville that one might try to convert the recording back into sound.5
Thomas Edison’s phonograph led the way toward a winner-take-all dynamic in the performing industry. The very best performers went from earning like Mrs. Billington to earning like Elton John. Meanwhile, the only-slightly-less-good went from making a comfortable living to struggling to pay their bills. Small gaps in quality became vast gaps in money, because nobody was interested in paying for a copy of the second-best when you could have a copy of the best.
In 2002, David Bowie warned his fellow musicians that they were facing a very different future. “Music itself is going to become like running water or electricity,” he said. “You’d better be prepared for doing a lot of touring because that’s really the only unique situation that’s going to be left.”10
Inequality remains alive and well—the top 1 percent of musical artists take more than five times more money from concerts than the bottom 95 percent put together.
But while modern minds naturally think of the telephone as transformative, barbed wire wreaked huge changes on the American West, and much more quickly.
So barbed wire spread because it solved one of the biggest problems the settlers faced. But it also sparked ferocious disagreements. And it’s not hard to see why. The homesteading farmers were trying to stake out their property—property that had once been the territory of various Native American tribes. And twenty-five years after the Homestead Act came the Dawes Act, which forcibly assigned land to Native American families and gave the rest to white farmers. The philosopher Olivier Razac comments that as well as freeing up land for settler cultivation, the Dawes Act “helped destroy the . . .”
This function of matching people who have coincidental wants is among the most powerful ways the Internet is reshaping the economy. Traditional markets work perfectly well for some goods and services, but they’re less useful when the goods and services are urgent or obscure.
Back then, eBay had only just started. Its very first sale: Mark Fraser bought a broken laser pointer.
Trust is an essential component of markets—it’s so essential that we often don’t even notice it, as a fish doesn’t notice water. In developed economies, enablers of trust are everywhere: brands, money-back guarantees, and of course repeat transactions with a seller who can be easily located. But the new sharing economy lacks those enablers. Why should we get into a stranger’s car—or buy a stranger’s laser pointer? In 1997, eBay introduced a feature that helped solve the problem: Seller Feedback.
Among the best ways to improve the usefulness of search results is to analyze which links were ultimately clicked by people who previously performed the same search, as well as what the user has searched for before.11 Google has far more of that data than anyone else. That suggests it may continue to shape our access to knowledge for generations to come.
Your access to a passport is, broadly speaking, determined by where you were born and the identity of your parents. (Although anybody with $250,000 can buy one from St. Kitts and Nevis.15)
Yet the passport is a tool designed to ensure that a certain kind of discrimination takes place: discrimination on the grounds of nationality.
Yet economic logic points in the opposite direction. In theory, whenever you allow factors of production to follow demand, output rises. In practice, all migration creates winners and losers, but research indicates that there are many more winners—in the wealthiest countries, by one estimate, five in six of the existing population are made better off by the arrival of immigrants.
Suppose a group of Mexicans arrive in the United States, ready to pick fruit for lower wages than Americans are earning. The benefits—slightly cheaper fruit for everyone—are too widely spread and small to notice, while the costs—some Americans lose their jobs—produce vocal unhappiness.
That suggests our world would now be much richer if passports had died out in the early twentieth century. There’s one simple reason they didn’t: World War I intervened. With security concerns trumping ease of travel, governments around the world imposed strict new controls on movement—and they proved unwilling to relinquish their powers once peace returned.
The world’s robot population is expanding quickly—as of 2016, sales of industrial robots grew about 13 percent a year, which means the robot “birth rate” is almost doubling every five years.
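The “almost doubling every five years” claim is just compound growth on the 13 percent annual figure from the passage; a quick check of the arithmetic:

```python
# Compound growth: annual robot sales growing about 13% a year.
growth_rate = 0.13
years = 5
factor = (1 + growth_rate) ** years
print(round(factor, 2))  # ~1.84, i.e. "almost doubling" every five years
```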
So perhaps, for a glimpse of the future, we should look not to Rosie the Robot but to another device now being used in warehouses—the Jennifer unit. It’s a headset that tells human workers what to do, down to the smallest detail; if you have to pick nineteen identical items from a shelf, it will tell you to pick five, then five, then five, then four . . . which leads to fewer errors than if you were told “Pick nineteen.”
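The Jennifer unit’s trick of breaking nineteen into five, five, five, and four is a simple batching idea; a minimal sketch of it (a hypothetical illustration, not the device’s actual software):

```python
def pick_batches(total, batch_size=5):
    """Split a pick count into small batches a worker can count at a glance,
    e.g. 19 -> [5, 5, 5, 4]."""
    batches = []
    remaining = total
    while remaining > 0:
        take = min(batch_size, remaining)
        batches.append(take)
        remaining -= take
    return batches

print(pick_batches(19))  # [5, 5, 5, 4]
```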
But the same basic idea links every welfare state: that the ultimate responsibility for ensuring that people don’t starve on the street should lie not with family, or charity, or private insurers, but with government.
Some evidence suggests it’s worth considering. From 1974 to 1979, the idea was tried in a small Canadian town, Dauphin, in Manitoba. For five years, thousands of Dauphin’s poorest residents got monthly checks funded jointly by the provincial and federal governments. And it turns out that guaranteeing people an income had interesting effects. Fewer teenagers dropped out of school. Fewer people were hospitalized with mental health problems. And hardly anyone gave up work.15 New trials are under way to see if the same thing happens elsewhere.
Formula has another, less obvious economic cost. There’s evidence that breastfed babies grow up with slightly higher IQs—about three points higher, when you control as well as possible for other factors. And higher IQs are linked to greater productivity and lifetime earnings. What might be the benefit of making a whole generation of kids just that little bit smarter? According to The Lancet, about $300 billion a year.18 That’s several times the value of the global formula market.
Those ads have always been controversial, not least because formula is arguably more addictive than tobacco or alcohol. When a mother stops breastfeeding, her milk dries up. There’s no going back.
In Utah, there’s a company called Ambrosia Labs. It pays mothers in Cambodia to express breast milk, screens it for quality, and sells it on to American mothers. It’s pricey now—more than $100 a liter.23 But that could come down with scale.
Women in the United States now spend about forty-five minutes a day in total on cooking and cleaning up; that is still much more than men, who spend just fifteen minutes a day. But it is a vast shift from Mary’s four hours a day.
In the 1960s, only a quarter of food spending was on food prepared and eaten outside the home;3 it’s been rising steadily since then, and in 2015 a landmark was reached: for the first time in their history, Americans spent more on food and drink consumed outside the home than on food and beverages purchased at grocery stores.4 In case you think Americans are unusual in that, the British passed that particular milestone more than a decade earlier.5
The data are clear that the washing machine didn’t save a lot of time, because before the washing machine we didn’t wash clothes very often. When it took all day to wash and dry a few shirts, people would use replaceable collars and cuffs or dark outer layers to hide the grime on their clothes.
The availability of ready meals has had some regrettable side effects. Obesity rates rose sharply in developed countries between the 1970s and the early twenty-first century, at much the same time as these culinary innovations were being developed and embraced. This is no coincidence, say health economists: the cost of eating a lot of calories has fallen dramatically, not just in financial terms but in terms of the cost of time.9
Between 1977 and 1995, American potato consumption increased by a third, and almost all of those extra potatoes were fried.
In the United States, calorie intake among adults rose by about 10 percent between the 1970s and the 1990s, but none of that was due to more calorific regular meals. It was all from snacking—and that usually means processed convenience food.
Over the centuries, lovers have tried all kinds of unappealing tricks to prevent pregnancy. There was crocodile dung in ancient Egypt, Aristotle’s recommendation of cedar oil, and Casanova’s method of using half a lemon as a cervical cap.1 But even the obvious modern alternative to the pill, the condom, has a failure rate. People don’t tend to use condoms exactly as they’re supposed to—condoms sometimes rip or slip—with the result that for every one hundred sexually active women using condoms for a year, eighteen will become pregnant.
But the failure rate of the pill is just 6 percent—three times better than condoms. That assumes typical, imperfect use; use it perfectly and the failure rate drops to one-twentieth of that.
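The comparison in the passage is simple arithmetic on the quoted rates:

```python
condom_failures_per_100 = 18   # typical-use failure rate, percent
pill_typical = 6               # typical-use failure rate, percent
pill_perfect = pill_typical / 20  # "one-twentieth of that"

print(condom_failures_per_100 / pill_typical)  # 3.0, i.e. three times better
print(pill_perfect)                            # 0.3 percent with perfect use
```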
In 1970, men earned more than 90 percent of the medical degrees awarded that year. Law and business degree classes were more than 95 percent male. Dentistry degree classes were 99 percent male. But at the beginning of the 1970s, equipped with the pill, women surged into all these courses. The proportion of women in these classes increased swiftly, and by 1980 they were often a third of the class. It was a huge change in a brief space of time.
A few years ago, the economist Amalia Miller used a variety of clever statistical methods to demonstrate that if a woman in her twenties was able to delay motherhood by one year, her lifetime earnings would rise by 10 percent: that was some measure of the vast advantage to a woman of completing her studies and securing her career before having children.
American women today can look across the Pacific Ocean for a vision of an alternative reality. In Japan, one of the world’s most technologically advanced societies, the pill wasn’t approved for use until 1999. Japanese women had to wait thirty-nine years longer than American women for the same contraceptive.
One of them decided that Spacewar deserved a breathtaking backdrop and programmed what he called the “Expensive Planetarium” subroutine. It featured a realistic starscape, stars displayed with five different brightnesses, as viewed from Earth’s equator. The author of the glorious addition: Peter Samson, the young student whose imagination was so captured by Spacewar that he misperceived the night sky above Lowell, Massachusetts.4
One contemporary commentator warned that the retailer “could no longer sell what his own judgement dictated. He must sell what the consumer wanted.”1 That commentator was Charles Coolidge Parlin. He’s widely recognized as the world’s first professional market researcher—and, [. . .] of market research. A century later, the market research profession is huge: in the United States alone, it employs about half a million people.2
Approaches to market research became more scientific; in the 1930s, George Gallup pioneered opinion polls; the first focus group was conducted in 1941 by an academic sociologist, Robert K. Merton. He later wished he could have patented the idea and collected royalties.8
Willis Carrier was earning $10 a week—below minimum wage in today’s money. But he figured out a solution: circulating air over coils chilled by compressed ammonia kept the humidity at a constant 55 percent.
By 1906 he was already talking up the potential for “comfort” applications in theaters and other public buildings.5
Air-conditioning has transformed architecture. Historically, a cool building in a hot climate implied thick walls, high ceilings, balconies, courtyards, and windows facing away from the sun. The so-called dogtrot house, once popular in America’s South, was two sets of rooms bisected by a covered, open-ended corridor that allowed breezes through.
The proportion of air-conditioned homes in Chinese cities jumped from under one-tenth to more than two-thirds in just ten years.11 In countries such as India, Brazil, and Indonesia, the market for air conditioners is expanding at double-digit rates.
Yale University’s William Nordhaus divided the world into cells, by lines of latitude and longitude, and plotted each one’s climate, output, and population. He concluded that the hotter the average temperature, the less productive people could be.18 According to Geoffrey Heal of Columbia University and Jisung Park of Harvard, a hotter-than-average year is bad for productivity in hot countries, but good in cold ones: crunching the numbers, they conclude that human productivity peaks when the temperature is between 65 and 72 degrees Fahrenheit.19
But there’s an inconvenient truth: You can make it cooler inside only by making it warmer outside. Air-conditioning units pump hot air out of buildings. A study in Phoenix, Arizona, found that this effect increased the city’s night-time temperature by two degrees.20 Of course, that only makes air-conditioning units work harder, making the outside hotter still. On underground metro systems, cooling the trains can lead to swelteringly hot platforms.
This was far more dangerous work than manufacturing or even construction. In a large port, someone would be killed every few weeks. In 1950, New York averaged half a dozen serious incidents every day—and New York’s port was one of the safer ones.
The man who navigated this maze of hazards—who can fairly be described as the inventor of the modern shipping container system—was an American, Malcom McLean.