Kindle Notes & Highlights
Evil Geniuses chronicles the quite deliberate reengineering of our economy and society since the 1960s by a highly rational confederacy of the rich, the right, and big business.
I wrote about how, since the 1980s, “the piece of the income pie taken each year by the rich has become as hugely disproportionate as it was in the 1920s,” how “an average CEO now gets paid several hundred times the salary of his average worker, a gap that’s an order of magnitude larger than it was in the 1970s,” and how most Americans’ wages had barely budged.
The economy is weather, the political economy is climate.
in forty years, the share of wealth owned by our richest 1 percent has doubled, the collective net worth of the bottom half has dropped almost to zero, the median weekly pay for a full-time worker has increased by just 0.1 percent a year, only the incomes of the top 10 percent have grown in sync with the economy, and so on. Americans’ boats stopped rising together; most of the boats stopped rising at all. But along with economic inequality reverting to the levels of a century ago and earlier, so has economic insecurity, as well as the corrupting political power of big business and the rich,
As it turned out, the 1980s were the ’30s but in reverse: instead of a fast-acting New Deal, a time-release Raw Deal. But the reengineering was helped along because the masterminds of the economic right brilliantly used the madly proliferating nostalgia. By dressing up their mean new rich-get-richer system in old-time patriotic drag. By portraying low taxes on the rich and unregulated business and weak unions and a weak federal government as the only ways back to some kind of rugged, frontiersy, stronger, better America. And by choosing as their front man a winsome 1950s actor in a cowboy hat,
Thus the angriest organized resistance to the new, the nostalgias driving the upsurge of racism and sexism and nativism—which gave us a president who seemed excitingly new because he asserted an impossible dream of restoring the nastily, brutishly old.
Of course, that process of perpetual reinvention and refreshment always involved tension between people pushing for the new and people resisting it, sometimes with existential ferocity: irreconcilable differences over status quos resulted first in the American Revolution and then in the Civil War and then the politics of the Depression.
America’s tragic flaw is our systemic racism, and it’s a residue of a terrible decision our founders made to resist the new and perpetuate the old: the enslavement of black people. Slavery had ended in most of Europe by the 1500s, but not in its colonies in the New World and elsewhere. France and Spain and Britain outlawed their slave trades and slavery itself decades before the United States did, and they found it unnecessary to fight civil wars over the issue. Tsarist Russia emancipated its serfs before democratic America emancipated its slaves. On abolition we were not early adopters.
governor of Tennessee wrote after the Civil War, for “the cotton fields, alive with the toiling slaves, who, without a single care to burden their hearts, sang as they toiled from early morn till close of day.”
Mark Twain blamed secession and the Civil War on such Southern “love [of] sham chivalries of a brainless and worthless long-vanished society.”
In 1910 President Theodore Roosevelt, a rich Republican, said that “corporate funds” used “for political purposes” were “one of the principal sources of corruption” and had “tended to create a small class of enormously wealthy and economically powerful men whose chief object is to hold and increase their power.”
Even as Democratic president Harry Truman was launching the Cold War in 1950 to protect capitalist America from foreign Communists, he emphasized in his State of the Union speech that the country needed an even tougher antitrust law to protect our system from falling “under the control of a few dominant economic groups whose powers will be so great that they will be a challenge to democratic institutions.”
the USSR acquired intercontinental nuclear missiles only in the late 1950s, and the new possibility of instant global apocalypse surely helped predispose a critical mass of American kids to become hedonists, renegades, utopians, and/or nihilists, at least temporarily. But that was only a hypothetical war. When the actual war came in 1964 and 1965, just as the oldest boomers turned eighteen and nineteen, it triggered the full-scale countercultural reaction—then kept fueling it as the U.S. deployment in Vietnam increased from a few thousand advisers to 445,000 troops in less than two years, and
At the end of 1963, the Times ran a long front-page article with the headline GROWTH OF OVERT HOMOSEXUALITY IN CITY PROVOKES WIDE CONCERN. But after the 1969 riot during a police bust of the LGBTQ patrons of the Stonewall Inn in Greenwich Village, a long front-page Times article was headlined HOMOSEXUALS IN REVOLT.
We can be humble and live a good life with the aid of the machines, or we can be arrogant and die.”
In his speech, Powell said that because “the media and intellectual communities of our society [have] built up these extremists into national figures of prominence, power and even adulation,” made them “lionized on the campus, in the theater and arts, in the national magazines and on television,” more and more young Americans, “often from our finest homes, [are] vulnerable to radical ‘mind-blowing’ ” and now believed “the destructive criticism…that our free enterprise system is ‘rotten’ and that somehow we have become a wholly selfish, materialistic, racist and repressive society.” And
Modern liberals prided ourselves on not being ideologues, on entertaining all sorts of disparate policy ideas for improving the world, whereas the economic right really has one big, simple idea—do everything possible to let the rich stay rich and get richer.
The Reagan administration did away with the federal Fairness Doctrine, which had been in place since the early broadcast era to prevent radio and TV news programs from having distinct ideological or partisan tilts.
a handful of law students at Yale, Harvard, and the University of Chicago, sharing an enthusiasm for what one called “free-market concepts,” founded the Federalist Society. It turned out to be the monumentally important first step in the plan Horowitz’s memo had laid out. The Chicago chapter enlisted a professor at their law school who was also on the payroll of the conservative Washington think tank AEI to be their faculty adviser—Antonin Scalia.
Reactionaries and other right-wingers castigated the Warren court and other federal judges for being “judicial activists” and called themselves “strict constructionists” who adhered to the plain old meanings of the Constitution. Accusing white liberals in the 1950s and ’60s of being constitutionally sloppy was more effective than calling them atheists and coddlers and race traitors.
Originalism’s most important hidden agenda was to keep courts and judges completely out of the business of business, as if what worked for the U.S. economy for its first century, before modern corporations existed, was how things should work today. It was like the Friedman Doctrine, which turned a reasonable capitalist truism (profits are essential) into a simple-minded, unhinged, socially destructive monomania (only profits matter).
As movements, originalism in the law and libertarianism in economics were fraternal twins. Both were born of extreme nostalgia, fetishizing and distorting bygone America, so both more easily achieved mass appeal in the everything-old-is-new-again 1970s and ’80s. Both purported to be based on objective principles that transcended mere politics or special interests, even while both were vehicles for big business and the right to recover, fortify, and expand their economic and political power.
But the opinionated and cocksure young Bork didn’t stay in his lane. Economic libertarianism was also the basis of his argument in 1963 and 1964 against the proposed Civil Rights Act.*3 At the very moment when Martin Luther King, Jr., was leading the watershed pro-civil-rights March on Washington in August 1963, Bork published an article in The New Republic arguing that any law requiring businesses to serve people of all races would be “subversive of free institutions” by “self-righteously impos[ing] upon a minority”—that is, upon racist white businessmen—“the morals of the majority.” The
Exactly one year after Bork’s book was published, a pivotal Supreme Court decision quoted its key “consumer welfare” sentence, and since then federal judges have quoted the line in antitrust decisions dozens of times. Just like that, economic efficiency as measured by prices became “the stated goal in antitrust” exclusively.
What’s more, in the 1970s and ’80s a whole new field of law emerged out of the theories and ideas that Bork and his fellow Chicago School libertarians had been crafting. Its name is simple and deceptively generic: Law and Economics. Getting lawyers and especially judges more fluent in economic analysis is a good thing, of course. But the animating idea behind Law and Economics was political—that a main point of the law, not only of antitrust, is to maximize economic efficiency, that the law’s bottom line is the economic bottom line.
So the successful war launched by the economic right in the 1970s and ’80s had several theaters, one of which was the law, with two main battlefronts, originalism and Law and Economics. Both were long-haul strategic campaigns effectively camouflaged as impartial philosophical movements. Originalists’ deep anger didn’t really derive from legal analytics about what Bork called “the Supreme Court’s unconstitutional rulings” of the last century. Rather, it was about the progressive particulars of what he called “the sustained radicalism of the Warren Court,” such as the rulings that guaranteed
As soon as a rule or regulation solves some problem, we tend to forget about both the problem and the solution. If you were to read the previous paragraph aloud to people before asking them the standard Gallup poll question—“Do you think there’s too much, too little or about the right amount of government regulation of business and industry?”—I’m certain that many, many fewer would answer “too much.”
After all, for as long as anyone could remember, Americans shared proportionately in the national prosperity, the fractions going to the people at the bottom and the middle and the top all growing at the same rate. In the 1980s it wasn’t yet clear to most people that the political economy was being changed from a more or less win-win game to one that was practically zero-sum, that over the next few decades, at least three-quarters of them would be the economic casino’s suckers, that their losses and forgone winnings would all go to the luckiest 20 percent, and that thenceforth in America only
Social contracts are unwritten but real, taken seriously but not literally, which is their beauty and their problem. They consist of all the principles and norms governing how members of society are expected to treat one another, the balance between economic rights and responsibilities, between how much freedom is permitted and how much fairness is required.
In the evolving American social contract, the balance among the competing demands of liberty and equality and solidarity (or fraternité) worked pretty well for most of the twentieth century, the arc bending toward justice. But then came the ultra-individualistic frenzy of the 1960s, and during the 1970s and ’80s, liberty assumed its powerfully politicized form and eclipsed equality and solidarity among our aspirational values. Greed is good meant that selfishness lost its stigma. And that was when we were in trouble.
Libertarians fantasize that they’re action heroes and entirely self-made. They tend to exempt themselves from the truism that there but for the grace of God goes each one of them, because an implicit premise of their ultra-individualism is that anybody in America can make it on their own and that unfair disadvantages either don’t exist or can’t be helped. I have a hunch that the demographic profile of self-identified libertarians—94 percent white, 68 percent male, 62 percent in their forties or younger—has something to do with those beliefs and fantasies.
For forty years, from the 1940s through the ’70s, the compensation of the top three executives of the largest companies had increased modestly, less than 1 percent a year, from the equivalent of $1.4 million on average to $1.8 million. Then it suddenly went crazy, particularly during the 1990s, so that by the early 2000s, those executives were receiving an average of $13 million a year. In the 2010s the average compensation of the five hundred highest-paid executives of public companies was $30 million.
From the 1930s through the ’80s, the top three executives at the fifty largest U.S. corporations were paid between thirty and sixty times as much as their average employee. But then in just a dozen years, that ratio quadrupled, so that by 2003 those three top bosses on average were earning 219 times as much as their average employee. The ratio between the pay of the average worker and that of the CEO climbed even higher and remains close to three hundred. Some U.S. CEOs—at Starbucks and Disney, for instance—are paid one thousand times more than their median employee.
For a decade, since before she was in the Senate, Elizabeth Warren has been saying that stock buybacks provide only a “sugar high for companies in the short term.” That metaphor seems too benign. It’s more like the high—and addiction—produced by cocaine, another craze that swept America in the 1970s and ’80s, and that makes addicts neglect their important but mundane duties and long-term health in favor of the next jolt of artificial self-confidence.*9 Consider this remarkable fact: from 2010 through 2019, most of the money invested in U.S. stocks came not from true investors but from
Thus began America’s radical increase in economic inequality—fully a quarter of which, research shows, might be attributable to just the increased pay and wealth that has gone since the 1980s to the people working in finance.
all those smart financial professionals’ advice and judgments about what stocks and bonds to buy and sell, so-called “active management,” apparently don’t even make clients more money. In other words, much of this financial priesthood is superfluous, unnecessary. The evidence in study after study is that active management of financial portfolios by professionals “is not directly beneficial to investors on average…especially after taking into account fees.” In 2016, for instance, two-thirds of the professional portfolio managers buying and selling shares in the four hundred biggest companies
It’s ironic that just as we entered an economic era all about eliminating inessential middlemen and corporate bloat, one bloated sector filled with inessential middlemen, finance, has flourished as never before. It’s ironic that one of the rationales for America’s 1980s makeover was to revive the heroic American tradition of risk-taking—given that so much of the story has turned out to be about reckless financiers insulating themselves from risk by shifting it to customers and, through the government, to taxpayers. It’s ironic that finance, a service industry created to help business and the
In all this, financialization has done what people back in the 1950s and ’60s and ’70s worried and warned that the Communists would do if they took over: centralize control of the economy, turn Americans into interchangeable cogs serving an inhumane system, and allow only a well-connected elite to live well. Extreme capitalism resembles Communism: yet another whopping irony.
after surviving the Depression and winning the war, Americans cruised along together for almost four decades in glorious sunny weather that seemed like it would go on forever—then we hit rough seas, and suddenly the first-class passengers, saying they hoped everyone else could join them later, grabbed all the lifeboats for themselves and sped off to their own private luxury ship anchored in a safe harbor.
Economists’ term for markets where there’s just one overwhelmingly dominant buyer of labor (or anything else) is a monopsony.
The Friedman Doctrine in 1970 begat the shareholder supremacy movement in the 1980s, which begat an unraveling of all the old norms concerning loyalty and decency of businesses toward employees. Loyalty implies treating employees better than the law requires, which was at odds with the new mandates of shareholder supremacy. Replacing strikers was a shock-and-awe swerve, outsourcing work to low-wage contractors a less dramatic form of cold-bloodedness. Both were highly effective means of scaring workers in order to reduce their power and keep their pay lower.
But then came the 1980s. I mentioned earlier how the tax code tweak 401(k), which went into effect in 1980, handed a captive audience of millions of new customers and a revenue bonanza to the financial industry. But this innovation also provided a cost-cutting financial bonanza to employers. They now had another clever way to execute on the new Scrooge spirit: replacing the pensions they’d funded for decades with individual-worker-funded investment plans—self-reliance! freedom!—cost them less right away and cost them nothing once employee number 49732 left the building for good.
Today only one in eight private sector employees are in line to get such a pension, and most American workers don’t even have a 401(k) or an IRA or any other retirement account. It’s yet another route by which the U.S. political economy made a round trip from 1940 to 1980 and then back again.
“I am frequently concerned about being laid off.” From 1979 through the 2000s, that statement was posed in a regular survey of employees of four hundred big U.S. corporations, each person asked if they agreed or disagreed. In 1982, early in our new national musical chairs game, during a bad recession with high unemployment, only 14 percent of this large sample of workers said they felt anxious about losing their jobs. The number crept upward during the 1980s, and then in the ’90s people finally registered that, uh-oh, our social contract had been completely revamped. By 1995, even though the
employees of the biggest corporations, whose jobs everyone had considered the most secure, were now too frightened of being jettisoned from those jobs to push hard for more pay or better working conditions. Those data and their implications must’ve slipped Greenspan’s mind later when he found it “difficult to judge” the effects of insecurity on workers’ leverage and pay. And he never mentioned, of course, that it was he and his confederates on the right who’d spent the last decades restructuring our political economy to reduce the power of workers and increase their job insecurity.
At the same time that economic insecurity grew, new sources of economic inequality were built into our system that made insecurity more chronic and extreme. Scores of public and private choices and changes increased inequality, all shaped by the new governing economic gospel: everybody for themselves, everything’s for sale, greed is good, the rich get richer, buyer beware, unfairness can’t be helped, nothing but thoughts and prayers for the losers.
Only a quarter of people graduating from four-year public colleges and universities in the early 1990s had student loan debt; by 2010, two-thirds did. Credit had been deregulated in the 1980s just in time for the business of student loans to explode in the ’90s. When I graduated college in 1976, the total amount of money lent to students to pay for higher education each year was the equivalent of $8 billion—but by the first school year of the 1980s, it had jumped to $22 billion, and in 2005 it reached $100 billion. In other words, over those three decades, while the number of students grew by
That change is particularly clear in a recent study conducted by Stanford and Harvard economists. In 1970, they found, almost all thirty-year-old Americans, 92 percent, were earning more than their parents had at that age and older. Among Americans in their early thirties in 2012, however, only half were earning more than their parents had—and for sons compared to fathers, even fewer. That enormous difference over two generations was mainly caused not by slower economic growth, the economists found, but by how American economic growth was shared after 1980. If we’d continued slicing the pie as
Unlike most of the other economic changes I’ve discussed, like shareholder supremacy and ending antitrust and defeating organized labor, this wasn’t part of the original strategy of big business and the right. But for them it isn’t exactly collateral damage either, because it has been a political boon. So far they’ve brilliantly managed to redirect the anger of most of the (white) left-behinds to keep them voting Republican, by reminding them that they should resent the spoiled college-educated liberal children and grandchildren of the acid amnesty abortion liberal elite who turned on them and
Which made the president seem moderate when his 1935 “Soak the Rich Act” raised the tax on income over $2 million to 55 percent. “Political equality,” FDR said in 1936, is “meaningless in the face of economic inequality.” In what he pitched as a Second Bill of Rights, he proposed guaranteeing all Americans “the right to earn enough” for “a decent living” and “a decent home” and to have “adequate medical care” paid for with “a tax [on] all unreasonable profits, both individual and corporate.” That was 1944, Peak Leftism for Democrats on economics.

