Kindle Notes & Highlights
Read between August 22 - August 28, 2020
Immigration, like outsourcing and tighter regulation of unions, allowed employers to pay less for many kinds of labor. But immigrants came with other huge costs: new schools, new roads, translation (formal and informal), and health care for those who could not afford it. Those externalities were absorbed by the public, not the businessmen who benefited from immigration. Naturally businessmen preferred this arrangement to the old one, which had involved paying expensive benefits to an entitled, querulous, native-born, and sometimes unionized workforce.
Sectors in which low-wage newcomers dominated (restaurants, landscaping, construction) began to crowd out sectors in which they did not (mostly manufacturing and local retail). Now immigration was the economy.
If we were judging open immigration and outsourcing not as economic policies but as U.S. aid programs for the world’s poor, we might consider them successes. But we are not. The cultural change, the race-based constitutional demotion of natives relative to newcomers, the weakening democratic grip of the public on its government as power disappeared into back rooms and courtrooms, the staggeringly large redistributions of wealth—all these things ensured that immigration would poison American politics right down until the presidential election of 2016.
People could sense the deteriorating relative position of the working class even before it showed up in the statistics. Wealth was being more openly signaled,
the targets of elite condescension could be roughly identified as those Americans who made up the Reagan electorate, minus the richest people in it. A new social class was coming into being that had at its disposal both capitalism’s means and progressivism’s sense of righteousness. It would breathe life back into the 1960s projects around race, sex, and global order that had been interrupted by the conservative uprisings of the 1970s.
Starting in the 1980s, American businessmen freed themselves from the customs that had bound them to their country’s labor force. They established a new and more profitable symbiosis with immigrants and less empowered, less well-compensated workers overseas.
The security of the working class had provided a margin for error for all the social experiments of the last half-century. By 2016, that margin was gone.
economist Robert Gordon found no special productivity boost from the computer age.
American innovation “would narrow to a trickle” after the 1960s.
As Americans’ new machines were recording reality with ever more precision, irrationality and even superstition were on the rise. This is not as paradoxical as it sounds. Acquired knowledge obeys a Malthusian logic: Each new fact brings a handful of new questions, which, when answered, bring a handful more. Facts grow arithmetically, but questions grow geometrically. The result is a deterioration of certitude. The closer we get to the truth, the less confident we are in our possession of it.
Postmodernism was a kind of insurgency.
Postmodernism described narratives, from communism to mainline Christianity, that—interesting though they might be as myths—were losing their power to bind people into communities and spur them to action. Postmodernism also vied with those narratives: It was a project to delegitimize them. Every institution it penetrated, it politicized.
Postmodern writers and critics naturally took up one of the great obsessions of the 1960s and ’70s: authenticity.
the question of why anybody should remain anything he happens to be to begin with.” Understanding this is necessary to understanding the political anxiety that overtook the United States in the second decade of the twenty-first century.
Fascinating though the paradoxes of postmodernism are to intellectuals, they are not principles for living. Most people hate them. If all arguments and narratives were “unstable” or “contingent,” who would take the trouble to learn the facts on which they are based? On what basis would respect for truth rest? On what basis respect for other human beings? Such questions were not low-stakes at all. The collapse of old absolutisms was supposed to open up space for diversity. It opened up space for something of that name, which soon showed signs of becoming an absolutism itself.
Powell’s opinion, in short, did not eliminate quotas. It just dressed them up as something else. It required all schools that used racial preferences to recast them as programs to promote their interest in the diversity of their student bodies. That was an interest that many universities had not realized they had.
The OCR was now writing detailed standards for racial balance that courts accepted as grounds for ordering injunctive relief. Those standards were called quotas in 1970 and diversity after 1990.
Since repairing race relations was taken as an emergency, the safeguards that had been in place to prevent abuses in regulation writing—from traditions of “notice and comment” to newer applications of the Administrative Procedure Act—were never applied.
That innovation caused civil rights law to work in a very different way ...
So without the participation—or even the knowledge—of the broad, non-lawyerly public, progressive legal projects ricocheted from bureaucrats to judges and back, growing more ambitious and onerous with each bounce. A net of regulatory power soon constrained “the conduct of nearly every employer, school and unit of state and local government in the country,”
Racial preference was meant to remedy not past but present discrimination. If there was no evidence of such discrimination, it could only be because the whites who held power were hiding it.
This was not the glorious role white Americans had envisioned for themselves when they came bearing what they saw as the gift of civil rights in 1964.
Civil rights, as it developed after the Bakke case, required censorship.
If people were permitted to take positions like Glazer’s, to argue that any part of the difference in outcomes between the races was attributable to anything other than racism, the entire logic of civil rights law would break down. So now the government got into the business of promulgating attitudes about race.
Certain whites, however, far from feeling the shame of racism, stood in a newfound moral effulgence as fighters against it,
this was a “postmodern” era, when all narratives of religion, patriotism, material progress, scientific objectivity, and gentlemanly virtue were under suspicion. But the narrative of racial justice that had motivated the activists of the 1960s was an exception. Alone among historical accounts, it was above suspicion. As the unique surviving narrative, it became a moral beacon.
Al Campanis
Discouraging or disciplining racist attitudes was no longer enough—it had become necessary to destroy the life and livelihood of anyone even suspected of harboring them.
The price of that legacy was a system of censorship.
Litigation could make it embarrassing, expensive, and potentially fatal to an organization like the Los Angeles Dodgers or CBS to have anyone in their employ speculating, woolgathering, or talking off the cuff. It was an institutional innovation. It grew directly out of civil rights law.
Americans in all walks of life began to talk about the smallest things as if they would have their lives destroyed for holding the wrong opinion. And this was a reasonable assumption.
Because there was no statutory “smoking gun” behind it, this new system of censorship was easily mistaken for a change in the public mood, although it remained a mystery how a mood so minoritarian could be so authoritative. The system itself came to be called political correctness.
ethnic studies departments, which had spread to virtually all universities by the end of the 1970s, aimed not so much at understanding power relations among ethnic groups as at transforming them. Another purpose was to provide a welcoming landing spot for students admitted under affirmative action programs.
The heart of it was a set of sober procedures promulgated by cautious academic administrators and government regulators frightened of civil rights law.
What made political correctness different from other persecutory interludes in twentieth-century America, including McCarthyism, was its direct access to the courtroom.
The Reagan era had in retrospect marked a consolidation, not a reversal, of the movements that began in the 1960s. In the quarter-century after Reagan, conservatives lost every battle against the substance of political correctness.
Political correctness was not a joke after all. It was the most comprehensive ideological capture of institutional power in the history of the United States.
Those who pooh-poohed P.C. assumed that the partisan arrangements that had governed Western thinking in the Cold War would last forever.
Now, in fact, it was possible for people who had wanted a different racial or sexual order to demand it of the American system without incurring the suspicion that they were working against the country’s national security.
Universities may have been radicalizing not because radical Baby Boomers were entering the faculty but because conservative Baby Boomers were exiting the student body.
Political correctness was a top-down reform. It was enabled not by new public attitudes toward reactionary opinions but by new punishments that could be meted out against those who expressed them.
The power of political correctness generally derived, either directly or at one remove, from the civil rights laws of the 1960s.
No longer was the irreconcilability of individuals’ and society’s sexual priorities a tragedy or a disagreement. Recast in the categories of civil rights law, it was a crime, a crime that was being committed against a whole class of people.
As a matter of common sense, both tradition and diversity had claims that needed to be respected. As a matter of law, only diversity did.
Republicans, again, were blind to this. They saw political correctness as little more than a series of jokes.
Once social issues could be cast as battles over civil rights, Republicans would lose 100 percent of the time. The agenda of “diversity” advanced when its proponents won elections and when they lost them. Voters had not yet figured that out. As soon as they did, the old style of democratic politics would be dead.
“Political correctness” was a name for the cultural effect of the basic enforcement powers of civil rights law.
Reagan had won conservatives over to the idea that “business” was the innocent opposite of overweening “government.” So what were conservatives supposed to do now that businesses were the hammer of civil rights enforcement, in the forefront of advancing both affirmative action and political correctness?
Only with the entrenchment of political correctness did it become clear what Americans had done in 1964: They had inadvertently voted themselves a second constitution without explicitly repealing the one they had. Each constitution contained guarantees of rights that could be invoked against the other—but in any conflict it was the new, unofficial constitution, nurtured by elites in all walks of life, that tended to prevail. This was a recipe for strife.
Affirmative action and political correctness were the twin pillars of the second constitution. They were what civil rights was. They were not temporary. Affirmative action was deduced judicially from the curtailments on freedom of association that the Civil Rights Act itself had put in place.

