September 1, 2017
Guterres Hits the Reset Button in Israel
While Antonio Guterres was in Israel for his maiden visit as UN Secretary-General, Israeli Prime Minister Benjamin Netanyahu delivered a defiant speech that should have caused the entire visit to explode. “There will be no more uprooting of settlements in the land of Israel,” announced Netanyahu at an event celebrating fifty years of settlement in the West Bank. “We are here to stay forever.” In a veiled reference to Guterres, just hours after their meeting, Netanyahu said he warns all visiting foreign leaders that if the forces of radical Islam were to take over the West Bank after an Israeli withdrawal, this would endanger the entire Middle East.
But this extraordinarily mistimed intervention caused no visible diplomatic damage. The following day, Guterres met Palestinian Authority Prime Minister Rami Hamdallah in Ramallah, and reaffirmed his and the UN’s “total commitment” to the two-state solution. He said that settlement activity—which he called illegal under international law—is an “obstacle that needs to be removed” for that solution to be realized. Guterres repeated longstanding UN policy, with nothing to suggest the speech had been amended at the last minute to serve as an explicit rebuttal of Netanyahu. Guterres had clearly chosen to avoid a frontal collision with the Israeli Prime Minister—and what could have been an embarrassing diplomatic incident was deftly swept under the rug.
In the course of his three-day visit, Guterres ably positioned himself as the friendliest Secretary-General Israel could realistically hope for. While he faithfully reaffirmed the UN’s support for Palestinian statehood and opposition to closures on the Gaza Strip—positions at odds with the Israeli government—he almost went out of his way to avoid clashing with Israel, in contrast to his predecessor, who did so all too often.
Consider the crisis in the Gaza Strip. Guterres called the situation there “one of the most dramatic humanitarian crises” that he had seen in many years—but notably avoided pointing the finger at Israel (or at the Palestinian government, which has been slashing electricity payments and medical budgets for Gaza in a bid to squeeze Hamas to relinquish power). Guterres did say that “it is important to open the closures,” restrictions the UN opposes despite Israeli arguments that they are necessary to prevent Hamas from acquiring war materiel. But there was a notable shift in tone from his predecessor, Ban Ki-moon, who on his final trip there said the Israeli closure of Gaza “suffocates its people,” calling it “collective punishment for which there must be accountability.”
In his remarks, Guterres explicitly promised Israel impartiality—a key demand from Jerusalem, which complains that the UN harbors a systemic bias against it and obsession with it. Speaking alongside both President Rivlin and Prime Minister Netanyahu, Guterres highlighted his personal commitment to impartiality and to the equality of states. In doing so, he picked up the baton from Ban Ki-moon, who confessed in his farewell remarks that the UN’s “disproportionate” focus on Israel has foiled its ability to fulfill its role properly. But in a major departure from Ban, Guterres also affirmed that calls for Israel’s destruction amount to a modern form of anti-Semitism—adding his voice to a growing chorus of world leaders who say so.
Indeed, Guterres has already signaled in his short time in office that he is committed to steering the United Nations in a more impartial direction. He has on several occasions drawn fire for sticking his neck out in support of Israel. In his first month, he provoked demands for an apology from Palestinian officials for stating the historical fact that there was a Jewish temple in Jerusalem two millennia ago. Soon after, the head of the UN’s West Asia commission resigned in protest after Guterres demanded she withdraw a report accusing Israel of apartheid. He also publicly distanced himself from the UN Committee on the Exercise of the Inalienable Rights of the Palestinian People’s commemoration of “fifty years of the Israeli occupation” due to the participation of listed terror groups Hamas and the Popular Front for the Liberation of Palestine.
Guterres now returns to Turtle Bay having hit the reset button between the UN Secretariat and Israel. As Secretary-General, Guterres cannot greatly influence the voting behavior of countries in the General Assembly, Human Rights Council, UNESCO, or other bodies—the moves that so often provoke outrage in Israel. But he has reopened the doors of his good offices for engagement with Israel, cultivating the opportunity for cooperation. Whether that goodwill can be sustained under pressure during the next crisis, diplomats are sure to discover sooner rather than later.
August 31, 2017
Harvey Is Having Global Effects
Harvey has saturated American media for nearly a week now, and its devastating effects along its meandering route across the Gulf Coast are going to be front-page headlines for weeks if not months, as communities in its path struggle with one of the country’s worst natural disasters on record.
But this isn’t just a story of American tragedy—Harvey tore through a part of the U.S. that’s exceptionally dense with vital energy infrastructure. One-third of U.S. refineries lie along the Gulf Coast, and nearly all were affected by this storm. A major pipeline artery that transports those refined petroleum products from the Gulf to the East Coast was shut down last night. The average national gas price is up more than 13 cents per gallon from a week ago, and wholesale prices have spiked 15 percent as a result of the storm. American drivers can expect prices to continue to rise as supply chains are disrupted by Harvey’s after-effects.
This isn’t only an American story, either. Thanks to the shale boom, the United States is now a major oil and gas supplier on the global stage, so it stands to reason that disruptions in our supply chains are going to affect markets abroad. The FT nicely sums up this bizarre dichotomy:
The US’s new status as an energy powerhouse has created a more flexible, diverse, and arguably resilient world fuel market…But Harvey is exposing an Achilles heel: the concentration of US energy assets in a low-lying, hurricane-prone coastal corridor makes the world more exposed to local weather.
Industry consensus at this point seems to be that the U.S. energy industry will recover relatively quickly, and thanks to the shale boom there are plenty of barrels of crude and gallons of refined petroleum products held in reserve that will help suppliers meet demand in the coming days and weeks. But keep in mind that before the shale boom, this would have been a regional energy story. Now it’s a global one.
Ukraine Is Still Its Own Worst Enemy
In the early days of the Trump Administration, the outlook for Ukraine could hardly have been bleaker. The United States had just elected a man who pledged to overturn decades of foreign policy precedent, casting doubt on the alliance system that has underpinned the postwar order in Europe, and who promised to make nice with Russia. In that grim moment, prominent Ukrainians began to think the unthinkable: that Ukraine may have to compromise on Crimea or abandon its EU and NATO aspirations.
Eight months later, circumstances have changed, and largely to Ukraine’s benefit. President Trump, acting on the advice of his more experienced aides, has shifted to a more conventional foreign policy, while a skeptical Congress has tied his hands on resetting relations with Russia. Trump and his subordinates have repeatedly affirmed U.S. commitments to Ukraine and promised to restart the peace process. In his recent visit to Kyiv, Defense Secretary Mattis strongly hinted that the Trump Administration would provide lethal defensive weapons to Ukraine, a move that the risk-averse Obama Administration had refused to make. And on the economic front, Ukraine appears to be on the mend: output is increasing, growth forecasts are trending upwards, and inflation is falling.
But these sunny data points obscure a darker truth: Ukraine remains a deeply dysfunctional country, and its leadership is getting to be as corrupt, self-dealing, and nakedly authoritarian as the one uprooted by the Maidan protests. In recent months, activists and journalists have been increasingly harassed, while President Petro Poroshenko has consolidated his patronage network by sidelining reformers, kicking rivals out of the country, and surrounding himself with loyalists. Meanwhile, familiar characters like Yulia Tymoshenko plot in the shadows.
The greatest danger for Ukraine today, therefore, may not be being overrun by Russia or abandoned by the United States. Instead, the danger lies in Ukraine’s gradual backsliding into bitter patronage politics and soft authoritarianism—twin scourges that could put out the promise of the Maidan just as surely as they extinguished the Orange Revolution.
Ukraine’s predicament is, in some ways, not unique. Like many post-Soviet states, it is burdened by an elite oligarchic structure that has proven difficult to shake off, despite the hopes of the activists and reformers who led the Maidan movement. As Andreas Umland argues in a piece for Open Democracy (with a nod to the political scientist Henry Hale), Ukraine still represents a classic case of “patronal politics,” where power is distributed through clientelistic networks rather than official political processes:
In patronal political regimes, power is accumulated and exercised through more or less successful building, maintenance and interaction of distinctly informal and frequently interlocking pyramid structures headed by men (and sometimes, women) at the helm of large economic conglomerates, regional political machines, or central state administrations. […]
Typically, the most powerful of these networks reach into a broad variety of social institutions ranging from ministries, agencies and parties to companies, media outlets and NGOs. The glue that holds these complicated coteries together are less institutional hierarchies than familial ties, personal friendships, long-term acquaintances, informal transactions, mafia-like behaviour codes, accumulated obligations, and withheld compromising materials (kompromat).
Patronal systems like Ukraine’s are difficult to uproot, Umland suggests, because they are much more nimble than most people understand. A government’s stated values may change, but its underlying clan structure does not. “Within a patronal political regime, officially pro-Western foreign policies may easily co-exist with hyper-corrupt policies at home,” Umland argues, “as long as this tension does not touch upon the ruling clan’s financial and other interests.” Moreover, such regimes can implement Western-style reforms in order to secure foreign support, even while manipulating those reforms at home to undermine their intent or selectively target opponents of the regime.
That path is precisely the one that Kyiv has chosen to pursue. President Poroshenko has a track record of unveiling laudable reforms for his Western audience, and then subverting or twisting them upon implementation. In March, for instance, after Ukraine adopted an asset disclosure law to reveal the wealth of public officials, the law was amended to make the same demands of anti-corruption NGOs. Ukraine’s Western allies saw the move as a transparent attempt to selectively target activists, journalists, and NGOs whose investigations came too close for the authorities’ comfort.
Poroshenko has since worked to re-draft the measure, but activists like Vitaly Shabunin—a frequent critic of the President and head of the Anti-Corruption Action Center—allege that the new amendment is a sham. Shabunin himself has become a prominent target, one of several activists who claims to have been illegally wiretapped and otherwise harassed by the state. In May, his organization was accused in Parliament of embezzling American aid money—charges leveled by a mysterious rival NGO that appears to have been created overnight to lead a smear campaign against him. Soon afterwards, the tax police launched an investigation of Shabunin’s organization. There is also evidence that Ukraine’s security services have orchestrated protests against him.
But as the Kennan Institute’s Kateryna Smagliy argues, some civil society groups have left themselves vulnerable to exactly these sorts of attacks by integrating themselves into existing patronage networks, both domestic and international. Some Ukrainian NGOs are more focused on writing reports to please Western donors than addressing the needs of local Ukrainians, while others are indebted to major oligarchs like Viktor Pinchuk or Ihor Kolomoisky. When these anti-corruption NGOs target the government, then, it is all too easy to discredit them as just another tool that rival oligarchs are wielding in order to advance their own interests.
Ukraine’s struggle to deliver on the democratic promise of the Maidan has not diminished the public’s appetite for reform. A recent poll showed that 51 percent of Ukrainians consider state corruption to be a top priority for Ukraine—more, even, than cited the war in the east with Russia. But few believe that either the state or civil society is successfully rooting out corruption. Approval ratings for all of Ukraine’s leading politicians are dismal: the most popular one, Yulia Tymoshenko, enjoys a mere 22 percent approval. Another reshuffling of the country’s loathed political elites will hardly fix the underlying pathologies of the Ukrainian state. A deeper reckoning is needed, one that takes into account the West’s own culpability in propping up Ukraine’s corrupt oligarchy.
Western countries have all too often been the enablers of Ukraine’s kleptocratic ruling class. Tax havens like Luxembourg, Switzerland, and Cyprus are welcoming destinations for oligarchs’ ill-gotten gains. The laxly regulated London real estate market provides a convenient mechanism for laundering such dark money through shadowy shell companies. And as the Panama Papers revealed last year, the rot extends to the top of Ukraine’s current leadership: President Poroshenko has a tangled array of undisclosed offshore companies that hold his immense wealth far from the jurisdiction of the country he rules. Despite the West’s reluctant embrace of Poroshenko, he remains a product of the old system, a longstanding member of the oligarch class for whom such behavior is standard.
Apart from money laundering, the West is also complicit in laundering reputations. Every Ukrainian oligarch or politician worth his or her salt has lawyers and lobbyists on hire in Washington, working to advance his or her interests on K Street, Capitol Hill, and among the Ukrainian diaspora. In this way, the lobbying industry in Washington profits from, and perpetuates, Ukraine’s patronal political system, with American firms acting as just another asset in their patrons’ networks. The problem is a longstanding and bipartisan one: as a recent Daily Beast story points out, Democratic firms were just as willing as Republican ones to work for scuzzy figures like Paul Manafort to whitewash the crimes of the Yanukovych regime. And in the public sector, American administrations have often downplayed the democratic shortcomings of Ukrainian governments that publicly profess Western ideals.
In short, Western willingness to launder oligarchs’ money and whitewash their reputations allows the current system to flourish. Tackling this problem in its full scope would take immense political will and coordination of the sort that the West seems incapable of summoning at the moment. But there are smaller measures that the United States and EU can take in the short term to address their own deficiencies and steer Kyiv in the right direction.
Ukraine’s allies can and should put further pressure on the Poroshenko government, tying economic assistance to substantive reforms, like the anti-corruption court that Ukraine’s embattled reformers have long demanded. Many of Ukraine’s most significant recent reforms only passed because of such conditions from the IMF and EU; further aid should not come freely. Similarly, the Trump Administration should designate a point person on Ukraine—similar to the role Joe Biden played in the Obama Administration—to hold Kyiv’s feet to the fire in the fight against corruption. And it should expand law enforcement exchanges, like the one between the FBI and the National Anti-Corruption Bureau of Ukraine, to assist Kyiv in pursuing prominent kleptocrats and recovering stolen assets.
If the West does not exert such pressure, familiar patterns will continue to reassert themselves. Anti-corruption investigations will stall, held up by the authorities. Presidential power will grow, as Poroshenko acts with impunity to sideline political rivals (his parliamentary allies are already calling for a criminal investigation of Tymoshenko). And as the ruling class prepares for elections in 2019, the country will devolve into the factional infighting and corrupt deal-making that have long been hallmarks of Ukraine’s patronal politics.
We’ve seen this movie before, and it ended with Viktor Yanukovych taking power after the Orange Revolution failed to deliver on its reformist promises. Ukraine has already blown one chance at real reform. It can ill afford to lose another.
Kirkuk to Join Kurdistan Independence Referendum
Despite warnings from the United States, the federal Iraqi government in Baghdad, Turkey, and others to postpone or cancel the vote, the Iraqi Kurdistan Regional Government (KRG) is barreling ahead with its independence referendum scheduled for September 25th. One of the key questions for the long-awaited vote is whether it will include disputed territories under Kurdish Peshmerga control that are not formally within the boundaries of the official Kurdistan region. Now, local government councils appear to be answering that question, as Reuters reports:
Iraq’s oil-producing region of Kirkuk will vote in a referendum on Kurdish independence on Sept. 25, its provincial council decided on Tuesday, a move that could increase tension with Arab and Turkmen residents. […]
Only 24 of the 41 council members attended Tuesday’s vote, with 23 voting in favor of participating in the referendum. One abstained.
The remaining council members – all Arabs and Turkmen – boycotted the vote. Instead, they issued statements denouncing the vote as “unconstitutional.”
The vote appears to have followed strict ethnic lines, with 23 of the council’s 26 Kurdish members voting yes and the remaining nine Turkmen and six Arab council members boycotting the session. Whether that voting composition is ethnically representative of Kirkuk province is probably impossible to say and is a hugely controversial issue to begin with. Under Saddam, Kirkuk was subject to an “Arabization” program of ethnic cleansing directed at Kurds, Turkmen and Assyrians alike. Iraq’s last official census was conducted in 1957, but the two censuses conducted after independence, as well as those conducted by the British, are disputed, given that they were used in part to support the legitimacy of the Sunni Arab Hashemite monarchy. Widely reproduced maps of the province’s ethno-sectarian composition, like Columbia University’s Gulf/2000 Project map “Kirkuk and Environs: Ethnic Composition, 2014,” include citations going back to the Ottoman era, which is probably the last time reliable data on the question was collected. Given the back-and-forth refugee flows since ISIS’ advances in 2014, it’s impossible to say what the composition of the province’s council “ought” to be, but Baghdad’s objections about the legitimacy of the vote have merit.
That being said, the real concern isn’t about a legitimate democratic process or whether Kirkuk’s Arabs and Turkmen are being properly represented. It’s about oil money. While the official Kurdistan region is itself oil-rich, Kirkuk’s oil fields and refineries would be an enormous boon for an independent Kurdistan and a major loss for the cash-strapped government in Baghdad. While oil production in Kirkuk virtually stopped after the rise of ISIS, and oil revenue has collapsed from low prices, large-scale production has resumed in the past year or so. Last year, a deal between the KRG and Baghdad resulted in a 50/50 revenue split for all of Kirkuk’s oil. That deal is likely to end, one way or the other, after Kurdish independence.
While Turkey opposes the KRG independence vote, the news that Kirkuk will be included in the vote will be of some solace. Although revenue from Kirkuk’s oil is currently split between the KRG and Baghdad, the hundreds of thousands of barrels produced from the province each day are shipped north through the Kirkuk-Ceyhan pipeline to Turkey. While the federal Iraqi government does the same for now, they have long threatened and pursued efforts to instead ship Kirkuk’s oil via Iran. An independent Iraqi Kurdistan that excluded Kirkuk would almost certainly see the end of the revenue sharing agreement and would make exporting Kirkuk’s oil via another route a critical issue for Baghdad, and a significant loss for Turkey.
The KRG is playing a high-stakes game. While continued oil flows might be a salve for Turkey, the leader of Turkey’s far-right nationalist party recently described the independence vote as a potential act of war, a claim that Turkey’s foreign minister had to walk back. With Kirkuk included in the referendum, Baghdad may decide that it has no choice but to seize the oil fields by force. And while the U.S. strongly backs the KRG, the Administration is urging a delay in the independence vote, a warning the KRG ignores at its peril. After all, the KRG remains reliant on U.S. military aid, which may now be in jeopardy. Just as in eastern Syria, the fall of ISIS in northern Iraq may simply remove a common enemy and open the way to a wider conflict between stronger regional powers.
August 30, 2017
It’s Time to Retire the Silicon Valley Mythos
Google, like any corporation, has wide latitude to fire its employees for any number of reasons. And Google, like any corporation, is entitled to attach any conditions it wishes to its philanthropic donations.
So why did the company generate two high-profile scandals in the last month for doing those very things—first by firing an engineer for criticizing the company’s diversity practices, and then today by muscling out critics of its business practices from the New America Foundation, a think tank the company has supported heavily?
Both of these incidents raised hot-button political issues—free speech, gender differences, and political correctness in the case of the heretical engineer, and monopoly power and corporate influence in the case of the populist policy wonks at New America. But the salience of both firings has been magnified, I would argue, because they fly in the face of the way that our society, and especially the media and chattering classes, has historically—and wrongly—seen the technology industry.
In a perceptive 2013 New Yorker article on Silicon Valley’s politics, George Packer (now himself a fellow at New America) wrote:
[I]t’s an article of faith in Silicon Valley that the technology industry represents something more utopian, and democratic, than mere special-interest groups. The information revolution (the phrase itself conveys a sense of business exceptionalism) emerged from the Bay Area counterculture of the sixties and seventies, influenced by the hobbyists who formed the Homebrew Computer Club and by idealistic engineers like Douglas Engelbart, who helped develop the concept of hypertext and argued that digital networks could boost our “collective I.Q.” From the days of Apple’s inception, the personal computer was seen as a tool for personal liberation; with the arrival of social media on the Internet, digital technology announced itself as a force for global betterment. The phrase “change the world” is tossed around Silicon Valley conversations and business plans as freely as talk of “early-stage investing” and “beta tests.”
For many years, the media carried water for this view—partly because of the technology industry’s genuinely idealistic roots, partly because of its role in the ill-fated Arab Spring, and partly because it has been broadly supportive of Democratic candidates and socially liberal causes. We are surprised to see Google wield its power in ways that appear authoritarian when, as John Herrman wrote in the New York Times, technology companies have so ably “put on the costumes of liberal democracies” as a marketing strategy.
The heroic narrative surrounding technology titans has been diminished in recent years by growing questions about privacy, political bias, and monopoly power. But the August axings of an employee and a think tank scholar who made arguments that interfered with Google’s PR interests ought to be enough to retire it altogether.
In limp statements issued after both incidents, Google tried to cling to its feel-good founding premises. “We strongly support the right of Googlers to express themselves,” Google’s CEO said in explaining his decision to fire James Damore for expressing incorrect opinions. “We respect each group’s independence, personnel decisions, and policy perspectives,” a Google spokesperson said of its philanthropic efforts, after compelling evidence surfaced that its executive chairman had leaned on New America to oust Barry Lynn and his team.
Fewer and fewer people are taking such fictions seriously. Compared to any other major corporate interests, the technology industry is not particularly open-minded, particularly democratic, or particularly respectful of individual rights. If we revise our assumptions accordingly, we will be less surprised about what happens in Silicon Valley—and more cynical.
Affirmative Action in a Time of White Resentment
On August 10, only a few days before the white supremacist murder in Charlottesville upended the national discussion of anything race-related, The New Yorker published an intriguing essay on affirmative action by Harvard law professor Jeannie Suk Gersen. The piece, entitled “The Uncomfortable Truth About Affirmative Action and Asian Americans,” was a response to reports from the previous week that the Department of Justice’s Civil Rights Division was looking into “investigations and possible litigation related to intentional race-based discrimination in college and university admissions.” Contrary to initial speculations, the DOJ would not be targeting affirmative action policies that allegedly harmed whites, but reviving a complaint against Harvard for discriminating against Asian applicants.
Suk’s “uncomfortable truth” is that the complaint is probably right as far as it goes. Asians and Asian Americans are almost certainly disadvantaged in elite college admissions and possibly subject to illegal quotas. They must have substantially higher standardized test scores than other groups to gain admission, and their share of the Ivy League student population has remained stagnant since the early 1990s—a period over which their population has increased at four times the rate of the U.S. population as a whole. If elite schools’ holistic admissions practices do not involve Asian quotas, then their enrollment figures are explicable, as Suk notes, only if “Asian applicants are severely less likely than white ones to have the special personal qualities that colleges seek.”
This point by itself should not be particularly controversial, but the dynamics of the affirmative action debate have led defenders of racial preferences to try to obfuscate it. A representative Vox essay on the subject studiously avoids the question of whether Asians are disadvantaged in college admissions while offering a variety of non sequiturs meant to remind them that they are on Team POC. It notes, for instance, that Asian groups like Laotians and the Hmong are underprivileged; that Asians are underpaid and under-promoted relative to their education; that conservatives have used the “model minority myth” to praise Asians as a “foil” to blacks and Latinos; and that some white students have come up with racist names for heavily Asian colleges. But Laotians and the Hmong account for less than five percent of the Asian American population, Asians may not be underpaid if one looks only at those born in the United States (and even if they were, that would be all the more reason to give them a fair shake in admissions), and the other two claims are simply irrelevant to the matter at hand.
Although substantively weak, such arguments recur time and again thanks to the partisan contours of the debate (and, perhaps, thanks to a folk belief that any program benefitting one minority must as a matter of logical necessity benefit every other). Suk rejects such evasions. Whatever elite colleges say they are doing with their admissions policies, she is clear that they are in fact engaged in racial balancing—that is, managing their demographic makeup such that they are not “swamped by members of any particular race.” And in practice they are most worried about being swamped by Asian students. Although taboo and possibly illegal, Suk argues that such balancing is realistically “unavoidable” and may in fact be desirable, as “we should not want the composition of our élite universities to be wildly out of proportion to the racial composition of our country.”
Suk is probably right that balancing of some form is warranted. It is hazardous to build an elite that looks nothing like the country it presides over. But openly admitting to penalizing Asians in the college admissions process threatens not only to doom affirmative action in the view of the courts; it also seems to subvert its moral logic, which rests on the idea that America’s white majority owes a leg up to groups it oppressed in the past. Affirmative action becomes more difficult to justify if those most subject to reverse discrimination are members of a minority group that was not responsible for those historical injustices—and indeed was itself subject to many of them.
This dilemma leads Suk to a proposal that is dangerous precisely because it is so compelling. She agrees it is unjust to discriminate against Asians, yet wishes to maintain affirmative action programs benefitting blacks and Latinos; to square this circle, she separates the two policies, suggesting we jettison the former while maintaining the latter. After all, “The problem is not race-conscious holistic review; rather, it is the added, sub-rosa deployment of racial balancing in a manner that keeps the number of Asians so artificially low relative to whites who are less strong on academic measures.” What is needed is thus “race-conscious affirmative action” to “address the historic discrimination and underrepresentation of blacks and Latinos, in combination with far less severity in the favoring of whites relative to Asians.” That is to say, keep the current preferences for blacks and Latinos while eliminating the penalty for Asian Americans.
This is a deeply elegant solution for liberals. It defangs one of the most commonsensical objections to race-conscious admissions—that they unfairly hurt a minority group—while neutralizing any conservative attempt to offer Asians a quid pro quo on affirmative action. It would also remove a source of tension between Asians and other minority groups that tend to vote Democratic. Although some polls have suggested a majority of Asian Americans support affirmative action, these rely on framing that dodges the issue of anti-Asian penalties. When affirmative action has come up in the California legislature, however, Asian-American Democrats have split with blacks and Hispanics and voted with Republicans to defend race-blind admissions.
But what would Suk’s proposal look like in practice? Harvard’s incoming freshman class is 49.1 percent white and 22.2 percent Asian; for the United States as a whole, those numbers are 63.7 percent and 4.7 percent, respectively. What does Harvard look like without white-Asian “balancing,” assuming that both groups continue to compete for the same roughly 70 percent of slots? 40 percent white and 30 percent Asian? 35 and 35? 30 and 40? At Caltech, which admits mostly on the basis of test scores, the undergraduate population is 29 percent white and 41 percent Asian; at UC Berkeley, which is prohibited by state law from using race-conscious admissions practices, the numbers are almost identical. Granted, both of these schools are in California, where Asians make up 15 percent of the population. But if one assumes an even split at 35-35, then whites’ share at the country’s top school would fall to just over half of their share in the total population; Asians would be overrepresented by around a factor of seven. Crucially, from whites’ perspective, this would be the result not of an academic horse race but of an inconsistent selection process in which balancing is applied only when doing so benefits non-whites.
One could certainly argue for such a scenario on moral or ideological grounds. Whites have dominated the American elite since its founding. A handicap in elite college admissions is by most standards a small price to pay for helping to level the playing field. Yet I believe Suk was right in her original suggestion that for reasons of social stability, colleges should be wary of allowing their demographics to diverge too far from those of the country as a whole. The problem is that keeping traditional race-based affirmative action while scrapping Asian balancing might do exactly that.
Morals aside, the case for ethnic balancing is essentially a pragmatic one: in a diverse country, it is important for different groups to buy in to the system. This is probably the best justification for the current arrangement, in which access to the upper echelons of American society—as mediated through elite higher education—is granted on the basis of an obscure calculus combining individual merit, family circumstances, and de facto recognition that the country’s racial and cultural groups need to see themselves reflected in the corridors of power. This may be unfair to individuals, but it is better than a nominally meritocratic alternative in which some groups monopolize access to the elite while others are shut out of it altogether.
Such disproportion is politically dangerous enough when a group shut out of power is a minority—in order to ensure stability, for instance, Singapore works hard to ensure that ethnic Malays, at around 13 percent of the population, feel represented in the political system. The problem is far worse when the out-group is a majority, and is especially delicate when that majority perceives it is being shut out of important corridors of power by a culturally or ethnically alien elite. As Amy Chua argued in World on Fire, such arrangements—in which a country’s elite is ethnically distinct from the majority of the population as a whole—can be highly unstable. The elite may feel little obligation toward the general population and is often forced to rule through undemocratic means. Members of the ethnic majority, in turn, become susceptible to blood-and-soil demagoguery that promises to give them back the fruits of “their” land. The turn toward majoritarian chauvinism is, as Chua argues, not a unique product of any one country’s history, but a pattern that recurs around the world—in Latin America, Malaysia, Myanmar, the Philippines, and Russia.
American politics is already beginning to see the effects of whites becoming more conscious of themselves as a declining majority; and though many whites accept this as a positive or merely neutral development, many view it with fear or anxiety. One of the great political challenges of the coming decades will be to soothe the latter group’s anxieties such that the country doesn’t wind up with a significant plurality of whites whose basic desire is to smash a system they mistrust and resent. Although elite college admissions are only a tiny part of this larger puzzle, they go a long way in determining the makeup of the American elite, and their policies are important as signals of how that elite thinks and what it values. If purely meritocratic admissions would create an ethnic makeup we are unwilling to accept, then the current policy of balancing may be the best of bad options.
The End of the Working Class
Increased income inequality; wage stagnation; skill-biased technological change; productivity growth slowdown; rising college wage premium; labor-market polarization; declining prime-age labor force participation; low intergenerational relative mobility; declining absolute mobility—all of these are concepts developed by economists to describe the dimming prospects for ordinary American workers. Taken together, they inform the consensus view that something is wrong with the American economy that isn’t going away anytime soon.
But if we follow the experts in looking at our problems solely from an economic perspective, we will fail to appreciate the true gravity of our situation. Yes, the relevant data on “real” or inflation-adjusted incomes have been disappointing and worrisome for decades. In particular, the sharp rise in income inequality, created mostly by a rollicking rise in the top 1 percent of incomes, has meant that incomes for typical American households have not kept pace with the overall growth of the economy. Nevertheless, a careful and dispassionate review of the data shows that incomes have continued to inch upwards since the 1970s. Indeed, of those who “fell” out of middle-class status over the past 25 years, depending on how one defines it, a good many fell “up” to higher income brackets. Although the Great Recession knocked incomes downward, they have now recovered almost all the ground they lost. When we also consider that comparisons of real incomes can never capture access to new products that were previously unavailable at any price, the reasonable conclusion is that overall material living standards in the United States today are at their highest levels ever. Relative stagnation may frustrate our expectations, but it isn’t the same thing as collapse.
If we pull back from a narrow focus on incomes and purchasing power, however, we see something much more troubling than economic stagnation. Outside a well-educated and comfortable elite comprising 20-25 percent of Americans, we see unmistakable signs of social collapse. We see, more precisely, social disintegration—the progressive unraveling of the human connections that give life structure and meaning: declining attachment to work; declining participation in community life; declining rates of marriage and two-parent childrearing.1
This is a genuine crisis, but its roots lie in spiritual, not material, deprivation. Among whites, whose fall has been from greater heights, the spreading anomie has boiled over into headline-grabbing acts of self-destructive desperation. First, the celebrated findings of Anne Case and Angus Deaton have alerted us to a shocking rise in mortality among middle-aged whites, fueled by suicide, substance abuse—opioids make headlines these days but they hardly exhaust the list—and other “deaths of despair.”2 And this past November, whites in Rust Belt states made the difference in putting the incompetent demagogue Donald Trump into the White House.
What we are witnessing is the human wreckage of a great historical turning point, a profound change in the social requirements of economic life. We have come to the end of the working class.
We still use “working class” to refer to a big chunk of the population—to a first approximation, people without a four-year college degree, since those are the people now most likely to be stuck with society’s lowest-paying, lowest-status jobs. But as an industrial concept in a post-industrial world, the term doesn’t really fit anymore. Historian Jefferson Cowie had it right when he gave his history Stayin’ Alive the subtitle The 1970s and the Last Days of the Working Class, implying that the coming of the post-industrial economy ushered in a transition to a post-working class. Or, to use sociologist Andrew Cherlin’s formulation, a “would-be working class—the individuals who would have taken the industrial jobs we used to have.”
The working class was a distinctive historical phenomenon with real internal coherence. Its members shared a whole set of binding institutions (most prominently, labor unions), an ethos of solidarity and resistance to corporate exploitation, and a genuine pride about their place and role in society. Their successors, by contrast, are just an aggregation of loose, unconnected individuals, defined in the mirror of everyday life by failure and exclusion. They failed to get the educational credentials needed to enter the meritocracy, from which they are therefore excluded. That failure puts them on the outside looking in, with no place of their own to give them a sense of belonging, status, and, above all, dignity.
Here then is the social reality that the narrowly economic perspective cannot apprehend. A way of life has died, and with it a vital source of identity. In the aftermath, many things are falling apart—local economies, communities, families, lives.
This slow-motion catastrophe has been triggered by a fundamental change in how the capitalist division of labor is organized. From the first stirrings of the Industrial Revolution in the 18th century until relatively recently, the miraculous technological progress and wealth creation of modern economic growth depended on large inputs of unskilled, physically demanding labor. That is no longer the case in the United States or other advanced economies. Between automation and offshoring, our country’s most technologically dynamic industries—the ones that account for the lion’s share of innovation and productivity growth—now make little use of American manual labor.
The U.S. economy still employs large numbers of less-skilled workers, of course. They exist in plentiful supply, and U.S. labor markets are functional enough to roughly match that supply with demand for it. But all of this is occurring in what are now the backwaters of economic life. The dynamic sectors that propel the whole system forward, and on which hinge hopes for continued improvement in material living conditions, don’t have much need today for callused hands and strong backs—and will have less need every year going forward.
Economists describe this situation drily as “skill-biased technological change”—in other words, innovation that increases the demand for highly skilled specialists relative to ordinary workers. They contrast the current dynamics to the skill-neutral transition from an agrarian to an industrial economy. Then, workers displaced from farm jobs by mechanization could find factory work without first having to acquire any new specialized expertise. By contrast, former steel and autoworkers in the Rust Belt did not have the skills needed to take advantage of the new job opportunities created by the information technology revolution.
Here again, exclusive reliance on the tools of economics fails to convey the full measure of what has happened. In the heyday of the American working class during the late 1940s, 1950s, and 1960s, the position of workers in society was buttressed by more than simply robust demand for their skills and effort. First, they had law and policy on their side. The Wagner Act of 1935 created a path toward mass unionization of unskilled industrial workers and a regime for collective bargaining on wages and working conditions. And during World War II, the Federal government actively promoted unionization in war production plants. As a result, some three-quarters of blue-collar workers, comprising over a third of the total American workforce, were union members by the early 1950s. The Wagner Act’s legal structure allowed workers to amass bargaining power and direct it in unison against management, suppressing wage competition among workers across whole industries. Unionized workers were thus empowered to negotiate wages roughly 10 to 15 percent above market rates, as well as a whole raft of workplace protections.
It is important to note that the strictly legal advantages enjoyed by labor at the height of its powers have diminished very little since then. There has been only one significant retrenchment of union powers since the Wagner Act, and that occurred with the passage (over President Truman’s veto) of the Taft-Hartley Act in 1947—a few years before organized labor reached its high-water mark. What really transformed labor law from words on a page into real power was the second great prop of the working class’s position in society: collective action. Congress did not unionize U.S. industry; mass action did, never more dramatically than in the great General Motors sit-down strike of 1936–37, which led to the unionization of the U.S. auto industry. And once unions were in place, labor’s negotiating strength hinged on the credibility of the threat of strikes. Coming out of World War II, when strikes had been strongly discouraged, American workers hammered home the seriousness of that threat with a wave of labor actions, as more than five million workers went on strike during the year after V-J Day—the most strike-ridden year in American history.
This militancy and group cohesion paved the way for the 1950 “Treaty of Detroit” between Charlie Wilson’s General Motors and Walter Reuther’s United Automobile Workers. The deal provided the basic template for labor’s postwar ascendancy, in which workers got automatic cost-of-living adjustments and productivity-based wage increases while production schedules, pricing, investment, and technological change were all conceded to fall within the “managerial prerogative.” “GM may have paid a billion for peace,” wrote Daniel Bell, then a young reporter for Fortune, but “it got a bargain.”
The declining fortunes of organized labor are a direct result of workers’ ebbing capacity for collective action. After the great wave of unionization beginning in the 1930s, organizing rates peaked in the early 1950s and then went into long-term decline. As employment in smokestack industries started falling in the 1970s, the number of newly organized workers lagged badly behind and the overall strength of unions progressively waned.
This flagging commitment to union solidarity cannot be explained satisfactorily without reference to the changing nature of the workplace. The unique—and uniquely awful—character of factory work was the essential ingredient that created a self-conscious working class in the first place. Dirty and dangerous work, combined with the regimentation and harsh discipline of the shop floor, led workers to see themselves as engaged in something like war—with their employer as the enemy. Class warfare, then, was no mere metaphor or abstract possibility: it was a daily, lived reality.
“It is a reproach to our civilization,” admitted President Benjamin Harrison in 1889, “that any class of American workmen should in the pursuit of a necessary and useful vocation be subjected to a peril of life and limb as great as that of a soldier in time of war.” At that time, the annual toll of workplace deaths and injuries hovered around one million. Such conditions begat efforts to organize and fight back—often literally. The “Molly Maguires” episode in the Pennsylvania coal fields, the Great Railroad Strike of 1877 that claimed more than a hundred lives, Haymarket, Homestead, Cripple Creek, the Ludlow Massacre—these are just some of the more memorable episodes among countless violent clashes as the agents of capital struggled to keep a lid on the pressures created by the demands they made of their workers.
The best part of working-class life, solidarity, was thus inextricably tied up with all the worst parts. As work softened, moving out of hot, clanging factories and into air-conditioned offices, the fellow-feeling born of shared pain and struggle inevitably dissipated.
But at the zenith of working-class fortunes, the combination of law and collective action gave labor leaders powers that extended far beyond the factory floor to matters of macroeconomic and geopolitical significance. This capacity to affect domestic politics and international relations further bolstered the position and influence of the working class. When steel or autoworkers went on strike, the resulting disruptions extended far beyond the specific companies the unions were targeting. Labor unrest in critical industries affected the health of the overall U.S. economy, and any threat to the stability of America’s industrial might was also a threat to national security and international order. Consider Harry Truman’s decision in April 1952, during the Korean War, to nationalize the U.S. steel industry just hours before workers were planning to walk out on strike. We generally remember the incident as an extreme overreach of Executive Branch power that was slapped down by the Supreme Court, but the point here is to illustrate the immense power wielded by unions and the high stakes of any breakdowns in industrial relations.
The postwar ascendancy of the working class was thus due to an interlocking and mutually reinforcing complex of factors. It was not just favorable labor laws, not just inspired collective action, but the combination of the two in conjunction with the heavy dependence on manual labor by technologically progressive industries of critical importance to national and global welfare—all of these elements, working in concert—that gave ordinary workers the rapid economic gains and social esteem that now cause us to look back on this period with such longing. And the truly essential element was the dependence of industry on manual labor. For it was that dependence, and the conflicts between companies and workers that it produced, which led to the labor movement that was responsible both for passage of the Wagner Act and the solidarity that translated law into mass unionization.
No sooner was this working-class triumph achieved than it began to unravel. The continued progress of economic development—paced by ongoing advances in automation, globalization, and the shift of output and employment away from manufacturing and into services—chipped relentlessly away at both heavy industry’s reliance on manual labor and the relative importance of heavy industry to overall economic performance.
These processes began in earnest longer ago than many observers today remember. U.S. multinational corporations quadrupled their investments overseas between 1957 and 1973—from $25 billion to $104 billion in constant dollars. And back in 1964, the “Ad Hoc Committee on the Triple Revolution” made headlines with a memorandum to President Johnson on the threat of mass technological unemployment as a result of automation. But this was just the beginning. As information technology supplanted smokestack industry at the vanguard of technological progress, and as demand for labor generally shifted in favor of more highly skilled workers, the working class didn’t just go into decline. It eventually disintegrated.
There is a great deal of nostalgia these days for the factory jobs and stable communities of the egalitarian 1950s and 1960s—when working-class life was as good as it ever got. The sense of loss is understandable, as nothing as promising or stable has replaced that way of life now gone. But this lament for what has been lost is the cry of the Children of Israel in the wilderness, longing for the relative comforts of Egypt. We must remember that, even in the halcyon postwar decades, blue-collar existence was a kind of bondage. And so the end of the working class, though experienced now as an overwhelmingly negative event, opens up at least the possibility of a better, freer future for ordinary workers.
The creation of the working class was capitalism’s original sin. The economic revolution that would ultimately liberate humanity from mass poverty was made possible by a new and brutal form of domination. Yes, employment relations were voluntary: a worker was always free to quit his job and seek a better position elsewhere. And yes, over time the institution of wage labor became the primary mechanism for translating capitalism’s miraculous productivity into higher living standards for ordinary people. Because of these facts, conservatives and libertarians have difficulty seeing what was problematic about the factory system.
We can dismiss the Marxist charge of economic exploitation through extraction of surplus value. Meager pay and appalling working conditions during the earlier stages of industrialization reflected not capitalist perfidy but objective reality. The abysmal poverty of the agrarian societies out of which industrialization emerged meant that nothing much better was affordable, or on offer to the great majority of families.
But that is not the end of the inquiry. We need to face the fact that workers routinely rebelled against the factory system that provided their livelihoods—not a normal response to mutually beneficial exchanges. First were the individual mutinies: no-shows and quitting were commonplace. During the early 20th century, absenteeism rates stood at 10 percent or higher in many U.S. industries, and the usual turnover rate for factory employees exceeded 100 percent a year. For those who made it to work, drinking, drug use, monkeywrenching to slow the line, and other acts of small-scale sabotage were regular outlets for sticking it to the man.
More consequential than these acts of private desperation were the incessant attempts to organize collective action in the teeth of ferocious opposition from both employers and, usually, the state. Mass labor movements were the universal reaction around the world to the introduction of the factory system. These movements aimed to effect change not only in the terms of employment at specific workplaces, but in the broader political system as well. Although socialist radicalism did not dominate the U.S. labor movement, it was the rule elsewhere as the Industrial Revolution wrought its “creative destruction” of earlier agrarian ways. Whether through revolutionary or democratic means, elimination of private ownership of industry and the wage system was the ultimate goal.
Since grinding poverty had long been the accepted norm in agrarian economies, what was it about industrial work that provoked such a powerfully negative response? One big difference was that the recurrent want and physical hardships of rural life had existed since time immemorial, and thus seemed part of the natural order. Likewise, the oppressive powers of the landed aristocracy were inherited, and sanctified by ancient custom. By contrast, the new energy-intensive, mechanized methods of production were jarringly novel and profoundly unnatural. And the new hierarchy of bourgeois master and proletarian servant had been erected intentionally by capitalists for their own private gain. There had been solace in the fatalism of the old Great Chain of Being: all the orders of society, from high to low, were equally subject to the transcendent dictates of God and nature. Inside the factory, though, industrialists subjected both nature and humanity to their own arbitrary wills, untethered from any inhibition of noblesse oblige. The traditional basis for the deference of low to high had been wrecked; the bourgeoisie’s new position at the top of the social pyramid was consequently precarious.
Another reason for the restiveness of industrial workers was the factory system’s creation of enabling circumstances. In other words, workers engaged in united resistance because they could. In the agrarian era, highly dispersed and immobile peasants faced nearly insuperable obstacles to organizing on a large scale—which is why peasant revolts were as uncommon as they were futile. The factory system dramatically reduced the costs of organizing for collective action by concentrating workers in large, crowded workplaces located in large, crowded cities. Toiling and living together at close quarters allowed individualized discontent to translate into concerted resistance. Solidarity was a consequence of falling transaction costs.
At the heart of the matter, though, was the nature of the work. According to the cold logic of mechanized production, the technical efficiency of the human element in that process is maximized when it is rendered as machine-like as possible. Machines achieve their phenomenal productivity by performing a sequence of discrete, simple tasks over and over again, always the same, always precisely and accurately, as rapidly as possible. Humans are most productive in filling in the gaps of mechanization when they perform likewise.
The problem, of course, is that people are not machines, and they don’t like being treated as such. By inducing millions of people to take up factory work and creating a social order in which those millions’ physical survival depended upon their doing such work for most of their waking hours, industrial capitalism created a state of affairs deeply inconsistent with the requirements of human flourishing—and, not unrelatedly, a highly unstable one at that.
Adam Smith saw the problem clearly at the very dawn of the Industrial Revolution. He opened his Wealth of Nations with a celebrated discussion of a pin factory, elaborating how the division of labor—breaking down pin manufacturing into numerous simple tasks that can be performed repetitively and speedily—made possible an enormous increase in output. Later in the work, however, he worried about the human toll of this highly specialized efficiency:
The man whose whole life is spent in performing a few simple operations, of which the effects are perhaps always the same, or very nearly the same, has no occasion to exert his understanding or to exercise his invention in finding out expedients for removing difficulties which never occur. He naturally loses, therefore, the habit of such exertion, and generally becomes as stupid and ignorant as it is possible for a human creature to become.
When Smith was writing, and for a long time thereafter, this psychological toll was all mixed up with acute physical suffering. But even as pay increased steadily and workplace hazards to life and limb receded over the course of the 20th century, the essential inhumanity of industrial work never changed. Consider these recollections from End of the Line, an oral history of Ford’s Michigan Truck Plant, published in the 1980s just as the industrial era was drawing to a close:
The next day I went in after school and worked ten hours. I thought I had gone to hell. I couldn’t believe what people were doing for money.
Management’s approach is that the simpler the work, the easier it is to train workers and the easier they are to replace. You can’t keep that from sinking into a person’s self-esteem…. Even though it gives us a certain amount of financial freedom, we are prisoners of the assembly line. You’re tied to a machine, and you’re just another cog. You have to do the same thing over and over again, all day long.
The way foremen talked to people, you soon realized you were a serf and the foreman was your master…. In those days ex-athletes, especially prizefighters, were highly valued as foremen in automobile plants, especially at Ford. They were big barroom brawlers, bouncers, scrappers, and fighters, people who could bully people and command respect because of their size.
To have the human body work like a machine—consistently, continuously, hour in, hour out, to produce a product—is inhuman…. It’s like you’re incarcerated from the minute you get there until it’s time to leave…. The first few weeks I was there, I thought the world was going to end.
I was going to quit after that first week. I was so tired. My hands were aching, and my whole body was a wreck. But when I got my first check, it was over $400 and I told myself, “Maybe I don’t hurt as bad as I thought I did.”
To have to mimic an unthinking machine all day, every day, was bad enough at the purely individual level. But to be subjected to this fate wasn’t a merely personal predicament; it was to be relegated to a whole class of people on the wrong side of an invidious social comparison. In pursuing the technical efficiency of mass production regardless of its human costs, the class system created by industrial capitalism divided people along very stark lines: those who work with their brains and those who work with their bodies; those who command and those who obey; those who are treated as full-fledged human beings and those who are treated as something less.
Conservatives and libertarians have tended to dismiss the issue of class. If there is formal legal equality, and if the wage bargain reflects supply and demand rather than expropriation, what could be the problem? The problem eludes them because they are blind to the sociological dimension of economic behavior. Although workers and managers were legally equal, their relationship was one of deep social inequality. If the capitalist class system wasn’t about narrowly defined exploitation or oppression, it was most certainly about domination.
The social inequality of the workplace fed off and in turn sustained other, pre-market sources of inequality. In England, where industrialization originated, a preexisting class hierarchy based on the enormous land holdings of the hereditary aristocracy made it easier for capitalists to think of their workers as a lower order who were useful only from the neck down. In America, a social order noted for its egalitarianism arose while the country remained an agrarian economy—an egalitarianism restricted, to be sure, to white Protestant men. Even that beachhead of equality was lost, though, when the American mass-production economy took off after the Civil War. The country imported a steep social hierarchy by feeding the insatiable demand for factory workers with thronging millions of non-Protestants from Ireland and southern and eastern Europe. Ethnic and religious prejudice by America’s white Protestant business class buttressed its sense of rightful dominance in the workplace, and the association of ethnic and racial minorities with dirty, menial drudgery reinforced the supremacist arrogance of their white-collared, white-skinned “betters.”
Even in the glory days of the Treaty of Detroit, the pact between capital and labor was a Faustian bargain. The wages paid to industrial labor were always a bribe to surrender one’s brain, and part of one’s soul, at the factory gate. In time the physical assaults and indignities of industrial work softened, and the pay packet fattened to afford material comforts earlier workers would never have dreamed of enjoying—but, however sweetened, it was still a deal with the devil. And as mass affluence prompted a cultural turn away from mere material accumulation and toward self-expression and personal fulfillment as life’s highest desiderata, the terms of that deal only grew more excruciating.
The nightmare of the industrial age was that the dependence of technological civilization on brute labor was never-ending. In Metropolis, Fritz Lang imagined that pampered elites in the gleaming towers of tomorrow would still owe their privileges to the groaning toil of the laboring masses. H. G. Wells, in The Time Machine, speculated that class divisions would eventually sunder humanity into two separate species, the Eloi and the Morlocks.
Those old nightmares are gone—and for that we owe a prayer of thanks. Never has there been a source of human conflict more incendiary than the reliance of mass progress on mass misery. In its most destructive expression, the nuclear arms race between the United States and the Soviet Union, it threatened the very survival of humanity. We are lucky to be rid of this curse.
But the old nightmare, alas, has been replaced with a new one. Before, the problem was the immense usefulness of dehumanizing work; now, it is feelings of uselessness that threaten to leach away people’s humanity. Anchored in their unquestioned usefulness, industrial workers could struggle personally to endure their lot for the sake of their families, and they could struggle collectively to better their lot. The working class’s struggle was the source of working-class identity and pride. For today’s post-working-class “precariat,” though, the anchor is gone, and people drift aimlessly from one dead-end job to the next. Being ill-used gave industrial workers the opportunity to find dignity in fighting back. But how does one fight back against being discarded and ignored? Where is the dignity in obsolescence?
The scale of the challenge facing us is immense. What valuable and respected contributions to society can ordinary people not flush with abstract analytical skills make? How can we mend fraying attachments to work, family, and community? There are volumes to write on these subjects, but there is at least one reason for hope.
We can hope for something better because, for the first time in history, we are free to choose something better. The low productivity of traditional agriculture meant that mass oppression was unavoidable; the social surplus was so meager that the fruits of civilization were available only to a tiny elite, and the specter of Malthusian catastrophe was never far from view. Once the possibilities of a productivity revolution through energy-intensive mass production were glimpsed, the creation of urban proletariats in one country after another was likewise driven by historical necessity. The economic incentives for industrializing were obvious and powerful, but the political incentives were truly decisive. When military might hinged on industrial success, geopolitical competition ensured that mass mobilizations of working classes would ensue.
No equivalent dynamics operate today. There is no iron law of history impelling us to treat the majority of our fellow citizens as superfluous afterthoughts. A more humane economy, and a more inclusive prosperity, is possible. For example, new technologies hold out the possibility of a radical reduction in the average size of economic enterprises, creating the possibility of work that is more creative and collaborative, at a scale convivial to family, community, and polis. All that holds us back is inertia and a failure of imagination—and perhaps a fear of what we have not yet experienced. There is a land of milk and honey beyond this wilderness, if we have the vision and resolve to reach it.
The post The End of the Working Class appeared first on The American Interest.
August 29, 2017
Let’s Break a Deal
In the coming days, the International Atomic Energy Agency (IAEA) is slated to issue its report on the state of Iran’s compliance with the Joint Comprehensive Plan of Action (JCPOA, the Iran nuclear deal). If, as seems likely, the IAEA declares Iran to be in compliance with the deal, that will set it at odds with President Trump. Against the recommendations of his own national security advisers, President Trump said in an interview with the Wall Street Journal last month that if it were up to him he would have declared Iran non-compliant at the start of his Administration, and that he expects to declare it non-compliant at either the sanctions waiver in September or the compliance certification in October.
Before he departed the White House, former White House Chief Strategist Steve Bannon asked former U.S. Ambassador to the UN John Bolton to prepare a strategy to abrogate the deal; that plan was published yesterday by National Review. Former intelligence officials say that the Administration is pressuring the intelligence community to find evidence of Iran’s non-compliance. And today, Iran rejected one of the Administration’s key desiderata, as Reuters reports:
Iran has dismissed a U.S. demand for U.N. nuclear inspectors to visit its military bases as “merely a dream” as Washington reviews a 2015 nuclear agreement between Tehran and six world powers, including the United States. […]
The U.S. ambassador to the United Nations, Nikki Haley, last week pressed the International Atomic Energy Agency (IAEA) to seek access to Iranian military bases to ensure that they were not concealing activities banned by the nuclear deal.
“Iran’s military sites are off limits … All information about these sites are classified,” Iranian government spokesman Mohammad Baqer Nobakht told a weekly news conference broadcast on state television. “Iran will never allow such visits. Don’t pay attention to such remarks that are only a dream.”
All of this suggests Trump will get his way, and that the U.S. will abrogate the deal within the next two months. What would that mean? A Trump Administration declaration that Iran is non-compliant with the deal would involve some combination of three components. First, it could cease waiving sanctions, most notably those related to oil exports. Second, a statement of non-compliance, per the Iran Nuclear Agreement Review Act of 2015, would then shift the burden to Congress to craft appropriate legislation to respond to Iran’s transgressions. Lastly, the Administration could take the issue to the UN Security Council and initiate the so-called “snap-back” sanctions provisions of the deal.
Under those provisions, the Security Council would need to affirmatively pass a resolution stating that Iran is compliant with the deal. Crucially, the United States would then have the power to unilaterally veto that resolution, and thus unilaterally re-impose the previous UN sanctions.
We’ve written extensively about the flaws of the Iran nuclear deal. But taking this unilateral approach poses overwhelming risks. First, even the Bolton plan for abrogating the deal makes clear that it would first and foremost require a major diplomatic effort to get our European and other allies onside. As we’ve written before, there is no indication that this Administration has made even a preliminary effort to do so. If anything, it is Iran that is making the most concerted diplomatic effort to make clear that the United States would be responsible if the deal fails. The head of Iran’s Atomic Energy Organization, Ali Akbar Salehi, said over the weekend that if the U.S. withdraws from the deal but the other parties to the JCPOA (i.e., Russia, China, France, the UK, Germany, and the EU) remain committed, Iran will remain committed to the deal as well.
There are certainly ways that the United States could try to re-engineer aspects of the Iran nuclear deal, or to otherwise tailor sanctions to confront Iran’s other, non-nuclear bad behavior. But without the support of our regional or European allies, and without an independent IAEA or U.S. intelligence community assessment that Iran is substantively violating the nuclear deal, unilateral abrogation could be disastrous. Iran claims that, absolved of its obligations, it could resume production of highly enriched uranium in a matter of days. In an atmosphere of profound distrust between European leaders and President Trump, there’s reason to doubt that they would resume sanctions, let alone follow the United States into any military action needed to destroy Iran’s enrichment capacity.
The Iran deal remains a bad one. But without that diplomatic groundwork of cooperation from our partners and allies, it’s a deal that the United States should not simply throw away.
The post Let’s Break a Deal appeared first on The American Interest.
The Return of the “Old Normal”
Ever since Europeans used their superior technology to colonize virtually the entire world between the 16th and early 20th centuries, power on a more or less global scale has been concentrated in the hands of the state. Of course, large swathes of territory always remained beyond the reaches of weak states and para-states, like the feudatory domains of Indian maharajas and the lightly governed colonies in which the colonizer focused mainly on securing areas of European residency and resource exploitation. Postcolonial states were similarly weak, often only existing on paper outside capitals and key districts even if they were able to forge enough of a political settlement to sustain the fiction that they genuinely ruled everywhere. Still, non-state actors have usually been limited in their ability to directly, let alone successfully, challenge national or colonial authority—which is why the occasional exceptions, like the Mahdi rebellion in the Sudan in the 1880s and 1890s or the Pashtun tribes in parts of Pakistan, stand out so much.
But today non-state groups from ISIS to transnational crime syndicates—aided to some extent unwittingly by a de-nationalized Davoisie jet-set of wealthy individuals and their assorted court jesters—deploy an assortment of tactics and new technologies that strengthen their power to organize, mobilize, fight, and wield influence.1 As a variety of non-state actors grows stronger, more states are either failing or losing the capacity to fully control their territory.
All this is by now well known, not least because scores of scholars, secular prophets, and lesser pundits have been predicting the decline or demise of the state in one form or another for more than a century. Such predictions have repeatedly come up short, with the state demonstrating more staying power than expected. But four trends have picked up steam in recent years, suggesting that such predictions were not wrong, just premature and to a certain extent focused too narrowly on certain parts of the world.
Although all countries are affected, the trends undermining the capacities of states are most devastating to those that are already weak for structural reasons. Clearly, weak states getting weaker have contributed to the rise in violent conflict in recent years, especially in the Middle East, sub-Saharan Africa, and parts of Asia, as militias, ethnic groups, tribes, terrorists, and warlords control increasingly large swaths of territory, or at least deny control to central authorities. In many places, governments simply lack the capacity to counter ever more powerful non-state actors, especially when they arise in outlying areas. More than a third of the African continent, for instance, lies beyond the control of central governments.2 In all, close to twenty countries are now divided into pockets of weak public authority, some form of local governance, instability, chronic low-level conflict, extremist control, and lawlessness.
What is not so well known is that the world has looked a lot like this before. Conditions in many non-Western countries today resemble those that existed before the rise of colonial empires—the “old normal,” we might call it. Before the spread of British, French, Spanish, Dutch, and Portuguese rule, states and proto-states around the world often lacked firm boundaries. Large areas lay beyond their de facto grasp, tending to make power and territory more contested from within and across notional boundaries, and hence violence more common.
Nevertheless, the U.S. and most other Western governments act as if the rise of failing or partly failing states is an aberration, the implicit conclusion being that such countries can be put back together again, Humpty-Dumpty-style, with outside assistance. To a considerable extent, that is what the U.S. government’s foreign policy/national security sectors have tried to do since the end of the Cold War, believing that transforming other states into imitations of our own will conduce to greater international security and thus ultimately serve U.S. interests as the major status quo global power.3 But peace talks, national unity governments, peacekeeping troops, military-security training, technical advice, and elections are insufficient to build a legitimate state where none exists—and all have proved futile in places like Somalia, South Sudan, Afghanistan, Libya, Central Africa, and several other countries in the Sahel.
There is no little irony in this assumption that sovereignty can be re-strengthened from without, since the deepening Western commercial and cultural penetration of non-Western regions over the past several decades is an unintended source of much state destabilization in the first place. But that inadvertence aside, the new “old normal” isn’t going to disappear; governance and peace-building arrangements that depend on a reasonably strong central state are no longer viable in many places. A hoary assumption—that the new states that emerged out of European colonial empires in the late 1950s and 1960s would eventually find their way to legitimacy and institutional competence—has proved largely untrue. In many cases, even if capacities have gradually improved, they have not kept up with social change and the growing might of non-state actors, whose capacities have increased substantially.
The time has come to admit the unpleasant. Some non-Western sovereignties cannot be “restored” because these sovereignties never built the institutions of a Weberian state in the first place, remaining instead locked in patrimonial politics that have militated against the expansion of their formal state institutions. Instead, we must be willing to build on a territory’s social fabric and indigenous capacities and to develop hybrid forms of sovereignty if we are to assist these places’ return to anything like stability given current global political-technological circumstances. No other arrangement is sustainable—especially in places like Libya where there is little appetite for foreign troops, and where the so-called international community has little desire to intervene in any meaningful way. The return of the “old normal” won’t be pretty or pleasant. But it’s inevitable, and in our own security interests we need to tackle it head on.
Four Trends
Although some countries are more vulnerable than others to shocks or stresses, most fragile states have been able to maintain “good enough” stability for extended periods since independence using a combination of patronage and repression. Regimes might come and go and not be particularly inclusive, effective, or just, but at least they kept the peace and delivered some services. With power concentrated at the center, the government had by far the strongest coercive force and a near-monopoly on media and communications. Governments may have been weak in many ways, but their opponents were generally even weaker (think northern Mali before 2012 or Nepal before the 1990s if you need examples). The latter had limited capacity to organize, arm, communicate with the general population, and contest the status quo.
But several trends are changing the balance of forces, making the more fragile states less sustainable in their current form, and more likely to tip over into instability than before. Four such trends are key, one each having to do with ideology, technology, weapons proliferation, and the larger structure of global power distribution.
New ideologies are increasing the centrifugal forces acting on states.4 Whereas once it was thought (at least in the West) that liberal democracy would triumph everywhere, it is now clear that other ideas are more attractive to many. Indeed, democracy and capitalism have often failed to fulfill their promises in many developing countries, at least partly because they require greater cohesion and better institutions than these states can muster. Meanwhile, new (or resurgent) ideas about identity and faith have proliferated in response to the pressures of globalization.
The backlash takes different forms in different places, but it has been highly destabilizing in countries already plagued by high levels of social fragmentation. Muslim countries are especially vulnerable because Islam often provides a transnational identity that is relatively easy to mobilize across borders. Indeed, the global spike in violent conflict is concentrated in the Middle East and surrounding territory (the Sahel, Somalia, Afghanistan) where allegiance to the state, generally weak in patrimonial societies anyway, is being supplanted by more enduring ethnic, religious, and tribal loyalties. Jihadism has spread worldwide, attracting groups (including youth) that have historically been marginalized by the exclusionary style of governance common in fragile states. Most Arab states in the post-World War II era have thus been caught in a double bind: Loyalty to the state (wataniya) has been challenged from without in the form of pan-Arabism (qawmiya) or pan-Islamism, and from within by the strong pull of primordial tribal and sometimes, in multiethnic states like Iraq, ethnic affinities.
New communications technology is empowering non-state actors and weakening national cohesion.5 Whereas once the state had a monopoly on media and communications, now it controls neither. The proliferation of cell phones, smartphones, new television channels, the Internet, and social media has weakened the legitimacy of many governments while promoting societal fragmentation along ideological or identity lines. Different narratives about the past, present, and future now compete with what the government says. Different loyalties—some subnational, others supranational—now compete with loyalty to the country itself. Individuals have more power to challenge ineffective governments, but weak institutions have rarely become more effective as a result; in some cases, they have actually become worse. Groups have far greater capacity to organize around a common cause and challenge authority. Whereas before they could organize only quietly and through trusted contacts, now they can employ cellular networks and mobilize people with agility, on a scale previously unimaginable. The efficiency of non-state actors has climbed dramatically, with no discernible increase in the power of weak states.
The proliferation of weapons is weakening the state’s significant edge in using violence.6 Non-state actors have access to more sophisticated weapons than ever before. Some of this is due to technological change—individuals and small groups can buy cheaper weapons or even develop their own. The failure of countries such as Libya has increased supply (storage depots were pillaged), as has the entry into the weapons market of new small-arms exporters such as China. The expansion of international criminal networks has also played a role, opening new channels of supply.
The result is that non-state actors from the Levant to the Sinai to the Kivus only need cash in order to buy everything from machine guns to grenade launchers to mortars to explosives. Larger sources of income—from, among other things, illicit mining, smuggling, kidnapping, selling drugs, and taxing local people under their control—have increased their ability to pay (as well as pay off officials in their way). In addition, rising anxiety about extremist Islamist groups operating in countries with weak states has led global and regional powers to directly supply other non-state actors with weapons in places such as Syria, Iraq, and Libya in order to better fight terrorism. This could easily sire counterproductive outcomes.
An increasingly multipolar power dynamic is weakening the international response.7 The rise or reemergence of China, India, Russia, and regional powers such as Turkey, Iran, Saudi Arabia, and Ethiopia has produced a fragmented, divided international order with less capacity to impose its will than was the case even ten or fifteen years ago. (The so-called international community was similarly divided during the Cold War, but between only two protagonists.) These new powers have their own ideas about how the international system should be run and their own interests to protect. In some cases, they directly oppose Western ideas and interests, as in Syria and Ukraine. In others, they compete with each other—and the West—to advance their aims, as in Yemen and increasingly in Africa.
At the same time, the U.S. government, and the rest of the West alongside it, has shown a growing reluctance to project force and defend the international order it created due to changing ideas at home about its role in the world coupled with perceived economic and financial weakness. European leaders have, generally speaking, been reactive bystanders as all this has happened. Even in those situations perceived to be of direct concern to them, as in Libya, they have failed to act decisively on behalf of their own security and broader interests. These dynamics are obvious as well in Syria, where the West has played a subordinate role as an array of international actors (Russia, Turkey, Saudi Arabia, Iran, and others) have aided both sides of the conflict and are collectively too divided to end it.
Although religion, ethnicity, clan, and other forms of identity play a major role in the disruption taking place, not all non-state actors are organized this way: Among the biggest beneficiaries of these trends are criminal networks that often have as much power as, or even more than, weak states. They are better financed, better able to leverage globalization, and not restrained by the need to uphold certain standards of governance. Drug cartels, human traffickers, computer hackers, counterfeiters, arms dealers, and others not only benefit from these trends but also actively undermine fragile state capacity around the world.
Many weak states may be gradually getting more capable as their education levels climb and organizational capacities grow, but such progress has been painfully slow in many places and nonexistent in others. Non-state actors are gaining ground faster because the above trends have augmented their ability to organize, challenge, and disrupt existing political orders. The violence that increasingly occurs as a result is thus less a reflection of changes in the states than of changes in the non-state actors: The latter are simply shredding the illusions of power that governments have long sought to broadcast at home and abroad.
Whereas some fragile states have formal institutions that could play some sort of constructive role (for example, Colombia and Sri Lanka), others have such weak writs that non-state actors have already become nearly as or even more relevant than the state (Lebanon, Libya, Somalia). While the former group can use a differentiated form of traditional tools to promote accountability and development, the latter, most vulnerable group needs something new.
The states most at risk of partial or complete dissolution are concentrated along an arc from Central Africa to Central Asia. They include Libya, Afghanistan, the Democratic Republic of the Congo (DRC), the Central African Republic, and many of the countries in the Levant, the Sahel, and the Greater Horn of Africa (including Yemen). Most of these places experienced colonization on the cheap for relatively short periods of time and ever since have consistently underinvested in the ingredients of state capacity—including higher education, social cohesion, and managerial skills. Colonizers made little attempt to align borders with sociopolitical realities, or to invest in state institutions except in places where they held sway for very long periods and had ambitions beyond resource extraction, as in India. Post-colonization elites and donors have rarely taken effective steps to counter the challenges such conditions brought.
The political map of the world is becoming more complex and dynamic than it has been in recent decades. Instead of being populated by countries that are easy to identify and to measure against one another, each bounded by recognizable (if not always agreed-upon) borders, it will consist of an assortment of states, non-state forms of public authority, and areas with no public authority. Some areas will have strong government, other areas weak government, and yet others no government at all save some form of local arrangement. Boundaries between these different zones are likely to be unstable in a way that the mapmakers of recent centuries would not recognize, sometimes changing regularly as the balance of power between states or between states and non-state actors shifts.
Creative Governance Solutions from the Past
Winston Churchill said: “The longer you can look back, the further you can see forward.”8 Or to put it differently: The better we understand the world before the Age of Imperialism, the better we will be able to address the challenges of fragile states in the future.
Before the expansion of European empires starting around 1700, large segments of Southeast Asia, Africa, Central Asia, and the Arabian Peninsula (and everywhere else at one time, for the history of robust government is relatively thin) either were occupied by weak states or had no states at all. This was the case for much the same reason that large parts of the world have fragile states today: They lacked the capacity to effectively govern much of the territory they claimed to control, or concluded that the cost of asserting control over peripheries was not worth the benefits. Incentives encouraged leaders to rule exclusively and to minimize investments in outlying areas that yielded few material advantages. Social divisions and harsh conditions meant that populations had little loyalty to the states they inhabited, especially as migration was common in many places. People moving around the globe isn’t new either.
Governments adjusted their sovereignty to fit their ability to project power. In central or easily reached areas, they exerted a high degree of control. Farther out into “the marches of empire,” they exerted much less. In West Africa, for instance, the Ashanti, who ruled for two centuries in the area around present-day Ghana, conceived of power “as a series of concentric circles . . . rippling out from a center point,” according to scholar Jeffrey Herbst.9 Northern Nigeria’s Sokoto Caliphate, one of the largest and most powerful empires in sub-Saharan Africa until British conquest in 1903, exercised power similarly.10 So did the Ottoman Empire, which allowed broad autonomy for inner Arabia and North Africa after the 18th century. Outlying regions had much autonomy in these countries; sovereignty was often divided and plural when multiple power centers could claim some role in their affairs—as, for example, in 19th-century Tunisia, which was caught between Istanbul and Paris. A similar landscape prevailed throughout the world at one point and continued in most places until Western ascendance.
States experimented with a wide variety of models and forms to maximize their effectiveness and reach and to minimize the blowback from beyond their outer limits.11 Power was continuously renegotiated, especially in the hinterlands. Balance was important to prevent centrifugal forces from undermining authority.
The creative strategies that regimes were forced to adopt in response to their challenges can shed light on how the weakest states today might better manage their own limitations. These included:
Sharing power with local leaders, whether warlords, feudal lords, chiefs, religious leaders, or elders. This allowed locals maximum flexibility to manage their own affairs with little interference as long as they declared allegiance, paid taxes, didn’t disrupt trade, and didn’t pose a threat.
Sharing power with neighbors. Although usually not articulated as a strategy, some regimes accepted forms of plural sovereignty in which two neighbors had overlapping claims but did not fight over territory; in some cases, different rights were recognized for different regimes in the same territory, as was the case every time the United States used the Monroe Doctrine to assert its rights throughout the Western Hemisphere in order to protect its own security.
Developing national institutions that had a much more limited role than central governments do today. This meant developing mutual defense, foreign policy, a single body of law, some form of court or arbitration system, and some form of taxation, but not much more, leaving the rest to the component parts—a kind of confederacy. Leadership was often corporate such that local representatives or regional heads directly participated or rotated participation; parliaments were assemblies of representatives, not directly elected, and had relatively limited authority. Such practices used to be much more common in Europe and continue to influence Switzerland, which has long had a different system of government than its neighbors, based on cantons, shared sovereignty, and consensus about what the limited national government should do, and not do.
Creating different arrangements for different parts of countries (instead of using a comprehensive, one-size-fits-all arrangement), depending on their strategic importance, population density, ease of reach, ease of governance, and the local or regional balance of power. Pakistan’s Northwest Frontier area constitutes an example, though it is today a cautionary example of what “light” rule can lead to in terms of security threats.
Cordoning off areas that were hard to control or posed threats rather than seeking to project authority into them (thus, the Chinese built the Great Wall rather than seeking to extend their realm northward into the steppes).
How can governments and their international partners adapt these strategies in today’s conditions? To be sure, applying some of them would carry risks, which ideally will be taken into account when formulating the best way forward given particular contexts. If too much power is removed from central authorities, for instance, local actors may build enough of a power base so as to threaten the center or at least make conflict more rather than less likely. But in many cases, few if any better alternatives will exist. Local and international leaders will have to accept the conditions in these places as they are, not as they prefer them to be.
Not all countries can be built from the top down, which is the current default strategy for forging peace, rebuilding states, and strengthening governance. In many cases, a more bottom-up or horizontal approach is more likely to leverage the strongest pockets of social cohesion and capable governance that currently exist.
In Libya, for example, international attempts to bring order and stability through a top-down national unity government will continue to fail if done in isolation. The country simply lacks the social cohesion and institutional capacity to establish a robust central government that can control all of its territory in the absence of an authoritarian hand. A more prudent approach would work from the bottom up, focusing on Libya’s strongest political assets: functional local governance and effective, tribal-based conflict management mechanisms. An agreement or set of agreements among the most powerful local actors would have a better chance of success, especially if top-down efforts were recalibrated to take into account the need to complement rather than supplant such efforts, and were more modest in scope. Ideally, the two processes should be organized in such a way as to reinforce each other in a virtuous cycle. The result would be only piecemeal progress, but it would reverse the backward momentum that exists today and encourage other parts of the country to move in a similar manner. Similar dynamics based on tribes or cohesive local identity groups exist in Somalia, Yemen, and Afghanistan—and call for similar approaches.
A related strategy would be to focus efforts on a set of urban areas, which often have greater social cohesion and relatively stronger institutions than weak states. This approach might offer a better pathway forward in large, sprawling countries, such as Nigeria and the DRC, as well as in deeply divided countries such as Kenya. Such an approach would strategically employ urbanization and decentralization—which could mean federalism but does not have to—in order to launch new urban-based governance (UBG) models. Greatly empowered mayors—or district governors—would be tasked with larger portfolios, handling most facets of government in their areas. This approach has already been proven to work in Lagos, which has more power than most developing-country cities because of Nigeria’s federal structure, and which is better governed than the state as a whole.12
Countries whose governance capacity is limited and that have difficulty extending their reach to distant territories (or face substantial opposition from locals in doing so) should consider either radically decentralizing—including ceding some sovereignty to locals—or working with neighbors or international actors to co-manage parts of their territory. The center would still play a supportive role by providing some financial and technical assistance, setting minimum standards for governance, and arbitrating differences between groups. This already occurs formally in Kurdistan and the aforementioned tribal areas of Pakistan, and informally in Somalia and Burma, where local statelets manage their own affairs, as well as wherever central governments cannot maintain control over their territory without international help, such as in the DRC, Mali, and the Central African Republic. Such an approach may be the only way to stabilize the Sahel and Horn of Africa, where non-state actors have clearly shown their strength vis-à-vis the state.
In territory where decentralization alone is insufficient because the financial and administrative cost of re-establishing any sort of formal public authority is prohibitive (such as in parts of the Sahel), governments could consider whether it is in their best interests even to try to do so. Instead, their interests may be best served by eliminating their role altogether. Locals could be left to govern themselves as long as they did not disrupt the core geography of a state. Alternatively, countries could consider whether asking neighbors (for example, Algeria, arguably the strongest state abutting the Sahel) or international actors (such as France) to formally play a long-term role might be more productive, especially if they have an equal or larger stake in stabilizing the area (for example, ensuring terrorists, secessionists, or criminals don’t use it).
Countries should also consider more creative ways of unifying disparate groups behind national governments than they do today. Instead of focusing on building governments with strong, comprehensive mandates and powerful leaders in deeply divided polities with weak institutions—which creates zero-sum struggles for power and resources—they should aim to build a weaker center with a limited remit that is dependent upon consensus (or a significant majority) to make major decisions. Rotating presidencies and executive committees that combine representatives from major groups are more likely to gain traction because more major actors would have a stake in the outcome, and the risk of exclusion is lower. Indeed, it is hard to imagine countries such as Iraq surviving as states over the long term without a truncated form of central government along these lines; states will have little legitimacy among minority groups if they do not cede control of most of the functions that matter to them. Where social bonds across groups are less tenuous—in states less broken than Iraq—more effort should be invested in strengthening social cohesion and the inclusiveness of policies in order to reduce the risk of conflict or failure.
Lastly, in a select number of cases, more flexibility regarding borders may be necessary. Places such as Syria, Somalia, and the DRC may be impossible to hold together or reassemble. Although changing borders should be done reluctantly, it should not be ruled out in the most difficult cases, or where there is a long history of suppressing a particular identity group (such as the Kurds in Iraq).
A New Framework for the Old Normal
The United States and other international actors certainly need to think more creatively. Whereas international aid helped many countries transition out of authoritarianism and conflict in the 1970s, 1980s, and 1990s, it has proved much less successful over the past decade and a half because the countries involved are much more fragile—with weaker cohesion and weaker institutions. Countries like Spain, Poland, Brazil, and even South Africa transitioned relatively seamlessly, but Iraq, Yemen, the DRC, Nepal, Afghanistan, and Nigeria have struggled. More and more states are at risk of dissolution or weakening into something almost irrelevant for large portions of their populations and territories. Yet the major Western state powers continue to approach these places with strategies based heavily on promoting democracy and strengthening state governance at the national level—neither of which can stabilize these places given their existing structural conditions.
The new era requires a framework that recognizes explicitly that the “old normal” is here to stay. Even though the world is a much different place than it was centuries ago before the era of European imperialism, the strategies outlined above can inform creative governance concepts that can help better stabilize today’s weak states.
This framework should begin with cooperation, both among the world’s leading states and between them and regional powers. The more international actors line up behind a single strategy, the more likely political order can be reestablished. This may require agreements not to undermine each other’s authority at home and to take into account each other’s interests in some regions abroad. The result may be much less than ideal at times—such as when it requires accepting that the Assads will stay in power in Damascus—but such actions will ultimately advance American interests by advancing a broader regional stability.
Cooperation should occur in a range of spheres. International actors should: invest in regional organizations, which in many cases are better placed to address the myriad problems of weak states but rarely have the capacity and resources to do so; make greater use of jointly managed or externally anchored institutions, such as the CFA franc in parts of Africa (co-managed by France and its former colonies), the International Commission Against Impunity in Guatemala (sponsored by the United Nations), and the various mechanisms used by the European Union to upgrade the institutions of prospective members; approach a number of strategically important countries (for example, Egypt, Ethiopia, and Nigeria) for long-term partnerships that aim to strengthen their stability, institutions, inclusiveness, and economic dynamism, so that they might anchor their regions and thus provide some form of positive spillover to weaker neighbors; invest heavily in assessing and understanding the structural constraints and dynamics of fragile states, so that policy reflects context far better than it often does today; and invest more in developing the human resources and organizational capacity (for example, through universities, law schools, and governance academies) within weak states.
The U.S. government will be much better positioned to navigate this new world if it can break the habit of focusing on the central state, and come to better appreciate that the disaggregation of power is sometimes essential to stability. Partnering with local leaders based on a deeper understanding of local landscapes and actors is essential. Diplomacy, development, and defense (3D) will all need to establish more realistic goals and political strategies that recognize the importance of focusing on a wide variety of actors across a landscape instead of just those jockeying for power in capitals; that tradeoffs between competing goals (for example, political order and competitive politics) are necessary; and that progress is liable to be incremental at best. In many cases, nascent local efforts to end conflicts and establish political order will need to be better protected from outside attempts to disrupt or capture them if they are to gain traction and grow in scale.
None of this requires a return to the large-scale interventions of the past. Instead, the U.S. government should be focused on providing low-cost, low-visibility, politically astute technical and financial assistance to promote local efforts to improve governance, reconstruction, and development. Such a mission requires that Washington reconfigure American institutions in order to better respond to complexities on the ground; shift and grow 3D resources to better understand community-level contexts; establish dedicated units in embassies to apply integrated 3D under unified leadership; develop country experts dedicated to long stays in those countries; build easily deployable teams of “special force” governance and mediation experts; create flexible financing mechanisms for non-central state actors; and improve the capacity for policy to reflect feedback from the ground.
The United States and other leading international actors ought to advance reforms with great care in fragile yet stable countries (such as Jordan and Ethiopia), which are highly vulnerable to shocks. We need a greater appreciation for the institutions that do work—no matter what form they take and how imperfect they may be at times. In these places, change that preserves stability can only happen incrementally, in ways that do not undermine or disrupt whatever system exists. Syria, for instance, was highly repressive before 2011, but it provided a decent level of public services, protected minorities, and sought to improve living standards. Regime change was always far less likely to improve the lot of the country’s citizens than efforts aimed at gradually increasing inclusiveness, economic opportunity, basic human rights, and the rule of law.13 It may be better to evaluate how states perform on measures such as these than just to look at what shapes their formal institutions take.
Cooperation and incremental change will work better in some places than in others, and will depend on the ability of locals to combine efforts and attract strategic outside assistance to combat spoilers and extremist elements in their midst. Although outsider interest will often focus on limiting spillover, prioritizing security above all else in the short term does not safeguard security for the long term; in some cases, it may do the reverse. Drones and Special Forces will never be better at reducing the problems these countries face than indigenous capacities for conflict management and governance. The goal of outsiders should be to make the investments that make the latter more effective, scalable, and sustainable.
An era in which non-state actors exercise more power than weak governments—the “old normal”—is not an aberration or “growing pains” leading to a stronger Westphalian order. Indeed, the state model may never work well in some parts of the world. The new era requires a new set of strategies and institutions if we are to minimize the adverse impact on our own security and well-being and increase the security and well-being of the people in fragile states.
1The “twin” attack on the state is analyzed by Nils Gilman in “The Twin Insurgency,” The American Interest (July/August 2014).
2Pierre Englebert, “The ‘Real’ Map of Africa,” Foreign Affairs, November 8, 2015; conversation with Clionadh Raleigh, October 2016.
3James Jeffrey, “The State of State,” The American Interest (July/August 2017).
4See, for instance, Galip Dalay, “Kurdish Nationalism Will Shape the Region’s Future,” Al Jazeera, July 12, 2015.
5See Mark Mazzetti & Michael Gordon, “ISIS Is Winning the Social Media War, U.S. Concludes,” New York Times, June 12, 2015.
6See, for instance, C.J. Chivers, “Facebook Groups Act as Weapons Bazaars for Militias,” New York Times, April 6, 2016.
7See, for instance, Anne Barnard, “In Fight for Aleppo, Tangled Alliances in Syria Add to Chaos,” New York Times, October 6, 2016.
8Chris Wrigley, Winston Churchill: A Biographical Companion (ABC-CLIO, 2002), p. xxiv.
9Herbst, State and Power in Africa: Comparative Lessons in Authority and Control (Princeton University Press, 2000), pp. 45–8.
10Ibid.
11Two books do an especially good job discussing how pre-modern states adapted to their circumstances: the aforementioned Herbst book, State and Power in Africa, and James Scott, The Art of Not Being Governed: An Anarchist History of Upland Southeast Asia (Yale University Press, 2009).
12Seth Kaplan, “What Makes Lagos a Model City,” New York Times, January 7, 2014.
13Seth Kaplan, “The Perils of Regime Change in Syria,” Global Dashboard, November 22, 2011.
The post The Return of the “Old Normal” appeared first on The American Interest.
Broken Records
One of the promises of a computerized age was that patients would be able to control their electronic health records (EHRs), and have all the reams of data produced by their tests and doctors’ appointments in one easily accessed online location. Of course, as all of us who use our doctors’ patient portals know full well, what we have in reality is a collection of interfaces that would look more at home in 2000, function poorly at best, and don’t talk to each other. Patients find them bewildering and often don’t even use them. Doctors, for their part, say they spend far too much time filling out interminable lists of questions for EHRs—one study in 2016 found they spend fully half their time filling out forms. The resulting record for a single visit, if printed, can run to more than one hundred pages. Too many doctors copy and paste data from older forms, sometimes preserving outdated information and leading to errors.
The worst part about this is that the only people with a real incentive to change the system are policymakers. If all records were interoperable, EHR vendors would have less of a hold on their customers. Meanwhile, doctors and hospitals also like keeping their patients, and it’s harder to switch providers if you can’t easily take your records with you. And though the law that incentivized EHR adoption did recognize the importance of interoperability, it wasn’t a first-order requirement, probably due to the extra expense. Of course, now that these systems are up and running, making them interoperable is both more expensive and more complicated. (While sharing health records always brings up questions about security and privacy, about 90 percent of providers already use EHRs. They aren’t going away, so the best we can do is make them more useful.)
Is this fixable? Yes, says Julia Adler-Milstein, a scholar working on EHRs. Of course the government could incentivize interoperability across the board, but it might be more effective to make it unavoidable, practically speaking. If, for example, regulations require providers to cut down on unnecessary tests by making sure that the same patient doesn’t get the same test twice, vendors will have to share their records. Alternatively, requirements for community monitoring would have the same effect—make providers responsible for reporting whether their patients get immunizations, whether they get them from that doctor or from any other provider.
These recommendations dovetail nicely with recent calls for better monitoring of opioid prescriptions. The President’s commission on the opioid crisis recently called for all state monitoring programs, called PDMPs, to be interoperable. PDMPs collect data from pharmacies, and doctors treating an opioid-addicted patient are not required to talk to the patient’s previous doctors. (Shockingly, according to a 2016 report, few doctors actually request information from their state PDMPs before writing prescriptions.) As alarm rises over the steady increase in opioid deaths in this country, there might be political momentum (and thus public funds) available for mandating the interoperability of doctors’ and hospital records, so they can treat addicts better and forestall further deaths.
On the other side of this problem from the political difficulties, there’s the technology to consider. Making systems talk to each other is always an expensive proposition, but what if there’s a digital-age solution? Researchers at the MIT Media Lab and Beth Israel Deaconess Medical Center propose putting our health records on a blockchain, so that the patient controls a secure, time-stamped log of all her records and can order them to be sent to whomever she wants, whenever she wants. Best of all, it doesn’t require giving all the data to another company to control; blockchain devolves the responsibility for these records to the patient herself. Though this isn’t likely to become tomorrow’s solution, the next decade may well bring revolutionary changes to the technology—if not this particular innovation, then something just as dramatic.
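To make the idea concrete, here is a minimal sketch in Python of what such a patient-held log might look like: a chain of time-stamped pointers to records, with each entry hashed together with the one before it so that tampering or reordering is detectable. The names and details are invented for illustration only; this is not the researchers’ actual design, which relies on a real blockchain and smart contracts rather than a single local data structure.

```python
import hashlib
import json
import time

class RecordLog:
    """A toy, patient-held chain of pointers to medical records."""

    def __init__(self):
        self.entries = []

    def add_record(self, provider, record_uri, record_hash):
        # Link each new entry to the hash of the previous one.
        prev_hash = self.entries[-1]["entry_hash"] if self.entries else "0" * 64
        entry = {
            "timestamp": time.time(),
            "provider": provider,
            "record_uri": record_uri,    # where the full record actually lives
            "record_hash": record_hash,  # fingerprint of the record's contents
            "prev_hash": prev_hash,
        }
        entry["entry_hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append(entry)
        return entry

    def verify(self):
        """Return True if no entry has been altered or reordered."""
        prev_hash = "0" * 64
        for entry in self.entries:
            if entry["prev_hash"] != prev_hash:
                return False
            body = {k: v for k, v in entry.items() if k != "entry_hash"}
            recomputed = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if recomputed != entry["entry_hash"]:
                return False
            prev_hash = entry["entry_hash"]
        return True

# The patient adds pointers as she visits providers, then shares the log at will.
log = RecordLog()
log.add_record("Hospital visit", "https://records.example.org/visit/123", "ab12...")
log.add_record("Pharmacy refill", "https://records.example.org/rx/456", "cd34...")
assert log.verify()
```

The point of the exercise is simply that the decision to share moves to whoever holds the log; whether that log lives on a public blockchain, a consortium ledger, or the patient’s phone is a separate question.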
Meanwhile, many recent innovations make managing conditions easier for the patient—remote monitoring of diabetes and heart conditions, to take two examples. These kinds of monitors lend themselves to, and indeed demand, a comprehensive record system to go with the comprehensive data they provide.
The post Broken Records appeared first on The American Interest.