Peter L. Berger's Blog, page 58

January 7, 2019

A Conversation with Gary Hart

Adam Garfinkle, for The American Interest: Gary, when did you first find out about Matt Bai’s intention to make a movie about the events of May 1987?  Did the screenwriters, directors, or producers come to you at an early stage, or did you find out about it some other way? Did they ask for your advice, or consult you in any way? Have you met or spoken with the producer and co-screenplay writer Matt Bai?

Gary Hart: I believe I first learned of this project in mid-to-late summer 2017. Mutual friends heard about it from Matt Bai, relayed the information to my son John, who informed me. The director, Jason Reitman, came to Evergreen in early August and we discussed the project over lunch. At that time he said he wanted to do the movie to answer his friends’ anguished questions: How did we get Trump?

I never saw the screenplay nor was I consulted at any point. I got to know Mr. Bai when he interviewed me for his 2014 book, All the Truth Is Out: The Week Politics Went Tabloid. But I did not speak to him during the filming.

TAI: How did you feel when you first found out about the intention to make The Front Runner? I mean, it was hard enough the first time. (Even I cringed from a distance, and I barely knew you at the time, before Hart-Rudman Commission days.) Who wants to have to relive something like that? And maybe worse, have to watch your wife and family relive it?

GH: I’m not good at the “feel” questions. I thought that, 30 years on, it was a bit bizarre, but I certainly bought the early premise that my experience ultimately led down the line to Trump. My stepping away from politics speech later that week (still on YouTube) had the memorable line that was preserved in the movie: “If we continue down the road begun this week, to paraphrase Jefferson, I tremble for my country to think of the kind of leaders we will deserve.” The message to the press was: You can’t have it both ways—intrusion into privacy and quality leadership. I believe I was right.

TAI: How did you first see the movie—a private screener, or what?  Has your wife seen it; did she see it with you? Her reaction?

GH: My wife Lee, our son John, and I saw it at a private screening a week or two before its limited release. Lee was non-committal. She did not like seeing herself portrayed on screen and did not enjoy reliving the trauma.

TAI: I thought the storyline felt like a pretty accurate depiction of what seemed to be happening at the time; I thought the disheveled habits of the predatory press were well depicted, as were the standard alcohol consumption levels. The “main” character depictions seemed pretty accurate, too, with one exception. There is a scene in which you (Hugh Jackman, actually) are shouting at your campaign manager, clearly having lost your temper at him. That strikes me as very unlike you. Can you remember ever losing your temper and shouting at anyone working for you during this ordeal?

GH: The movie is a dramatization, not a documentary. Many scenes, especially between Lee and me, never happened. And, of course, all the dialogue is invented. My main concern was for Lee’s depiction and Vera Farmiga did an outstanding job capturing her reserved strength. Otherwise, right: I do not shout, especially at friends.

TAI: Are you left-handed? Hugh Jackman is in the movie.

GH: I’m right-handed, but Hugh wore my neckties and belt, and tried to get my hair.

TAI: Ah, the hair. There’s a line in the movie where some campaign aides are discussing you as they watch you standing at a podium, and one of them says something about your being handsome and really “having the hair.” I found this pretty hilarious since, as you know, we have shared the same barber. Your picture is still on the wall right next to Demetris’ work throne there on 21st Street, NW—vintage about 1980, 1982 maybe, I’d guess. And I have to say, that Hugh Jackman doesn’t come close, certainly not with the hair. I don’t suppose you’d like to venture a view about that?

GH: If you have to be portrayed on film, you can’t do any better than Hugh Jackman. We’ve become good friends.

TAI: Obviously, the deeper theme of the movie is the transformation of the media in its relationship to American politics. This is something I’ve thought about lately, because I agree that this transformation leads in at least a jagged and dotted line from May 1987, if not before, to the mess we have today, with the media having become a significant part of the problem. There is nothing new about the basic observation, of course; as Tom Stoppard said many years ago, “the press is a stalking horse masquerading as a sacred cow.” But the underlying questions are when and how did this happen.

Speculation about the answers is capacious. Maybe the original arrangement, where reporters cut politicians a break in return for access, was corrupt and never should have developed. The “gotcha” adversarialism trend clearly goes back to Vietnam and Nixon, but that had nothing to do with sex. On the other hand, the incentives for media to treat everything as a “who’s up/who’s down” superficial tabloid plot, and all but ignore the issues, seem more of a marketing decision shaped by the dumbing down of political discourse. That started with the television age’s domination of images over words, but has really galloped in recent years with the ubiquity of information technology. That clickbait and only clickbait makes money in media now is the culmination, I think, of trends decades in the making.

Some of this, too, may have to do with the consultant-driven politics-of-intimacy trope—inauthentic authenticity, in other words. When candidates try to persuade voters that they’re “just one of the guys,” not leaders with special skills and roles to play, it amounts to an open invitation to probe into private lives. So many factors are possibly at play. . . What’s your analysis of how all this came to be?

GH: After more than 25 years of listening to the base canard that “we only followed him because he dared or challenged us to,” Matt Bai finally got E.J. Dionne to admit that this is not what was meant when I invited him to join me openly in my extremely complicated daily rounds. Since then, the media has quit relying on that myth to justify the first instance in American history where a media organization placed a leading candidate for the presidency under surreptitious surveillance, the mark of dictatorships worldwide.

After hell week, editorial writers everywhere justified this unprecedented intrusive behavior on the grounds that the press had a duty to examine the character of leaders. So much for FDR, Eisenhower, Kennedy, Johnson, and a host of others even after May 1987.

Character is displayed over a lifetime. I had been in public view for 15 years, had won two statewide races, won 25 primaries and caucuses in an 18-month national campaign, and issues of my character had never been raised.

TAI: The movie makes it seem like you were slow on the uptake to recognize the change in media habits. If true, you weren’t alone, because it’s natural to resist crediting a trend that runs directly opposite to an idealistic view of American politics. I wonder, however, if growing up in the Church of the Nazarene had anything to do with your developing a sense of special guardedness about privacy, as happens sometimes in small faith communities outside the cultural mainstream.

GH: No connection whatsoever. If anything, I was completely trusting of virtually everyone, including reporters. I had nothing to hide. Living a guarded life is beyond me.

No doubt media behavior reflects cultural trends, but, at least in my experience, the change in media habits happened in a five-day period in May 1987. And the issue of media habits may have been marginal to what was really going on at the time: Read the James Fallows piece in the November issue of The Atlantic—“Was Gary Hart Set Up?” There is credible evidence that Lee Atwater, George H.W. Bush’s campaign manager, arranged the entire Miami weekend and beyond. The media has so far refused to honestly appraise Fallows’s evidence. Atwater may well have gotten away with eliminating George H.W. Bush’s main competitor and distracting from the Iran-Contra hearings beginning that Monday. Intricate timing is never accidental.

TAI: I’ll read it.

No doubt many friends and associates have come up to you, or spoken or written to you, since the movie came out, just in the normal course of things—and some probably have seen the movie. Do people tend to mention it, or avoid it? Some of both? Can you sense any ambient awkwardness? Can you please share an anecdote or two, assuming there are some, about your private life since The Front Runner?

GH: Everyday people among whom we live have been great—low key, positive, Hugh Jackman jokes here and there. A few have quietly said: The movie shows us what has been lost.

TAI: Last question, Gary. We all know that life imitates art, except that this movie, as a dramatization that deliberately lives very near something that actually happened, is an example of art trying to imitate life—with a recursive potential for other lives to in turn imitate it down the line. I just hope your own earned memories of all this have not been distorted beyond recall by the movie images. But more important, it’s now a bit more than 30 years later: Have any long-maturing lessons emerged, lessons different from those of the immediate aftermath?

GH: Most of what I learned 30 years ago and since has not been positive. Rupert Murdoch has his President who follows his orders. America is not the better for that experience. I don’t grieve for myself; I grieve for this country. Once again, the press cannot have it both ways: We cannot have loss of privacy and high-quality leaders. Sensationalism may sell newspapers, but it drives away thoughtful potential public servants.

The media has hated the movie, and the main newspaper in question is still whining about how it was depicted. There’s no sense of irony there, no effort at a real reckoning—and no willingness, yet anyway, to take the Fallows account seriously. I wonder if it has ever crossed their minds that they may have depicted me elaborately, but wrongly.

TAI: Thanks for agreeing to talk about this with me. Much appreciated.

GH: You’re welcome, Adam.


AG note: I did read it after this conversation took place, and if Fallows’s argument is true, it changes everything we thought we knew about what happened. If true, it makes The Front Runner as obsolete an artistic artifact as our heretofore common understanding of the facts. Fallows never claims that his account is true, for he admits that circumstances make it all but unprovable; he only claims that it is plausible. And, indeed, it is plausible.



The post A Conversation with Gary Hart appeared first on The American Interest.


Why Brussels Should Be Wary of a Cancelled Brexit

With barely a mistake having gone unmade in British Prime Minister Theresa May’s handling of Brexit, it is perhaps understandable that many observers have allowed their lurid fascination with the chaos in London to obscure the interests of Brussels from their calculations of what may or should happen next. A useful reminder of this slanted perspective came last month following the European Court of Justice’s ruling that Britain could abandon Brexit altogether if it wanted to. It was widely, and somewhat hopefully, reported in the international media that pro-Europeans inside the United Kingdom such as Scottish First Minister Nicola Sturgeon had been quick to embrace the ruling. Outside Britain, Latvian Foreign Minister Edgars Rinkevics tweeted that he would be “more than happy” to see an abandonment of Brexit, and Peter Harris noted that the ECJ ruling might provide a way out for Britain, which is currently “undergoing the biggest constitutional emergency since the Abdication Crisis of 1936.”

But by this stage in the game, it is not at all clear whether Brussels does or even should want Britain back in the fold anyway. To be sure, the more idealistic Europhiles would take Britain back in a heartbeat; others regard Brexit as a mistake that Britain will soon want to rectify. The reliably starry-eyed Guy Verhofstadt, Brexit coordinator for the European Parliament, rarely misses an opportunity to predict that Britain will re-join the European Union as soon as the current wave of British politicians is replaced by a younger and allegedly more pro-European generation. But even if younger voters do not become less enthusiastic about the European Union as they mature—and they very well might, given its various problems and flaws—the projected generational shift would still take decades.

As things stand today, the European Union would have to reckon with the following should Brexit be cancelled. First, yet another sledgehammer blow to the European Union’s democratic legitimacy. Having already ridden roughshod over referendum results in France, the Netherlands, and Ireland over the last two decades, it would become harder still to paint the EU as a truly voluntary club were the wishes of the British people, as expressed in the 2016 referendum, simply ignored. It is true that Brussels has found a way to live with inconvenient referendum results in the past, but disrespecting the British people may be a step too far at a time when all across the continent populists are equating Europeanists with neo-authoritarians. Any new referendum with a positive result for Brussels—by no means a certainty anyway—would merely add weight to the view that the EU and its allies will do anything rather than accept a democratic vote they don’t like. In any case, if there is a second referendum supporting EU membership, why shouldn’t Euroskeptics ask for a third to make it the “best of three,” as arch-Brexiteer and International Trade Secretary Liam Fox put it? If you think the situation looks chaotic today, just imagine that as a scenario for the next five years.

A second problem comes into focus the moment one starts to imagine how London and Brussels would actually work together again should Brexit be dropped. A clean break in March would at least have the merit of clarity: The European Union would establish diplomatic and trade relations with Britain more or less along the lines it does with everyone else outside the EU. Life for Brussels would be predictable. But if Britain was the awkward squad in Europe even before the 2016 referendum, heaven only knows how much chaos London might bring back into the relationship following a Brexit cancellation. After all, the end of Brexit would not mean the end of the causes of Brexit. To raise just one contentious issue, the migration question would never be off the table. An end to free movement was one of the Brexiteers’ most popular pledges, and British governments would be under permanent pressure to put a stop to it. But as Theresa May’s predecessor David Cameron found out when trying to negotiate with the EU before the referendum, Brussels won’t budge on free movement for its member states because that provision is written into EU treaties. What would transpire, then, would be an endless and bitter row with zero prospect of a resolution. Does Brussels really need that? And it wouldn’t end there: One way or another, every interaction between London and Brussels would be tainted by resentment and mistrust.

Finally, suppose one could make the referendum issue disappear, as if by magic, and further imagine that a manageable, if imperfect, working relationship could somehow be re-established. That still leaves Britain as a parliamentary democracy with a Conservative Party that, if the past is an even halfway reasonable guide to the future, will at some point in the next two or three electoral cycles be returned to power with a decent enough majority to get a Brexit motion through the House of Commons. In the context of so much accumulated anger about the failure to deliver Brexit this time around, any such Tory government would probably be ready to pull the plug on Britain’s EU membership the moment it got into office, and would not feel the need to put the matter to a popular vote on the reasonable grounds that referendums that go the way of the Euroskeptics are never respected anyway. So, if Brexit is most likely to happen regardless, what on earth is the point of dragging this all out only for the same difficult divorce issues to resurface a little further down the road?

There will be many in Brussels today who are rubbing their hands at the chaos in London over Brexit. But they should be careful what they wish for next. What is broken cannot always be mended. There is a point in some failed relationships where it is in everyone’s interests simply to let go.




January 4, 2019

2019 Predictions

With a new year and a new Congress upon us, paid prognosticators are out in full force. This week, we at TAI sat down with our own crystal balls to predict what may lie in store for America and the world in 2019.

View From The Swamp



The Mueller probe brings to light some collusion-adjacent activities but no smoking gun, putting the Democrats in a bind: On the one hand, their base will regard anything less than impeachment as a craven act of surrender; on the other, Mueller’s findings probably won’t be enough to justify a Republican abandonment of Trump, which means a serious impeachment attempt could backfire. It would certainly devolve into political theater—the question is whether Democrats can avoid a storyline in which they are villains subverting democracy. Don’t count on it. (Note that the optics of a failed impeachment could be even worse amid an economic downturn.) [AS]


While the Syria withdrawal probably won’t happen in anything more than a cosmetic sense, President Donald Trump is likely to succeed in getting the United States to significantly reduce its presence in Afghanistan, perhaps as soon as this year. This will represent a real turning point for America’s self-perceived role abroad. Internationalists like Robert Kagan and Tony Blinken are already very nervous that there is a strong bipartisan consensus for a less hawkish posture, and given Elizabeth Warren’s recent statement, they’re right to be. If Trump succeeds where Obama failed and ends the longest war America has ever fought, he will have guaranteed himself a place in the history books. [DM]


Constrained on his domestic agenda by a newly Democratic House, President Trump indulges his own impulses on foreign policy all the more. The President does indeed pick a Secretary of Defense whose views are more aligned with his own—or at least one who chooses the Pompeo route of prioritizing his relationship with his boss over any disagreements. The withdrawal from Syria is just the first step, followed by a massive troop reduction in Afghanistan, loss-cutting talks with the Taliban, and eventually a full pullout by year’s end. Taking a page from Jimmy Carter, Trump also orders a drawdown in South Korea, alarming and overruling nearly all his advisers. Tariffs fly with abandon, hitting friend and foe alike: Germany, Canada, the United Kingdom, Japan, and South Korea in particular come under fire, amid much outcry. Less able or willing to restrain him, Trump’s national security team seeks to minimize the damage with allies while focusing on their own particular hobby horses, away from his gaze. John Bolton continues merrily on, denouncing his usual bêtes noires (the UN, the ICC, virtually any international body known by its acronym) in fiery speeches and consoling himself that at least he agrees with Trump’s hawkish Iran policy. In short, the Trump foreign policy in 2019 looks a lot like it did in 2018, only amped up, bigly. [SK]


Reform conservatism—including its more Trump-friendly variants—continues to gain steam, while the old neocon consensus continues to implode, helped along by Trump’s likely drawdown in the Middle East. As for progressives, Current Affairs more than doubled in size this past year, and Jacobin is enjoying its own miniature renaissance. Expect these trends to continue into 2019, with the intellectual energy on the Left coalescing around anti-establishment candidates and policies. Will the demos follow suit? That is less clear. But as 2020 heats up, institutionalists in both parties will be under increased pressure to disown the status quo. [AS]

Across the Pond, and Beyond



Things may be a lot more durable than they seem, especially in Europe. Even a Hard Brexit will likely hurt the United Kingdom less than the prophets of doom would have you think, and Britain’s divorce from the EU won’t be that cataclysmic for Brussels either. And while populist upstart parties will likely make further gains in the upcoming European parliamentary elections, they are still unlikely to get the kind of majorities to make a meaningful difference. At the same time, don’t expect a “liberal wave” to materialize any time soon, either. 2018 ended with Hungary’s Prime Minister Viktor Orban facing down scores of protesters across the country; but what too few pundits noted is that he was in the main being opposed not by “European values” liberals, but by workers peeved at his changes to a labor law. Even if Emmanuel Macron’s En Marche underperforms and National Rally (née Front National) overperforms in May, look for the main outcome to be more strident predictions that the end is nigh for the European Union, but not much more. [DM]


As the trade war with the United States grinds on, and an economic slowdown hits China, Xi Jinping rattles sabers on China’s periphery. The likeliest target is Taiwan. As Michael Mazza explained in these pages last year, Beijing’s pressure campaign against Taipei has been building for years, and even the more Beijing-friendly KMT party has soured on reunification as growing numbers of Taiwanese embrace independence. Frustrated by these trends and calculating that his window of opportunity will not grow more propitious, Xi will seek to assert China’s prerogatives by aggressive displays of militarism in the Strait: more (and more frequent) naval and air operations, missile cascades à la 1996, and perhaps much more depending on the U.S. reaction. Expect Trump and team to waffle on the response, with confused and contradictory messaging and a president torn between his China hawkishness and his gut instinct that Taiwan isn’t worth a single American life. [SK]


2019 could be the year that China has that “hard landing” everyone has been predicting for years. Chinese statistics are notoriously unreliable—figuring out just what is going on is a parlor game that China hands play to pass the time in Beijing. No one, not even the Chinese authorities, has a perfectly solid grasp on reality. Tim Cook’s letter to shareholders about Apple missing its earnings forecast was therefore especially jarring for its… empiricism. And it needn’t be a full-on hard landing to be devastating to the global economy. China might be too big even to slow down. [DM]

Culture Wars



At least one other prominent American cardinal or archbishop will be caught up in the ever-widening sexual abuse scandal, whether for a cover-up or because of accusations of abuse. The collective efforts of American bishops to address the crisis will be panned in the press, by the Catholic laity, and by the lower ranks of the clergy. More state attorneys general will open investigations into past episcopal handling of abuse cases (to be fair, this isn’t really a prediction; it’s already happening). These troubles, coupled with further revelations about the CCP’s broad-based crackdown on religion, which will cast a pall over Francis’s efforts to seek an accommodation for Chinese Catholics, will lead the mainstream media, along with a critical mass of his progressive boosters, to shift decisively to narratives describing a papacy in disarray. [DK]


Twitter’s “Safe Space” crusade continues to expand, leading to the de-platforming of more of the usual self-avowed alt-right types. But the ban-hammer also strikes a smattering of garden-variety conservatives and intellectual dark web types, including Steven Pinker and Jordan Peterson, who fall prey to increasingly organized campaigns to purge the social media network via a clever combination of mass-reporting for terms of service violations and coordinated attacks via networks of woke millennial journalists and bloggers. The success of these campaigns comes with an ironic cost, however. Through processes of natural selection, the remaining right-leaning voices on Twitter evolve ways of getting their message out while evading the censors, thus increasing the appeal of a large spectrum of right-leaning politics. [DK]


The debate over #MeToo becomes more fraught than ever, as several of the accused celebrities claw their way back to the spotlight. Louis C.K. continues to fan the flames of notoriety with his New York stand-up acts, then releases a comedy special to herald his full return. The Harvey Weinstein trial becomes the most eagerly watched public spectacle since the O.J. Simpson trial, with a partial acquittal for Weinstein at the end. Either way, #MeToo becomes a defining issue in the Democratic presidential primary, where Kirsten Gillibrand brands herself the #MeToo candidate, Joe Biden is raked over the coals for his handling of the Anita Hill hearings, and a live debate emerges over whether Bill Clinton should continue to be embraced as a party elder. [SK]


Avengers: Endgame marks the end of the superhero bubble. After a climactic battle featuring 30 overpowered fighters, subsequent releases feel like let-downs, causing ticket sales to decline. DC doesn’t fill the void left by Marvel: Justice League had its chance to be competitive and blew it. Ant-Man is squashed, Aquaman drowns—an era draws to a close. [AS]



How Donald Trump’s Foreign Policy Defies America’s Best Interests

Halfway into his term, let us dispense with the familiar epithets about Donald Trump such as “crass,” “mendacious,” “brutal,” and “bizarre.” On the global stage, he has acted like the proverbial bull in the china shop, not out of sheer obliviousness, but because he wants to demolish the set. Yet consider how well he has actually done so far, at least by his own lights.

“Obsolete” and “free-riding” NATO, which, according to Trump, has played Uncle Sam for a sucker, is rearming. North Korea’s boy dictator has been all smiles. ISIS is largely decimated, though not dead. Across the Middle East, Trump has harnessed Israel and the Sunni states in a de facto coalition against Iran. He might yet gouge a better nuclear deal out of Tehran.

In trade, Trump’s nasty tactics have gotten the attention of Europe and China. The EU has responded gingerly to America’s punitive tariffs. China advertises its willingness to talk about age-old conflicts over intellectual property and access to its markets. On Trump’s watch, the U.S. has become the largest producer of oil, reducing its dependency on OPEC.

So crude is the new cool? Look again. China’s President Xi has just threatened Taiwan with reunification by force, while one of his admirals, Lou Yuan, mused about sinking a couple of U.S. aircraft carriers to make a point, adding that the “conflict over economics and trade” was in fact “a prime strategic issue.” In his New Year’s address, Kim Jong Un warned that he might have to seek a “new path” if the U.S. does not drop its sanctions.

Nor is Russia being cowed. Though its economy is failing, Putinland is patiently expanding its sway over the Middle East, one step at a time. And why not? Trump has just delivered a prize the new Czars from Khrushchev to Putin could not imagine in their wildest dreams. Trump intends to withdraw from one of the world’s most critical strategic arenas.

Under this dispensation, the U.S. would pull out of Syria and cut its forces in Afghanistan, which will leave its European allies there high and dry. Hence, they, too, will leave. Say “hello” again to the Taliban. The strategic chessboard will now belong to Russia plus Iran and its surrogate Hezbollah. Get ready for a major war between Israel and Hezbollah, which will make the last round of 2006 look like a skirmish.

Syria “belongs to you,” Trump has reportedly told Turkey’s strongman Erdogan. The would-be sultan has threatened to massacre those very Kurds, America’s best allies, who have borne the brunt of the fighting against ISIS. If Trump actually betrays them, their abandonment will serve as a bloody warning to all allies of the United States: Once you do our bidding, we will ditch you.

So much for the ugly daily news that does not make the 45th president shine as a master strategist like Henry Kissinger, who managed to extrude the Soviet Union from the Middle East in the early 1970s. Now go back into history to unearth a nasty pattern that has emerged again and again ever since the United States bestrode the world, first as a dominant, then as the mightiest, player at the turn of the 20th century. Call it “retraction” in the way of Donald Trump.

Too bad the astronomical bills followed soon thereafter.

Let us count the ways. The U.S. stayed out of World War I until the last minute. The price was million-fold death in Europe, then the dispatch of 2 million soldiers and 320,000 casualties.

After the Great War, the U.S. turned its back on Europe, leaving the field wide open for the Fascist powers. (“America first,” by the way, was the slogan of U.S. isolationism in the interwar period.) Again the United States stayed out, until December 1941, when Britain was sinking and the Nazis were on a roll from France to Soviet Russia. The price of inaction was gigantic: 130,000 American soldiers were killed in action in Europe alone.

Back home again. At the Yalta Conference in February 1945, Franklin D. Roosevelt told Joseph Stalin that all U.S. troops would be out two years hence. In effect, he also conceded Eastern Europe to the Soviet Union, which lost no time “communizing” Poland, Czechoslovakia, and the rest. By 1946, the U.S. had a new war on its hands: the Cold War. By the 1960s, 300,000 U.S. troops were back in Western Europe, alongside thousands of tactical nuclear weapons. Luckily, the war never turned hot, but the costs of containing Soviet Russia with men and materiel were gargantuan. Though the bulk of U.S. forces were withdrawn in 1994, three years after the suicide of the Soviet Union, an American garrison of some 30,000 is still in place.

In 1950, Secretary of State Dean Acheson pointedly excluded South Korea from the American “perimeter.” Six months later, North Korea invaded the South, and the U.S. intervened. The price was 35,000 dead.

The moral of this tale is as simple as can be. Just like nature, international politics abhors a vacuum. No balance of power, no stability, let alone peace. With his strategy of retraction, Donald Trump has been in good company, so to speak, since World War I.

Indeed, Trump is merely completing what Barack Obama started when he vacated his “red line” against Assad’s chemical weapons. The dictator used them again and again, safely ensconced behind the Russian military in Syria. It is a moot point now. But do imagine what humanitarian and strategic disaster could have been averted by judicious American action before Moscow inserted its might for good. To deter is always more economical and less dangerous than to dislodge your opponent once he is in place.

To be sure, Clio, the goddess of history, likes to issue ambiguous advice. But in the case of the United States, retraction has not proven a winner. Leaving a power vacuum today means raising the ante a hundredfold tomorrow. To recall this lesson is not a pitch for throwing the country’s weight around just because it can.

Vietnam and Iraq II (2003) were the two most foolish wars in American history. Yet in the case of Iraq, the logic of stability and the injunction against withdrawal would have told George W. Bush not to destroy a state that had served as a potent bulwark against Iranian hegemonism. So the current strategic disaster in Syria began with “mission accomplished” in 2003. Today, Iran has bought itself a border with Israel while extending its reach all the way to the Mediterranean.

“Fortress America,” which inspires Trump’s not-so-grand strategy, is an illusion that defies history and the well-considered American interest, just as trade wars diminish the welfare of nations by reducing the real incomes of their consumers.

Alas, since the departure of Secretary Jim Mattis, there are no more “adults in the room.” And the Democratic House of Representatives shares many of Trump’s isolationist reflexes; remember that populism is always found on both the Left and the Right.

America is turning into a wobbly pillar of world order. You might want to drag out your old LPs and listen to Kris Kristofferson’s “Help Me Make It Through the Night.”


The post How Donald Trump’s Foreign Policy Defies America’s Best Interests appeared first on The American Interest.

Published on January 04, 2019 08:36

The Danske Bank Money Laundering Trail

With four separate investigations now underway, UK, U.S., EU, and Danish investigators are all peering into the murky dealings of the Estonian branch of Danske Bank, which is headquartered in Copenhagen. Over a nine-year period beginning in 2007, the bank is suspected of having served as a key link in transferring more than $225 billion from Russia to London and other leading Western capital-market centers.

The scale of the unlawful flows that passed through Danske Bank is without precedent. But the scandal rocking the bank is only a microcosm of a wider problem, and a deeper failure of post-financial crisis governance.

A central goal of European and American banking reforms following the financial crisis was to significantly improve banking risk management, through regulatory reforms and internal strengthening of compliance systems. But the unprecedented scale of malfeasance at major global banks since then demonstrates that this effort has been woefully inadequate. Across the globe, banks have faced record-high fines for corruption, money laundering, interest rate manipulation, and fraud.

In the case of Danske Bank, the names of the original individuals and organizations believed to have engineered the illicit flows of cash from Russia and Azerbaijan are still unknown. So, too, are the beneficial owners of the UK-based companies registered in various Caribbean islands that accepted the funds.

Danske Bank and other similar banks with branches in the Baltic countries are surely undertaking a full review of their compliance approaches and controls. But they need to go far beyond technical reviews to question the core culture of their institutions and the failure of their boards of directors to vigilantly ensure management oversight.

Perhaps the findings of official investigations will wake up the sleepy bankers in their board rooms. If that happens, and if it precipitates real change in how banking risk is managed and compliance ensured, we may have the British to thank.

Of all the ongoing investigations into Danske Bank, the British approach may be both the most original and productive. The British National Crime Agency (NCA) and its newly established affiliate, The National Economic Crime Centre (NECC), are now concentrating their efforts on cash coming from Azerbaijan. This appears to be a plan to go after what may be the single largest set of money laundering schemes in history.

The British were first alerted to substantial inflows of Azerbaijani cash into the United Kingdom well before the full scale of Danske Bank’s dealings became known. In 2017, a group of journalists from countries including Germany, Switzerland, Russia, France, the Czech Republic, the Baltic states, the United Kingdom, and the United States released a series of reports on what they called the “Azerbaijani Laundromat.” Working with the Organized Crime and Corruption Reporting Project (OCCRP), they uncovered “a complex money-laundering operation and slush fund that handled $2.9 billion over a two-year period through four shell companies registered in the UK.”

To combat such activities, the UK authorities are now using a new tool: Unexplained Wealth Orders (UWOs), which were created under the Criminal Finances Act 2017 and have been in force since the start of 2018. In a series of court decisions, three UWOs have been issued against Mrs. Zamira Hajiyeva from Baku, Azerbaijan, who lives in London. The UWOs call on her to explain how she raised the funds to acquire two UK properties valued at over $28 million, as well as jewelry valued at over $500,000 that the NCA impounded from Christie’s auction house.

Mrs. Hajiyeva is the wife of Jahangir Hajiyev, who from 2001 until 2015 was chair of the International Bank of Azerbaijan. For many years, he was very close to his country’s all-powerful first family: President Ilham Aliyev and his wife Mehriban Aliyeva, who also serves as Vice President. But Hajiyev must have displeased them, perhaps taking too much of their cash; he was jailed for 15 years in 2016 and is now facing further charges.

Mrs. Hajiyeva lives in a mansion a three-minute walk from the Harrods department store in London, where, according to NCA documents, she spent around $21 million over several years. London was a logical second home for the family, as the International Bank of Azerbaijan had many dealings in the United Kingdom; its affairs clearly interest the NCA.

But the British are not the only ones to be interested. In mid-October, members of the Council of Europe Assembly signed a joint motion decrying Baku’s inaction in the face of the allegations. “One year since the Laundromat investigation exposed Azerbaijan’s multi-billion scheme to carry influence, pay lobbyists, and to launder cash,” they wrote, “nothing has changed in Azerbaijan. No criminal investigation has been launched; no one held accountable.”

Members of the Council have not only expressed concern about the alleged money laundering, but also about reports that significant sums have been spent to hire consultants and politicians in Western countries to polish the reputation of the Azerbaijani regime and thereby deflect attention from the allegations. The Council also noted concern about Mehman Huseynov, a “famous blogger and human rights defender. . . [who] is serving a two-year sentence in an Azerbaijani prison on made up defamation charges. His family has been subjected to pressure which resulted in the death of Huseynov’s mother on 6 August 2018.”

Journalists like Huseynov who investigate alleged corruption and money laundering face extreme risks. He is just one of 14 courageous Azerbaijani journalists who are in prison in their home country. Between March 2005 and March 2018, the following reporters were killed in Azerbaijan with no official investigation into their deaths: Elmar Huseynov, Alim Kazimli, Novruzali Mammadov, Rafig Tagi, Rasim Aliyev, Mehman Galandarov, and most recently, Eyyub Karimov.

The NCA’s decision to investigate issues arising from both the “laundromat” reporting and the disclosures of vast alleged money-laundering through Danske Bank will hopefully turn an international spotlight on such courageous reporters, who are running serious risks as they strive to secure transparency into vast cross-border financial transactions.

The Danske Bank scandal underscores the intense complexity of global money laundering systems today, the repeated failures of internationally active banks to be sufficiently diligent in their risk management practices, and the formidable risks that journalists take as they seek to investigate the dark money trails.

Forceful UK action could possibly involve prosecutions and serve as a warning to kleptocrats worldwide that the risks of using the country as a major base to invest their stolen assets are finally rising. The fact is that current practices will continue so long as there are no prosecutions of leading money launderers and top bankers. In the United States, after all, not a single chairman or chief executive officer of a global bank has faced criminal charges over the last ten years, despite multiple investigations by the FBI and U.S. Justice Department that ended with settlements involving multi-billion dollar fines.

Cultural reforms in the boardrooms of banks are equally important. So long as board directors provide major bonuses to senior executives on the basis of short-term profit results, banking cultures that place making money above sound ethics and concern for their public reputation will persist.

That kind of culture, which evidently held sway at Danske Bank, pervades many non-bank institutions too. It can be seen in the leading international financial centers that aid and abet the money launderers, be they crooked international businessmen or corrupt politicians. The enabling armies of lawyers, accountants, real estate brokers, consultants and major auction houses that conspire with international crooks to find ways to establish secretive offshore holding companies, transfer illicit funds from one country to another, and ensure these funds are soundly invested are as much at fault as the bankers and their inefficient regulators.

Exposing and combatting such enablers will be difficult, but the stakes are very high. For one, the scale of international money laundering appears to be rising, as the huge sums that passed through Danske Bank illustrate. Unless illicit cross-border flows of finance are curbed, the stability of the international financial system will itself be placed at risk.

Second, time and again these scandals are exposed by whistleblowers and investigative journalists. Their invaluable work demands great courage, and the risks they face are rising. Members of the Council of Europe Assembly and other legislative bodies have at times sought to sound the alarm bell and pass measures to protect journalists. But such efforts are both too few and too weak. From Mexico to Turkey, from Egypt to Azerbaijan, journalists the world over are being jailed for investigating grand corruption.

Those who long for a more accountable, transparent financial order should cheer on the inquiries into Danske Bank. A thorough reckoning with the bank’s transgressions—one that follows the money trail wherever it leads—could open the floodgates to further necessary investigations and eventual reform. The implications of the case are already enormous, but as one party involved recently put it, what we know now may just be the tip of the iceberg.


The post The Danske Bank Money Laundering Trail appeared first on The American Interest.

Published on January 04, 2019 07:51

January 3, 2019

The Fall of Jim Wright—and the House of Representatives

Speaker Jim Wright: Power, Scandal, and the Birth of Modern Politics

J. Brooks Flippen 

University of Texas Press, 2018, 552 pp., $35


Those of us seeking to understand what has gone wrong with Congress are always on the lookout for characters who can be cast as villains in the institution’s history. Standard discussions (including McKay Coppins’s recent Atlantic story) often center on Newt Gingrich, and certainly this decade’s revelations about Dennis Hastert’s terrible past now make him a superlative heel. But for those who want to look across the aisle, Speaker of the House Jim Wright (D-Tex., in Congress 1955-1989, Speaker 1987-1989) is sometimes singled out.

Understanding who Wright was and how he went wrong is thus of great importance for congressional institutionalists, and we now have a new biography to aid in the effort. J. Brooks Flippen’s Speaker Jim Wright methodically works its way through Wright’s youth in Weatherford, Texas, his service as a bombardier in World War II, his election in 1946, at age 24, to the Texas House of Representatives, his loss of that seat and subsequent business career and stint as Weatherford’s mayor, and from there into his 34-year-long career in the House of Representatives.

Flippen wants his readers to appreciate Wright’s many-sidedness. He was an ambitious man whose energy and discipline drove him to great political successes; a fairly liberal Texan; a fiscal conservative pork-barreler; a representative of his generation and its style of compromise-driven politics, who then became a Speaker whose win-at-any-cost attitude made his downfall inevitable. He convincingly argues that if we focus only on the corruption scandals that allowed his opponents to drive him from office in 1989, we will be unable to understand one of the major figures in 20th-century American politics. Indeed, he grandly closes: “To understand Jim Wright in all his complexity, with all his flaws and mistakes, all his strengths and triumphs, is to understand much of the American past and the politicians who guided it. The story of Jim Wright, whether a tragedy or triumph, is a story of America.”

That seems hard to argue with—but those of us who think that Congress’s devolution threatens to turn the American story tragic ought to regard Jim Wright’s legacy with somewhat less equanimity than this. Wright’s speakership occupied a pivotal place in Congress’s history, a moment in which the era of the “Sun and Moon parties” (with Democrats as the dominant star and Republicans as the orbiting satellite) was giving way to one of balanced partisan competition. No doubt, this was an exceptional challenge. But Wright failed spectacularly, in a way that discredited institutionalism by making it seem like a lame cover for simple corruption. Flippen is right that we can learn from taking a long view of Wright’s life and career, and in so doing appreciate his virtues. At the end of this reflection, though, if we are determined to shake off the inheritance of broken politics that Wright bequeathed, we should not hesitate to render judgment.

Flippen makes it clear that from his youth, Wright was driven by an almost boundless ambition. But, far from leaving him a mere opportunist, the young Wright’s ambitions led him to become a serious policymaker, someone who took his district’s interests as a starting point and made himself into a master of these domains over many decades. As the representative of Fort Worth, Texas, and its surrounding environs, these interests prominently included water policy, infrastructure development, and Latin America policy, each of which became a Wright specialty.

As he pursued these interests, Wright demonstrated a special aptitude as a bridge-builder between disparate factions within the Democratic coalition. Part of this was sheer force of personality; Wright was good at making friends, and his seat on the House Public Works Committee provided him the ability to bestow favors on his many friends over the years. But Wright’s background as a liberal from the conservative South was also crucial in giving him credibility with different kinds of people. He skillfully cast himself as an ally of unions, but not someone at their beck and call; as a mild booster of civil rights, but not someone who was pushing things along too far ahead of their time; and as a deficit hawk, but not someone who was stingy about investing in America’s future where good opportunities existed. Importantly, he was also someone who early on styled himself as an advocate of transparency and restrictions on money in politics, while also becoming a powerhouse fundraiser for the party.

Wright’s statesmanlike ability to straddle fences while still forcefully advancing his own priorities put him in an ideal position to climb the party leadership ladder. That was not actually his preferred path; unlike his colleague Tip O’Neill, Wright did not think of himself as a “Man of the House.” When Lyndon Johnson, one of Wright’s mentors in Congress in the 1950s, became Vice President, Wright pushed hard to win his Senate seat in a 1961 special election, ultimately losing a hard-fought primary. He flirted with jumping to the Executive Branch in the Kennedy and Johnson administrations, and was often a vocal defender of the Executive Branch’s primacy in foreign affairs during those years and even into the Nixon Administration. Only once those opportunities passed him by and Wright found himself rising in seniority did he begin to envision himself as a House leader and defend the particular prerogatives of his chamber.

Whatever Wright’s deepest desires, throughout the 1970s he outmaneuvered competitors in the tumultuous Democratic caucus to position himself to rise. Although O’Neill grabbed the majority whip spot after the tragic plane-crash death of Hale Boggs in 1972 and thereby set his own trajectory to the speakership, Wright managed to straddle the divide between reformers and the old guard to eventually beat out Richard Bolling and Phillip Burton by a single vote on the third ballot of the majority leader contest after the elections of 1976. O’Neill and Wright continued the famed Austin-Boston Connection that gave a sense of balance to the national Democratic coalition. Wright waited his turn and then assiduously secured commitments from his copartisans ahead of O’Neill’s exit after 1986, making his rise to the speakership free of significant drama.

As Wright traveled his path to the Speakership, America’s political landscape was transforming. Party politics in Texas when Wright began his career was Democratic politics; Republicans were not completely absent from the scene, but they were treated by the real players as a superfluous sort of local color. The Speaker of the House when Wright arrived to the House was Sam Rayburn, whom the younger Texan Wright looked up to with filial reverence. In Wright’s reckoning, the House when he arrived was marked by its cooperative style in which members’ “mutual assumption of honor. . .held things together.” Not until he was a dozen election cycles in did Wright draw a serious general election opponent in 1974. He prided himself on his “almost unvarying practice never even to mention the name of my opponent. I talked instead of what I believed and what I was trying to achieve.”

Given Wright’s comfort with the familiar one-party political scene, Republicans’ slow post-Civil Rights return from obscurity in the South was bound to unsettle him. John Tower’s election to Johnson’s Senate seat rightly struck him as a harbinger of a confrontational style of politics in the making. When he was Majority Leader and charged with wrangling conservative Texans to stick with the Democratic Party rather than going over to the Reagan Administration, he was bitterly disappointed at every defection. As he put it: “I feel like the wife who was asked whether she considered divorce. She answered, ‘Divorce, no, murder, yes.’” But divorce would come, in any case, for party-switching members including Phil Gramm and Kent Hance. One senses from Flippen’s account that by the mid-1980s, Wright had come to thoroughly resent Republicans, whom he had thought of as a domesticated species. He and O’Neill especially relished pushing through domestic spending bills over Reagan’s vetoes.

Once he had the Speaker’s gavel, it was quickly evident that Wright had left the consensual politics of his younger days behind. Both his fellow Democrats and Republicans found him to be imperious and easily offended by dissent; as Flippen puts it, “he dictated more than he consulted.” Wright restructured the Democratic whip operation without consulting his caucus’s elected whip, Tony Coelho; he tried (unsuccessfully) to mount a leadership takeover of the House Administration Committee; he delayed naming his Rules Committee appointments so that the ultimate recipients would “remember they were the Speaker’s appointees”; and he grabbed extra office space. Members also thought he was addicted to arbitrary deadlines and limits on debates. Most dramatically, as Wright sought to push through a revenue-positive reconciliation bill following “Black Monday” in October 1987, his favored rule was voted down. Chamber rules prohibited an identical vote on the same day, but Wright was determined—so he ordered the House adjourned and then immediately reconvened, saying that this made a new vote permissible. On the brink of losing that re-vote 207-206, Wright kept the voting open 10 minutes past its 15-minute limit until he could pressure a Democratic hold-out to support him.

As Wright saw it, all of this was in the name of efficient policymaking, and his caucus ultimately appreciated it. As he wrote in his memoir, “Instead of autonomous feudal barons each pursuing a leisurely independent course, most members of the hierarchy once described by O’Neill as his ‘College of Cardinals’ now felt part of a purposeful team.” He wanted the Speakership because he “saw things I thought needed doing, and I supposed that I could bring them about.” However inconsiderately he treated his fellow Democrats, then, they could at least console themselves with the sense that he was making the most of their party’s control of Congress.

But Republicans had no such consolation, and they came to see Wright as an enemy, plain and simple. Throughout the 1980s, there was a simmering dispute among congressional Republicans: whether to participate unreservedly in Democrat-led governance, as House Minority Leader Bob Michel preferred, or to adopt a more confrontational stance, as upstart Newt Gingrich loudly advocated. When O’Neill was Speaker, Michel’s way of thinking held its own. But Wright’s tendency to steamroll Republicans ensured that Gingrich would steadily accumulate influence. Political scientist Nelson Polsby judged that Wright’s style “further weaken[ed] the credibility of go-along, get-along Republicans among their Republican colleagues. By drawing partisan lines, Wright gave Republican moderates—moderates in style, not necessarily in policy preferences—no place to go but into the camp of Republican militants.” Said rising Republican star Vin Weber, “The dislike of Speaker O’Neill was ideological. . . .He was really the symbol of northeastern liberalism. The dislike of Speaker Wright is different. Republicans think he is basically and fundamentally unfair; that he does not have the respect for the institution like Tip; that deep down he is a mean-spirited person, ruthless in the truest sense of the word.”

As House Republicans seethed at Wright’s high-handedness as their chamber’s parliamentary leader, many outside of Congress came to think that Wright was acting too big for his britches in other ways as well. From his earliest days in Congress, Wright had developed his interest in foreign affairs, and especially Central America. For many years, he had seen himself as a defender of the Executive Branch’s leading role in diplomacy and war, but revelations during the Reagan Administration led to a sharp turn. Wright learned that the CIA had taken actions specifically with the intent of stonewalling Congress’s fact-finding in war-torn Nicaragua and El Salvador, and generally came to believe that the administration was acting in bad faith. In response, he became embroiled in a war of words with the spy agency, which accused him of improperly leaking classified information. More strikingly, he sometimes seemed to be conducting a one-man foreign policy out of the Speaker’s office, trying to broker deals between the Contras and Sandinistas even as the Reagan Administration refused to deal with them. By Wright’s own reckoning, this adventure was a success, helping to set the groundwork for a ceasefire deal in 1988. But the American press mostly judged him out of line, “an egomaniac who thought that he was president.”

Wright, then, was not lacking for enemies. When they came looking for weapons to use against him, they found he had given them an embarrassment of riches over the years through behavior ranging from shady to obviously corrupt. His 1961 Senate campaign had racked up $90,000 in debt unbeknownst to him, and he insisted on finding a way to cover those expenditures. A fundraiser organized for the purpose drew charges that employees of local Fort Worth businesses had been forced to contribute. That lingering debt gave Wright a chip on his shoulder about what his political career had done to his fortune, which led him to work with his lawyers to figure out how to exploit every available legal loophole. In Flippen’s view, he was “determined to augment his wealth,” and “thought it only fair that he had the same opportunities others had enjoyed.” Both of Wright’s wives’ professional lives became entwined in his business ventures in awkward and suspicious ways. Wright became partners in a real estate venture with a political admirer, George Mallick, which proved to be highly lucrative. Their company would come to make Wright large loans over time. More directly relating to Wright’s political responsibilities, he rather tactlessly interceded with the Federal Savings and Loan Insurance Corporation on behalf of a number of Texas firms.

Perhaps most infamous was the scandal surrounding a book that Wright published as Majority Leader in 1984, Reflections of a Public Man. A collection of Wright speeches, the book did not seem likely to be a hot seller. But Wright nevertheless negotiated a 55 percent royalty for himself on all copies sold. Before long, those who wished to ingratiate themselves with Wright were buying in bulk. One man later admitted to buying 1,000 copies “just trying to make a contribution to Jim’s income.” This was most definitely legal—limits on outside income had an exception for book royalties—but it stank of corruption.

Newt Gingrich picked up that scent and identified it as one of Democrats’ biggest weaknesses. He thought Wright’s relationship to money was a relic of an earlier era, writing in his own memoir: “Just as Spiro Agnew had discovered that the style of corruption taken perfectly for granted in 1960 Maryland would destroy him as Vice President a decade later, the habits and practices that were perfectly survivable in Texas a generation earlier would not pass the national standards of the late 1980s.” He knew that gunning for Wright would make him unpopular with those happy with the institutional status quo, but he rightly calculated that it was a political risk worth taking given the public’s general wariness of Congress. Because of his agitation, the House Ethics Committee was convinced of the need for a special counsel to investigate Wright’s activities. The counsel’s report, submitted in February 1989, was damning, running to almost 500 pages and painting an unequivocal picture of corruption. In April, the Ethics Committee found 69 different charges worth pursuing further, and in May it was reported that the IRS would conduct a criminal investigation. Realizing that he was effectively crippled, Wright resigned on June 6, 1989. The ascendancy of Gingrich’s brand of politics was all but certain.

Wright and Gingrich deserve to be remembered together by posterity. Gingrich, of course, kept on brawling. In the midterm election of 1994 he delivered the House majority to the GOP for the first time in four decades. His own speakership was tumultuous, to say the least, and he was driven out after two terms, having alienated his own caucus and run up his own list of ethics charges. In many ways, then, Gingrich and Wright look like mirror images of each other: the two men who saw that cutthroat partisan competition was now at hand, after a long era of one-party dominance, and led their parties to adopt a stance of mortal combat in response.

One might wonder whether their respective attempts to turn the House of Representatives into a well-honed tool for partisan warfare ought to be regarded as ultimately salutary, if often painful to experience close-up. After all, ambition is supposed to counteract ambition, and both Speakers mobilized Congress against presidents they saw as out-of-line. When the American people chose divided government in 1986 and 1994, wasn’t partisan combat what they were signing up for? A case can certainly be made. The clashes between Wright and Reagan on taxes and spending, and between Gingrich and Bill Clinton on welfare and spending, arguably produced reasonable syntheses on thorny political issues.

But zoom out to the level of institutional history, and it is hard to avoid seeing Wright and Gingrich as destroyers. Although they locked horns with presidents, their fury was directed at least as much inwardly, against their own colleagues—and not just those across the aisle. Both men believed that centralization of power within their own office was the best way of serving their parties, which allowed them to rationalize abusing anyone and everyone who could be seen as an independent power center. Their actions ultimately hollowed out the House’s policymaking abilities (which were deeply intertwined with the committee system they trampled over) and helped to ruin the reputation of their institution as a place for anything other than partisan jockeying.

In Wright’s farewell address before resigning as Speaker, he denounced members who had “become self-appointed vigilantes carrying out personal vendettas against members of the other party, in God’s name that is not what this institution is supposed to be about!” He continued: “All of us in both political parties must resolve to bring this period of mindless cannibalism to an end! There has been enough of it.” Fine words, but the period is still going strong almost three decades later. Ultimately, the will to end it is unlikely to come from party leaders, who are cast in the role of field generals in the ongoing war between the parties, thereby securing the loyalty of their infantry. Instead, a new era for Congress will have to come from its members insisting on organizing the institution so as to create meaningful opportunities for action for rank-and-file members, even those in the minority. Tell this to regular members, and they vigorously nod along—before fretting that the defeat of the other party in the next election really is the most important thing. And so, we are destined to spend at least a little longer in the world that Wright and Gingrich made.


Jim Wright, Balance of Power (Turner Publishing, Inc., 1996), p. 39.

Ibid., 228.

Nelson Polsby, How Congress Evolves: Social Bases of Institutional Change (Oxford University Press, 2004), p. 133.

Ibid., 134.

Ibid., 133, 132-136.

Ibid., 135.



The post The Fall of Jim Wright—and the House of Representatives appeared first on The American Interest.

Published on January 03, 2019 09:19

January 2, 2019

After the Crimean Consensus

Russia brazenly captured global attention and three Ukrainian naval vessels in a late-November clash in the Azov Sea, the culmination of months of growing tensions. This event amplified murmurs that President Vladimir Putin, worried by a precipitous decline in his approval rating, stands prepared to escalate the war in Ukraine to distract the Russian populace from economic stagnation and unpopular reforms. While this explanation may feel compelling, it suffers from a critical flaw: Putin’s approval rating has actually been relatively stable for several months now.

At present, pollsters FOM and the Levada Center find Putin’s job approval rating hovering somewhere between 60 and 65 percent. Those figures are down from the neighborhood of 80 percent in spring 2018—a notable drop, but one that has levelled off since July. The same dynamic is observed in trust measures, in which Russians are asked to volunteer the names of politicians they trust. These days, roughly 35 percent name Putin: a drop of 15 to 20 percentage points from spring figures, but also a trend that has stabilized. Perhaps of most acute interest to the Kremlin, protest potential indicators have ebbed somewhat, particularly since the rallies this summer. Russians may remain discontented and jaded, but they are less apt to take to the streets over it. That is critical for Putin’s team, which places a tremendous premium on optics. In short and as usual, rumors of Putin’s imminent political demise are greatly exaggerated.

That said, the Crimean Consensus—the surge in political cohesion following the annexation of Crimea—is now thoroughly over, a change with critical implications for governance in Russia.

The key dynamic that has emerged is what might be termed an increase in political friction: Political actors who would have simply gone along with the agenda handed to them in 2014 are now more willing to bargain or demand concessions. Practically, delivering on policy will now be slightly more difficult for Putin, requiring ample time and sweeteners for stakeholders. While wholly unremarkable in regular democracies, this is a greater challenge in a “managed democracy.” Powerful domestic politics curators in the Presidential Administration apparatus, who have long deserved more attention in the Western press, are going to be a lot busier over the next six years.

Though Russians may continue to see Putin as above politics (albeit less so than in the past), United Russia, the party that essentially represents the status quo, is about as unpopular as he is popular. Though the party dutifully delivered on pension reform, it did so with a notable public defection. More recently, at its annual congress, party members enacted a ban on public criticism of its own policy decisions. United Russia is stuck with a classic principal-agent problem. The party exists to move Putin’s agenda forward in the Duma, but necessarily has its own interests and instincts, survival chief among them. Across the aisle of the Duma, Russia’s “opposition” has taken a more active role after a period of quiescence following the annexation of Crimea. Communist and Liberal Democratic Party leaders Gennady Zyuganov and Vladimir Zhirinovsky received a positive response to their request for regular meetings with Putin. While neither is capable of holding up the Kremlin’s agenda, both they and Putin’s team understand full well that the “systemic opposition”—nominal opposition within the Putinist system—stands to lose credibility as a pressure valve for angry voters if it fails to deliver at least nominal opposition.

Recent gubernatorial elections are a concrete example of this new political dynamic. The recent election in Primorye Krai was cancelled after blatant voter fraud that pushed the local United Russia candidate over the line. Central authorities moved to prevent the “victorious” Communist Party candidate from participating in repeat elections, while the acting governor appointed after the debacle then ran and won as an independent candidate. While no one should expect him to be an oppositional stalwart, he derives some of his legitimacy from his independence, and will have to demonstrate it. While the Communist Party did not contest being muscled out of victory in Primorye, that appears to have been more of a horse trade than acquiescence: In the Siberian republic of Khakassia, the Communist candidate was allowed to take power unopposed after a first round vote. That result spurred Kremlin discussions on running trustworthy opposition candidates in places where United Russia is unpopular to avoid further embarrassment. (It also ended the 17-year Senate tenure of the best haircut in Russian politics.)

Russia’s political friction can also be seen in the Kremlin’s use of local referenda to make a populist end run around traditional politics altogether. A recent campaign saw a series of such votes put to the public. Though perhaps of more cultural significance than political, the strategy marks a means to head off apathy among voters—low turnout makes elections, even falsified ones, feel illegitimate—while avoiding the horse-trading that the use of traditional political channels might require. Popular referenda also have the benefit of keeping United Russia out of the news, its powder dry for further unpopular policies should the need arise. However, a turn to direct democracy has costs of its own, and requires just as much curation as traditional electoral politics do.

Russia is not on the cusp of becoming a pluralist democracy, but its politics are becoming ever so slightly more “normal.” However, in a carefully curated political system, more “normal” means more difficult and expensive to manage. With regard to foreign policy, then, the argument that Putin is about to personally gobble up Ukraine to avoid his imminent overthrow lacks nuance. It is more the case that with the costs of doing political business in the domestic arena increased, Putin and his team may find it easier to engage abroad, escalating where the opportunity arises to achieve the right optics. That likelihood is compounded by Putin’s consistently demonstrated preference for avoiding “boring” domestic politics; he would much rather deal with advanced missile tests than labor issues. The key question looming over this next, and probably last, term is whether he is able to enjoy that luxury.

 


Published on January 02, 2019 09:04

December 31, 2018

Greatness and Goodness: Parting Thoughts on 1968

In the 1990s, Bill Clinton, like fellow presidents George W. Bush and Donald J. Trump a “68er” (all three men graduated from college in 1968), reportedly told the celebrated Martin Luther King, Jr. biographer Taylor Branch that how a person answered a single question—“Do you think on balance the 1960s were good or bad for America?”—could predict with 80 percent accuracy how he or she would vote. Those who said “good”—who saw the decade as a time when civil rights were expanded, authority was questioned, the young marched against an unjust war, and new forms of cultural expression erupted—tended to be progressives and vote Democratic. Those, in contrast, who saw the decade as one of internal chaos, riots in the streets, a rising drug culture, and a lost war, were more likely to be conservatives and vote Republican.

Over the course of 2018, Americans of all stripes have commemorated 1968, that great annus horribilis of modern American history, and the tumultuous events it spawned: King’s assassination, the bloody Tet Offensive in Vietnam, the murder of Robert F. Kennedy, the seizure by North Korea of the USS Pueblo, the riots at the Democratic National Convention in Chicago, the Black Power salute by American athletes at the Mexico City Olympics, and what many saw as the redemptive circling of the moon by Apollo 8 on Christmas Eve.

As Clinton’s archetypical Democratic voters would argue, there can be no question that the United States of 2018 is in many ways a “better” society than that of 1968—fairer, more just, less provincial, more inclusive, less judgmental. The assassination of King and the urban riots that followed reflected the unresolved problem of race in America, as had been documented in the report issued in February 1968 by the Kerner Commission (formed to probe the causes of the urban rioting during the previous year). The feminist movement was still in its infancy in 1968, with inequality and lack of opportunity a universal fact of life for that half of the population that happened to be born female. Homosexuals (the word gay was not yet in general use) were almost universally subject to social ostracism and legal persecution. Other groups—Mexican-Americans, Puerto Ricans, Native Americans, and the poor of Appalachia—were similarly disadvantaged, as Robert Kennedy had emphasized in his brief electoral insurgency. Nothing as yet had been done to rectify the injustices done to the Japanese-American community in World War II. While race, gender, ethnicity, and sexual orientation remain preoccupations of contemporary America, few today would argue, following the election of an African-American president, after Obergefell v. Hodges, and with the electoral gains made by women in the 2018 midterm elections, that the United States has not become a fairer and more inclusive society.

On the other hand, Clinton’s archetypical Republicans might argue, the United States of 1968 was in many respects a “greater” country than the America of 2018: more capable of doing big things, more willing to sacrifice for what was seen as the common good, less of a whiner, more of a risk-taker—a “weary titan,” in Joseph Chamberlain’s phrase, but a titan nonetheless. And in these two sides of 1968—“greater” but not as “good”—lies a question worth pondering as 2018 draws to its close. How is it that the United States of 1968, as unfair and provincial as in retrospect it appears to have been, was able to accomplish things that the more just, more equal, more inclusionary, more cosmopolitan—not to mention bigger, richer, and more educated—America of 2018 finds impossible to achieve or even attempt?

Paying One’s Way in the World

To begin with the obvious, the United States of 1968 paid its way in the world; that of 2018 does not. In 1968 the United States ran a surplus on trade in goods and services of some $2.4 billion, a small percentage of its then GNP of $860.7 billion. Registering a trade surplus in 1968 was a bit like winning the Olympic pole vault—something that Americans had done uninterruptedly since the late 19th century, part of the natural order of things. Today, of course, the situation is very different. The United States last ran a trade surplus in 1975. Most Americans alive today have never lived in a country that for a single year or month of their lives sold more to the world than it bought—that did not every day run down its international assets and increase its external debts by consuming more than it produced. President Trump has broken with decorum by ranting about the deficit and imposing punitive tariffs on friend and foe, but the deficit has persisted and indeed grown somewhat on his watch. It is on track to reach more than $600 billion, or about 3.3 percent of national product (down from nearly 7 percent of GDP in 2006) by year’s end.

To run a balance-of-payments surplus in 1968, moreover, was not merely a mark of economic strength; it was a political and strategic necessity growing out of the Bretton Woods system and the special burdens it placed on the United States. Today’s foreign policy establishment has made a fetish of Bretton Woods, a key part of the so-called liberal international order established after World War II and now said to be under attack by Trump. The reality, of course, is that Bretton Woods in 1968 meant something entirely different than it does in 2018. Today the Bretton Woods institutions, the International Monetary Fund and the World Bank, are home to well-paid international bureaucrats who mostly dispense advice and administer loans to poor or mismanaged countries. These institutions are overseen by government officials who meet on a regular basis, mostly with likeminded officials from other countries, generally in very pleasant spots with the best of accommodations. Except for those countries that suffer the misfortune of having to turn to the fund and the bank, there is no sacrifice, hardship, or even discipline involved with being part of Bretton Woods.

In 1968, in contrast, the real Bretton Woods entailed substantial obligations on the part of the United States, ones that the American political class and presidents in particular took with deadly seriousness. The key to the system was the U.S. commitment in the original IMF Articles of Agreement (since abandoned) to exchange, on demand, dollars held by foreign governments and central banks for gold at the rate of $35 per ounce. The United States had to run surpluses over the long run, lest foreigners cash in their excess dollars, depleting the U.S. gold stock and bringing the system crashing down. The system was in some ways primitive (Keynes had called the gold standard a “barbarous relic,” and the par value system established in 1944 still relied heavily on gold), but it imposed disciplines on countries and limited, for better or worse, the movement of capital by private firms and speculators.

President Lyndon B. Johnson began 1968 by announcing a new set of policy measures—one of many such packages put in place before the Bretton Woods system finally collapsed in the early 1970s—aimed at strengthening what Johnson called the liquidity position of the United States. The government cut its overseas expenditures and undertook efforts to promote exports and attract investment. Even ordinary citizens and the private sector were asked to sacrifice in ways that from today’s perspective seem almost quaint. Johnson urged business and labor (unions had far more power then) to avoid strikes in key industries that could threaten American exports and the trade surplus, while the American people were asked to continue adhering to an earlier request to “defer for the next two years all nonessential travel outside the Western Hemisphere.”

The real Bretton Woods was unsustainable, and no one suggests that it could or should be restored. But its significance is widely misunderstood. Trump rails against foreigners who have cheated the United States in trade deals or by systematically undervaluing their currencies, forgetting that when America was, by his own definition, “great,” it did not publicly whine and complain about the behavior of others, but in fact made great sacrifices to maintain a system seen as vital to our credibility, the stability of the postwar order, and the struggle against world communism.

As for Trump’s liberal internationalist critics, Bretton Woods is mostly an empty slogan. The billionaires who gather in Davos to talk of world order could not have amassed their fortunes in a world of capital controls and fixed currencies. Countries invoke global order, but go their own way when it comes to trade and payments, ignoring the IMF’s fundamental purpose (theoretically still in force) of “limiting disequilibria” in the international balances of payments of its members. Germany and China wrap themselves in the mantle of multilateralism while pursuing ruthlessly mercantilist policies to amass enormous surpluses. As for the United States, it merrily goes on, year after year, importing hundreds of billions of dollars more than it exports, happy to take in those cheap consumer goods and unable or unwilling to build industries that would permit even an occasional balance. In a word, the system works for everyone, but whether it is “great” (or even “good”) is another question.

Immigration: Who Did the Work?

If the America of 1968 was self-sufficient with regard to goods and money, that was also mostly true with regard to people. Immigration was by historic standards quite low in 1968. Annual net immigration was about 450,000, less than a third the current rate, in a country with about two-thirds the current U.S. population. The increased flows and accumulating stocks of immigrants triggered by the Immigration and Nationality Act of 1965 and the later rise in illegal immigration had not yet kicked in. The number of foreign-born in the population was about 9.6 million, compared with nearly five times that level—45.3 million—in 2018. Immigrants made up only 4.7 percent of the U.S. population, nearly a historic low point in the 20th century, while today the share approaches 14 percent and is projected to rise to historically unprecedented levels.

As with Bretton Woods and fixed exchange rates, the immigration regime of the 1960s was untenable, and few argue that it can or should be restored. But the question arises as to how the United States of 1968 was able to bear the burdens that it did and register the achievements of that time while relying chiefly on its native-born population. To be sure, immigrants made a vital contribution to U.S. society. Two of the Nobel Prize winners in science that year were naturalized citizens who had been immigrants: Lars Onsager from Norway, winner of the prize in chemistry, and H. Gobind Khorana from India, who shared the prize for medicine.

There was nothing, however, comparable to today’s situation, when Silicon Valley whines that unless it secures enough H-1B visas to bring scientists and engineers in from India and elsewhere it cannot possibly maintain its competitive position in the world. Much the same applies at the low end of the skill and income scale, where vast numbers of immigrants are needed to staff hospitals and restaurants, work as carpenters and laborers to turn houses into McMansions, and pick our fruits and vegetables.

International Competition: Science and Sport

Speaking of the Nobel Prizes, in 1968, the United States won all of the science awards, continuing its post-World War II dominance. Luis Alvarez of UC Berkeley won in physics, Onsager of Yale in chemistry, and the trio of Khorana (Wisconsin), Robert W. Holley (Cornell), and Marshall W. Nirenberg (National Institutes of Health) in medicine. By the 2000s, the rest of the world had caught up and the U.S. share of the prizes had dropped significantly. Concerns emerged that Americans were losing faith in science and not investing enough in fundamental research; the smartest people, some said, were going to Wall Street rather than into the laboratory.

Still, 2018 was not a bad year for prizes, reflecting the more inclusive society created since 1968. Americans won a respectable 43.75 percent of the three science prizes. That was down from 100 percent in 1968, but one of the winners in chemistry, Frances Arnold of Caltech, was a woman. Arnold had gotten her start in science in the 1970s, studying engineering at Princeton, an institution that in 1968 was still debating whether to admit women.

“History repeats itself, first as tragedy, second as farce,” Karl Marx famously wrote, and so it was in that other great arena of human competition, athletics. In 2018, Colin Kaepernick, a quarterback who started in 58 games over a five-year period in the National Football League, compiling a 28-30 win-loss record, made headlines by signing a multimillion dollar advertising contract with Nike. Kaepernick had won fame (or notoriety, depending on one’s point of view) in 2016 by his refusal to stand during the playing of “The Star-Spangled Banner.” As Kaepernick explained his action, “I am not going to stand up to show pride in a flag for a country that oppresses black people and people of color.”

Nike reportedly was attracted by the interest of identity-conscious American youth, and the company’s stock soared after the announcement of the deal with Kaepernick. One of the ads with the former quarterback featured the lines: “Believe in something. Even if it means sacrificing everything”—except perhaps the $5 million per year that Nike was reported to be paying Kaepernick. Two of the great radical intellectuals of 1968, the philosopher Herbert Marcuse and his star student Angela Davis, would have recognized in this corporate co-opting of a social cause confirmation of Marcuse’s theory of “repressive tolerance.”

The Kaepernick episode inevitably recalled a more famous incident from the 1968 Mexico City Olympic Games, when Tommie Smith and John Carlos, sprinters from San Jose State University, turned toward the American flag and raised their black-gloved fists while the national anthem played as Smith was awarded the gold medal for the 200 meters. (Carlos had taken third in the event, behind Peter Norman, who won the silver). Smith later wrote that the gesture, which produced one of the most iconic images of the 1960s, was not a “black power salute” but a “human rights salute.” Clearly it was both. Norman, a white Australian, joined the American runners in wearing a human rights badge on his jacket.


(Image via Wikimedia Commons)


All three runners suffered repercussions from the protest, and none got rich. Carlos had a brilliant track season in 1969, played briefly in the Canadian Football League, and then worked in California as a high school counselor and track and field coach. Smith, one of the most gifted track athletes of all time, played three seasons in the American Football League and later became the track coach at Division III Oberlin College. In 2010 he reportedly put his gold medal and spikes up for auction, with bids starting at $250,000 (a twentieth of what Nike reportedly pays Kaepernick each year).

Apart from the drama of the black power/human rights salute, 1968 was a year of remarkable athletic achievement for the United States, especially in track and field, the premier event of the games and a proxy for overall competitiveness. Americans won 12 of the 24 gold medals awarded in men’s events, including, in addition to Smith’s victory in the 200, the 100 meters, the 400 meters, the 110-meter hurdles, the two relays, and the long jump. The latter event produced the most stunning performance of the games: Bob Beamon’s leap of 8.90 meters, which smashed the world record by nearly two feet and later was named by Sports Illustrated one of the five greatest sports feats of the 20th century. The winners of these events were all African-Americans, most involved in or sympathetic to the protest movement.

The achievements of white athletes were almost as impressive. They won the pole vault, keeping intact an American lock on the event that had begun in 1896, shot put (Americans had won the event every time it was held since 1896, with the exception of Hitler’s 1936 Berlin Olympics), high jump, and the decathlon. Most remarkable was the achievement of Al Oerter, who again won the discus throw, as he had done in 1956, 1960, and 1964 (becoming the only track and field athlete to win the same event in four successive Olympics), each time improving his throws and outdistancing younger competitors. “I don’t compete with other discus throwers. I compete with my own history,” he once put it.

The United States has remained a great Olympic power, in track and field as well as in other sports, but the results of recent Olympiads show how much the rest of the world has caught up. In the most recent summer games, at Rio de Janeiro in 2016, American men won gold in six of 24 events in track and field, half the level of 50 years earlier. Americans on occasion still win the shot put and the pole vault, but these events are no longer a lock. American sprinters have long been surpassed, improbably, by their Jamaican rivals. Here as well, however, the more inclusive nature of American society (as well as the collapse of the drug-fueled sports machines of communist Eastern Europe) is reflected, as American women actually improved their medal-winning performance in 2016 over what it had been in 1968.

War and Defense

Looming over all the achievements of 1968—the trade surplus, the Nobel prizes, the Olympic medals, and the voyage to the moon—was of course the war. Vietnam ripped apart American society and forever changed the country. In all, more than 58,000 Americans died in the conflict, along with untold numbers of Vietnamese. 1968 was the peak year for casualties, with combat deaths reaching more than 500 a week during Tet. Having been shot down on October 26, 1967, John McCain was in his first full year of what would turn out to be more than five years of captivity, and was just beginning to undergo, in August 1968, the torture that was to damage his body and shape his thinking for the remainder of his life (including his strong opposition to the U.S. use of torture in the “global war on terror”).

By way of contrast, the United States in 2018 was in its seventeenth year of the endless war that had begun with the terrorist attacks on New York and Washington on September 11, 2001. Americans were weary of war, but casualties were down and there were no widespread protests. Total U.S. casualties in Afghanistan, Iraq, and the lesser theaters of the war on terror were approaching 7,000 by the end of 2018—by no means trivial, but still only about 12 percent of those killed in Vietnam over a shorter period and from a smaller population.

The identities of those killed reflected the different ways in which America fought its wars. Of the 58,193 individuals killed in Vietnam, nearly half were aged 20 or under, with the highest number being aged 20 (14,095), followed by 21 (9,705), and 19 (3,103). Twelve were just 17. Those killed in Afghanistan, Iraq, and Syria on balance have been older, especially in recent years as the United States has moved from a direct combat role into an advisory one. Of the 24 Americans who died in the global war on terror in 2018, 14 were in their thirties, one was 42. Only one was aged 20, none was 19 or under. Whereas in the earlier war it was mostly young, inexperienced draftees who were killed, often just days after being thrown into combat, in the war against Islamic terror it was mostly men in their late twenties or thirties, all volunteers in the professional, post-Vietnam military, whose luck finally ran out after tour after tour of combat duty. The ethnic and geographic composition of the casualty lists also reflected the ways in which American society had changed since 1968. In Vietnam there was a perception that blacks died in disproportionately high numbers (which partially fueled the Mexico City protests). Many of those killed, both white and black, were from the big industrial cities of the Northeast and Midwest. In Afghanistan and Iraq, in contrast, those killed were more often from small towns and rural areas in the South and West.

Quite apart from the war, the U.S. military in 1968 was an enormous globe-straddling colossus. The active duty force comprised 3,547,000 men and women (mostly men), with another 2.8 million in the reserves. National defense accounted for 9.8 percent of GDP, or 45.0 percent of the Federal budget. The U.S. Navy had 933 ships on active duty, the Army 23 divisions. These forces were mostly deployed in Europe and Asia, positioned to effect an ambitious national strategy that called for simultaneously fighting and winning two and a half wars, a major conflict in Europe, another in Asia, and a “brushfire” war somewhere else in the world.

The U.S. military of 2018 is still large and operates in dozens of countries, but the force is smaller and more compact: 1,314,000 men and women in the active force, with another 815,900 in the reserves; 42 army brigades (roughly equivalent to ten divisions), and 299 Navy ships. Defense spending in 2018 accounted for 3.1 percent of GDP, or about 16 percent of the Federal budget.

Like the force of 1968, today’s military is strained by years of constant combat and charged with a strategy that it lacks the forces to effect. While no one argues that the United States now needs forces as large as those of 1968, there is general agreement among outside experts and in the Pentagon that after the drawdowns of recent years and with the continued military modernization programs underway in Russia and China, the U.S. ground, air, and sea forces are too small. The Trump Administration has begun a program to expand as well as modernize the force. Whether the money, personnel, and political support for such a build-up can be found on a sustained basis, however, remains very much an open question.

Space and the Race to the Moon

Not least there is space, the “final frontier.” Perhaps the less said about the U.S. manned space program in 2018 the better. Following the retirement of the space shuttle, the United States no longer has a capability to launch astronauts into space. Since 2011 NASA has been entirely dependent on the Russian space agency to send crews to the International Space Station, its only manned space activity. Roscosmos had its own difficulties in 2018, but it managed to dispatch four U.S. astronauts, two on Soyuz MS-08 in March, one on Soyuz MS-09 in June, and another on Soyuz MS-11 in December, to the station. Members of Congress (including the late Senator McCain) bristled at this dependence on a hostile foreign power, and in the course of the year the Trump Administration ramped up plans for an American space renaissance. Others placed their hopes on the private efforts of several of the country’s leading billionaires, notably Elon Musk and his SpaceX and Jeff Bezos with his Blue Origin.

How different was the story in 1968. The Apollo program was a huge government effort, led by NASA, an elite agency then in its prime, which oversaw a vast supply chain of corporate and university contractors and subcontractors. Relying upon computers less powerful than the smartphones that the average American now checks some 200 times a day, Apollo 8 blasted off from Cape Kennedy on December 21. Traveling at an initial speed of 24,200 miles per hour, faster than any previous spacecraft had flown, the three men on board became the first humans to leave Earth orbit and see the planet whole.


(Image: Earthrise, via Wikimedia Commons)


Apollo 11, when it landed on the moon the following July, was the greater engineering feat, but Apollo 8 was arguably the more significant event in long-term perspective. It marked a fundamental change in how humans viewed their planet, the “tiny blue marble” about which Johnny Cash and Waylon Jennings later would sing and which astronaut William Anders captured in the famous photograph Earthrise, taken as the spacecraft made its fourth orbit of the moon on Christmas Eve. The vague religious sentiment that many felt at the time and that was captured in the lyrics of the song was shared by the astronauts themselves, who expressed it by reading to a vast worldwide television audience the first part of the Hebrew creation story in Genesis 1.

Atheist Madalyn Murray O’Hair filed a lawsuit complaining that this reading by government employees represented an unconstitutional government promotion of religion. The courts sidestepped the question by dismissing the case on jurisdictional grounds. NASA subsequently curtailed any overt connection with religion on its missions, but few seemed to mind the reading at the time, which in retrospect appeared more as an expression of mid-American corniness than a serious threat to the separation of church and state.

Apollo 8 began its trip back to Earth on Christmas Day. The command module, all that remained of the massive apparatus launched some six days earlier, splashed down in the Pacific south of Hawaii on December 27 and was quickly recovered by the fabled USS Yorktown, one of the 23 aircraft carriers in the U.S. fleet at that time (today there are 11).

There was no break, of course, in the fighting in Vietnam. During the seven days of the mission, 147 Americans were killed in Southeast Asia, and another 99 in the remaining four days of the year. This brought the total for 1968 to 16,899, an average of 46 per day. 

Another Time, Another Country

So that was America in 1968, a country not always good but often great—in what it achieved and attempted, in what it asked of its people, and in what they delivered.

Many of the quantitative differences between 2018 and 1968 are striking. The decreased expenditures in 2018 on defense, as well as lower spending on foreign aid and space, along with the shift from an international payments surplus to a deficit, represent a swing of more than 9 percent of GDP, money that presumably now goes to increased consumption, higher expenditures on health care, and servicing of the external debt that began accumulating in the 1980s. President Trump wants to make America “great again,” but it is unlikely that he (or any other president) could engineer a 9 percent shift in GDP that would pay for investments to make this happen—to replace the military’s worn-out equipment, to fix crumbling national infrastructure, to return to space in a big way, or to tackle the crises in education, opioid addiction, and declining life expectancy.

Apart from the changes that leap out from the quantitative indicators, there are many qualitative differences between American society in 1968 and that of 2018 about which sociologists, historians, cultural commentators, and poets and novelists have written. The United States of 2018 is clearly more diverse, more urban, and less religious than that of 1968. Monetization (a word with a different connotation in 1968) and professionalization are striking features of America in 2018. With jobs in manufacturing and other industries increasingly scarce or ill-paying, and with Silicon Valley providing the facilitating technology, individuals now monetize their every asset—their cars with Uber and Lyft, their spare rooms with Airbnb, and their time through various micro-monetization schemes.

Professionalization of everything from military service to fundraising is now a ubiquitous feature of American life, a product of the intensifying competition that has come with globalization and the MBA culture introduced in the 1980s. In general, this has been a good thing. Americans enjoy better services (think of how today’s modern dental practices differ from the one-man offices of the past). For the most part we no longer rely on 19- and 20-year-olds to fight our wars as we did in 1968.

All this has come at a cost, however. Large organizations spend vast amounts of time on unproductive strategic planning, while governments and corporations have become increasingly risk-averse (without, ironically, really dealing with risk, as witnessed by the 2008 financial crash). No government agency today would have launched Apollo 8, which was changed at nearly the last minute from an Earth orbit to a more ambitious mission to the moon, and which the astronauts themselves recognized as a high-risk venture they might not survive.

The internet is a marvelous invention that has improved many aspects of life from what they were in 1968, but in some ways it has made America both more frenetic and more boring. The poet Jorie Graham complains that her students at Harvard live in a two-dimensional world: if it is not on the screen, it is not real. For assignments, she makes them rise early and listen to birds—the only way for them to understand Keats’s “Ode to a Nightingale.”

Despite Hollywood’s best efforts to create heroes and martyrs that will sell tickets or spur downloads, many of our celebrities are neither interesting nor inspiring. In 1968, Mel Pender was part of the 400-meter relay team that won the gold medal and broke the world record in Mexico City. After joining the U.S. Army at age 17 to escape the discrimination and lack of opportunity in his native Georgia, Pender spent 11 years as an enlisted man in the 82nd Airborne. On Okinawa in 1964, someone in authority recognized from his play on an Army Ranger football team that Pender was fast—very fast. He was asked to take part in a friendly meet with the Japanese track team as it prepared for the 1964 Tokyo Olympics and then, as a result of his performance in this competition, began at age 25 an amateur running career and competed in the 1964 Olympics.

The games over (he made the finals of the 100 meters and placed sixth), Pender went back to his duties in the Army. In 1968 he was pulled out of combat in Vietnam, with no advance notice, shipped back to the states and put on the Olympic team, where he won his medal and helped break the world record. The games over, he went back to the Army and to Vietnam, where he won a Bronze Star for combat to go with his gold medal.

For better or for worse, stories such as this simply do not happen anymore. Today, when coaches from big-money-fueled college sports teams scour even elementary schools looking for talent and when parents plot how their toddlers might someday make the pros and reap the vast financial rewards on offer, it is unlikely that one of the world’s six fastest runners would be an undiscovered enlisted man in the U.S. Army, or, even more improbably, be sent back into combat after having been discovered.

The Question

And so we are back where we started: How was the smaller, poorer country of 1968, one that did not fully tap into the talents and energies of a large part of its population, able to accomplish what it did? How is it that the United States of 2018, which draws on the contributions of a larger and more diverse population and which every year takes in 1.5 million people and $500 billion in borrowed money from the rest of the world, can no longer accomplish things that it did in 1968?

At least in theory, this question should be answerable. It could be addressed by a team or teams of economists, political scientists, demographers, sociologists, historians, and subject matter experts in a university setting or at a think tank. Mancur Olson’s work on the rise and decline of nations and the role of social rigidities would be relevant, as would Francis Fukuyama’s recent work on political dysfunction and decay. Works on imperial overstretch, such as those by Paul Kennedy and Robert Gilpin, could be cited. Added to these would be more specialized literatures on immigration, trade, social solidarity, and political leadership.

The possible reasons for relative decline in capabilities and ambitions could then be analyzed and quantified. As simple indicators like Nobel prizes and Olympic medals suggest, the catching up and increased competition from the rest of the world are factors, but these would need to be disentangled from purely domestic causes (like decreased investments in science or fewer physical education classes). The effects of immigration and diversity would need to be considered, with the benefits of a larger and more diverse population weighed against such factors as depressed wages and the decline in solidarity that results as societies become less homogeneous.

The achievements of 1968 itself would have to be held up to critical scrutiny: to what extent was the United States in those years living off of past investments—ships, for example, such as the Yorktown that were built during World War II and that could only be replaced at vastly greater cost in a peacetime economy, or the gold stock that slowly was being run down in the battle to sustain Bretton Woods? The environmental movement was in its early days in 1968, and the United States of that time was still building up an environmental debt to the future that it has begun to repay only in recent decades (although not yet in the crucial area of carbon emissions, where U.S. policy in 2018 was still deadlocked and ineffective).

Various theories about turning points—about what exactly went wrong and when—would have to be examined, such as Thomas Frank’s polemical thesis that things went off the rails in the 1970s, when the Democratic Party abandoned its traditional role as the champion of the American working class in favor of affluent professionals, or Paul Volcker’s arguments that the turn came in the 1980s, when Ronald Reagan began piling up vast deficits and systematically denigrated the role of government. Studies of growing income inequality, the decline of the middle class, and the role of big money in politics and other areas of national life would need to be considered, as would the conservative counterargument that it was precisely when the country strayed from Reagan’s orthodoxy of small government, family values, and strong defense that things took a turn for the worse. Simple as well as complex theories would need to be examined, such as Andrew Bacevich’s longstanding contention that what ails America is less any grand combination of structural factors than a set of bad decisions, chiefly in defense and foreign policy, that officials of both political parties have made over the last 25 years and that they stubbornly refuse to admit or correct.

Differences in the structure of international politics would need to be considered. The United States in 1968 faced what it thought was a single mighty adversary—international communism—but it also had strong and purposeful allies. The powerful West German army stood alongside the U.S. Army in the defense of Western Europe, while the United Kingdom still had primary responsibility for Western security in the vast region East of Suez. (Another milestone of 1968: it was in January of that year that UK Foreign Secretary George Brown informed a dismayed Secretary of State Dean Rusk that Great Britain would be withdrawing from this region over the course of the next three years, leaving what U.S. officials saw as a security vacuum in the vast area between Egypt and Singapore.) The United States went on to capitalize on the split in international communism between China and the Soviet Union and outlast the latter in the Cold War, but today it faces a more diffuse and complicated set of challenges: an aggressive and reviving Russia, an ascendant China, chaos in much of the area “East of Suez,” and an array of restive allies with mostly weak militaries that are also economic competitors.

If done correctly and with the proper scope, such a study might produce a net assessment of where America is and where it has been—one that would take account of changing values, of increases in consumption and health and the many new technologies and inventions that Americans enjoy today, but also examine the decline in national “greatness” that many today lament.

Until then, debate on these questions is likely to remain highly political, informed more by ideology than by facts. Governor Andrew Cuomo of New York staked out one position when he stated, in August 2018, that America “was never that great.” This view reflects the sentiment, something of a reigning orthodoxy both in academia and increasingly in today’s identity-conscious Democratic Party, that “greatness” and “goodness” cannot be separated—that a society that did not from the beginning provide equality to women, people of color, and other groups could never have been great. On the other side of this debate is the barely disguised nativism of many Trump supporters, who argue that America was great and can be made so again, but imply or even declare outright that this can only be done by returning to the exclusivity of the past. Both sides, albeit for different reasons and from different directions, reject Samuel P. Huntington’s argument that it was the culture of the “Anglo-Protestants” who settled North America that made the United States great, but that this culture and its values need not be tied to any particular ethnicity and can be upheld and reaffirmed in a more diverse America.

In the middle, of course, are the moderates, the establishment voices who reject populism, be it from the Left or Right. In opposition to the Left, they argue that America was great and that to argue otherwise is perverse. But they also reject the Trumpian Right’s argument that America is no longer great. In their view, it is and can remain great, but under one proviso: that the right sort of people, the meritocrats who earned their positions in life with the highest test scores and by studying at the best schools, are again put in charge. Judging by the debate over meritocracy that erupted at the end of 2018, it is not clear, however, that this position can sustain itself, either in the United States or in Europe, from assaults by populists on either side.

On January 20, 2019 the United States will mark the 50th anniversary of the inauguration of Richard M. Nixon, a president whose election was ensured by the turmoil of 1968. Nixon moved to address many of the strains that the United States was under at that time—ending Bretton Woods, winding down the Vietnam War, ending conscription, and forming a quasi-alliance with China against the Soviet Union—but he of course brought with him his own turmoil. Debates on greatness and goodness, however they are defined, thus are certain to continue into the New Year. But 1968 likely will always have a special significance, and with good reason.


Remarks by Taylor Branch, Library of Congress Madison Council luncheon, April 2, 2014. Branch’s conversations with Clinton are recounted in his The Clinton Tapes: Wrestling History with the President (Simon & Schuster, 2009).

For a comprehensive catalog of the turmoil of 1968, both national and international, see Niall Ferguson, Kissinger, Vol. 1 (Penguin Press, 2015), pp. 786-788.

Report of the National Advisory Commission on Civil Disorders (U.S. Government Printing Office [USGPO], 1968).

Economic Report of the President, 1969 (USGPO, 1969), Table B-84, p. 324, and Table B-1, p. 227.

Chapter 5 of the Economic Report of the President, 2018 (USGPO, February 2018) contains an especially thorough discussion of U.S. international trade and payments performance over time, albeit one formulated to defend the Trump Administration’s aggressive new approach to trade.

Economic Report of the President, 1968 (USGPO, February 1968), p. 15.

U.S. Bureau of the Census, Statistical Abstract of the United States, 1969 (USGPO, 1969), p. xiii. This figure was itself a sharp increase over previous years; in 1965 it was under 300,000.

See Migration Policy Institute, Frequently Requested Statistics on Immigrants and Immigration in the United States, February 8, 2018. Additional data can be found at https://www.numbersusa.org, a group that campaigns for lower levels of immigration.

Tommie Smith, Silent Gesture: The Autobiography of Tommie Smith (Temple University Press, 2007); Jack Buehrer, “Olympics Black Power Heroes Are Still Waiting for an Apology,” The Daily Beast, August 4, 2016; Allen Barra, “Fists Raised, but Not in Anger,” New York Times, August 22, 2008.

All results at https://www.olympic.org/mexico-1968/athletics.

Tobias Oelmaier, “The Perfect Jump: 50 Years On,” DW (Deutsche Welle), October 18, 2018.

Casualty figures from https://www.militaryfactory.com/vietnam/casualties.asp and www.virtualwall.org.

Data from https://thefallen.militarytimes.com.

Naval History and Heritage Command; Stacie L. Pettyjohn, U.S. Global Defense Posture, 1783-2011 (RAND, 2012).

Office of the Under Secretary of Defense (Comptroller), Defense Budget Overview: Fiscal Year 2019 Budget Request, February 2018.

David Ochmanek et al., America’s Security Deficit: Addressing the Imbalance Between Strategy and Resources in a Turbulent World (RAND, 2015).

Details about Apollo 8 are at https://www.nasa.gov/topics/history/features/apollo_8.html. The definitive account of the mission is Robert Kurson, Rocket Men (Random House, 2018).

Personal conversation with author, Washington, DC, December 2018.

George Banker, “Soldier-Athlete Mel Pender, Olympic Gold Medalist,” Runner’s Gazette.

Mancur Olson, The Rise and Decline of Nations: Economic Growth, Stagflation, and Social Rigidities (Yale University Press, 1982); Francis Fukuyama, Political Order and Political Decay: From the French Revolution to the Present (Farrar, Straus and Giroux, 2014).

Paul Kennedy, The Rise and Fall of the Great Powers (Vintage Books, 1989); Robert Gilpin, War and Change in International Politics (Cambridge University Press, 1981).

Thomas Frank, Listen Liberal: Or, What Ever Happened to the Party of the People? (Metropolitan Books/Henry Holt, 2016); Paul Volcker, Keeping At It (Public Affairs, 2018).

Andrew J. Bacevich, “It’s Time for David Brooks to Reckon With David Brooks,” The Nation, February 23, 2017.

Memorandum of conversation, January 11, 1968, Foreign Relations of the United States, 1964-1968 (USGPO, 2001), Vol. 12, p. 608.

Aaron Blake, “Andrew Cuomo says America was never that great, drawing gasps,” The Washington Post, August 15, 2018.

Francis Fukuyama, “Huntington’s Legacy,” The American Interest, Vol. 14, No. 2 (November/December 2018).

Anne Applebaum, “A Warning From Europe: The Worst Is Yet to Come,” The Atlantic, October 2018; David Brooks, “The Rise of the Resentniks,” New York Times, November 15, 2018; Andrew J. Bacevich, “When David Brooks’ Dreams Don’t Work Out,” The American Conservative, December 3, 2018.



The post Greatness and Goodness: Parting Thoughts on 1968 appeared first on The American Interest.

Published on December 31, 2018 08:06

A Jew’s Guide to New Year’s Eve

It’s one thing to be a Jew on Christmas in a majority-Christian land like the United States, an incongruity made famous (infamous?) in pop culture all the way from Adam Sandler songs to South Park episodes. But it’s another to be a Jew on New Year’s Eve and Day. Pop culture offers nothing to help on that score. So what’s that about then?

As everyone knows, the evening of December 31 is New Year’s Eve. And that’s right, if one reckons by the common calendar, used virtually worldwide these days thanks to the antique successes of European imperialism. And the coming year is 2019. But why is December 31 New Year’s Eve? And why is the next year the number 2019?

If you’re like most normal, historically oblivious Americans, this question simply does not come up. If it ever does, the most popular answer is easy to predict: “It’s New Year’s Eve on December 31 because it just is and always has been, and the year ahead is 2019 because it follows 2018, you big dummy—so what the deuce are you talking about?”

Well, okay (and I’ve been called a lot worse). But just a moment’s reflection can convince even the densest person, sober or not, that, no, it hasn’t always been this way, “always” being a pretty darned long time when pointed backwards as well as forwards. Here’s a very short history of the matter.

In 46 or 45 BCE, Julius Caesar established January 1 as New Year’s Day even as he introduced a new calendar that was far more accurate than the one Rome had been using up to that point. The old calendar had only 304 days, divided among just ten months. Not good if you want to reconcile solar and lunar cycles, or have years that are roughly symmetrical astronomically from one to the next. January took its name from the Roman god of doors and gates, Janus, who had two faces, one looking forward and one looking backward. This was a terrific idea.

Before long, Roman pagans began marking December 31 with drunken orgies. They rationalized their debauchery by claiming that it constituted a solemn re-enactment of the chaotic void that existed before the gods brought order to the cosmos. Even way back then, people made up excuses to party hard and have sex with people whose names weren’t particularly important at the time. It’s good to know that some things don’t change.

But December 31/January 1 did not remain the start of the year for long. As Christianity spread, and then became the official religion of the Roman Empire in 380 (Constantine granted toleration to Christianity in 313, but it was left to Emperor Theodosius to do the deed for which Constantine is often credited), pagan holidays were either incorporated into the Christian calendar or abandoned. In the case of January 1, it was incorporated, very conveniently becoming the Festival of the Circumcision. Yes, that’s right: if you count inclusively from December 25 to January 1 you get eight, as in the eighth day, on which Jewish law prescribes circumcision. That was painfully obvious to fourth-century Christians.

January 1 thus became an important day in early Christianity, but not as New Year’s Day. The Festival of the Circumcision came to symbolize the triumphal rise and reign of Christianity and the would-be death of Judaism—the supersession of the Church over the Jews as God’s chosen people. By the time of the Council of Nicaea in 325 CE, this interpretation was standard fare, and seems to have been formally ratified theologically at that Council.

Now, it so happens that the Pope at the time, whose name was Sylvester, convinced Constantine to prohibit Jews from living in Jerusalem. At the Council of Nicaea, too, Sylvester promulgated a host of new anti-Semitic legislation. Sylvester became a saint in the Church for this and other achievements, and his Saint’s Day is (you guessed it) January 1. That’s why Israelis today call the secular New Year’s Eve revelries and New Year’s Day (since Jews mark the beginning of a day at sunset) “Sylvester.” (Why they do this I’m not sure, since Sylvester was sort of an ass from a Jewish point of view, and since in medieval Europe the night of December 31 was often reserved for synagogue and Hebrew book burnings, torture, and standard-issue murder-for-sport. I think the term was imported from Eastern Europe, where it is still used in some places.)

But already by that time, as I have suggested, January 1 was not New Year anymore. That connection was still associated with Caesar’s pagan Rome, and Christians wanted to separate themselves from that unenlightened, pre-Gospel, pre-Christian time. So Christian Europe regarded March 25, Annunciation Day, as the beginning of the year. That made sense because it was near the vernal equinox, the new year for many of the European tribes the Church sought to convert.

The one exception worth noting, starting in the 11th century, was England.

William the Conqueror was crowned King of England on December 25, 1066, and at that time (his transition team was very efficient) he decreed that January 1 should once again be the New Year. He thus ensured that, with Jesus’ birthday aligning with his coronation, Jesus’ circumcision would start the new year and symbolize the supersession of the Normans over the earlier Saxon inhabitants of Britain. He tried, in other words, to make the calendar of Christian Norman England align with his personal biography.

This was very clever, but William’s innovation eventually lost favor. England’s Catholic clergy in time realigned English custom to fit that of the rest of the Christian West. March 25 was to mark the new year, and there it remained for roughly half a millennium.

Then, in 1582, Pope Gregory XIII moved it back to January 1. As with Caesar, the occasion was the introduction of a new calendar, the eponymous Gregorian one used today. The problem with the Julian calendar, as is well known, is that its slight inaccuracy caused Easter to creep too far back from the vernal equinox, at a rate of about one day every 128 years. That creep had amounted to ten days by the time Gregory became Pope, really screwing up the religious calendar and the general sense of right-fittedness as well. You’ll be wanting lilies and daffodils blooming on Easter, not a foot of snow on the ground.

The Pope based his new calendar on the day, 1,257 years earlier, when the Council of Nicaea convened on the vernal equinox: March 21, 325. Otherwise, the vernal equinox in 1582 would have fallen on March 11, way off from where the sun and stars were supposed to be for an equinox. He kicked the calendar ahead ten days, turning the day after October 4, 1582, into October 15, 1582, and January 1 again became the New Year.

Except in England and, by extension, in its colonies. The English resisted the change, not because they were still ticked at William the Conqueror’s vanity, but for reasons having to do with the Reformation and thinking it apposite to resist the Pope’s authority and all that. The Gregorian calendar did not win adoption in England, and hence in America, until 1752, and oh what a mess that caused. As one can learn from Ben Franklin’s Almanac of that time, to get the math to work out, 1751 consisted of only 282 days, from March 25 to December 31. The year 1752 began on January 1, but 11 days then had to be dropped in September (the day after September 2 became September 14) to catch up with the Gregorian count, so 1752 had only 355 days. I’m sure this is the origin of the wild drunkenness in Britain and America on New Year’s Eve. How else was a person to cope with such disturbing stuff? (This explanation does not apply to the Irish.)
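For the arithmetically inclined, the day counts above are easy to verify. Here is a quick sanity check in Python; modern date arithmetic is perfectly adequate for counting spans of days within a single year:

```python
from datetime import date

# In England the civil year 1751 began on March 25 (Lady Day) and,
# under the calendar reform, ended early on December 31.
days_in_1751 = (date(1752, 1, 1) - date(1751, 3, 25)).days
print(days_in_1751)  # 282

# 1752 was a leap year (366 days), but 11 days were dropped in
# September to align with the Gregorian count.
days_in_1752 = 366 - 11
print(days_in_1752)  # 355
```

Two hundred eighty-two days for 1751 and 355 for 1752, just as Franklin's readers experienced it.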

What does this have to do with the Jews? Well, back on New Year’s Day 1577 Pope Gregory decreed that all Roman Jews had to attend a Catholic conversion sermon given in Roman synagogues after Friday night services. The penalty for skipping out was death. Then, on New Year’s Day of the next year, the Pope signed a law forcing Jews to pay for a “House of Conversion” whose purpose was to convert Jews to Christianity. Talk about adding insult to injury.

The House of Conversion did not work out so well for the Pope, however, so on New Year’s Day 1581 he ordered the confiscation of all the Roman Jewish community’s Hebrew scrolls and books. That caused a lot of violence; the Jews took it in the neck, as usual, when, virtually unarmed, they faced a vastly superior state-backed force.

Does any of this matter anymore? Very few Jews know this history, whether they live in Israel, America, or anywhere else. Very few non-Jews in the United States and Canada associate New Year’s Eve and January 1 with Pope Sylvester or with the Festival of the Circumcision, or know that January 1 became New Year’s Day in British North America only in 1752. Indeed, the whole shebang is presumed by most Americans to be wholly secular in nature, having nothing to do with any church calendar (Catholic and Anglican, anyway) going back some 1,680 years. Well, duh: Did either Guy Lombardo or Dick Clark seem like a religious type to you?

Except that, as ought by now to be clear, New Year’s origins very much do go back to church calendars (and to pre-Christian religious rituals, as well). Besides, if New Year’s Eve/New Year’s Day really were secular in origin, they could not be much older than a few centuries, in other words, not older than the notion of secularism itself. That sure contradicts the “always” premise, now doesn’t it?

I guess it comes down to this: If you join in the revelry of New Year’s Eve, you can do it for any number of historically appropriate reasons. You can do it because of Julius Caesar and the gods’ turning chaos to cosmos (perfect for pagans), you can do it in memory of Pope Sylvester (just right for anti-Semites), you can do it to commemorate William the Conqueror (tailor-made for Anglophiles), you can do it to mark the advent of Pope Gregory’s much improved calendar (terrific for math/science/astronomy buffs), you can do it to celebrate Jesus’ bris (my personal favorite), or you can do it just because it’s a convenient pretext to get hammered (everyone else’s favorite, judging by all appearances).

So, whatever your reasons, Happy New Year!


The post A Jew’s Guide to New Year’s Eve appeared first on The American Interest.

Published on December 31, 2018 07:24

December 30, 2018

The EU Can’t Fulfill Its Purpose

The EU has outlived its purpose as an ordering force in Europe. It is incapable of addressing the historical challenge facing the West: its rising geopolitical competition with revisionist powers. It failed to radiate security on its frontiers to the east and south. And it has proven too weak to keep in check the unilateral policies of its largest member, Germany. In brief, the EU cannot compete, cannot secure its borders, and cannot keep Europe’s balance.

While the U.S. has finally awakened to the necessity of competing with rival great powers such as China and Russia, the EU is stuck in a post-modern daze. In that worldview, the main threats are not to the wellbeing of a polity but to what is seen as a higher purpose of international politics: multilateralism. As Federica Mogherini, the EU High Representative for foreign policy, affirmed in a September 2018 speech, the “priority of our work will be to strengthen a global network of partnerships for multilateralism.” (my emphasis) Multilateralism is the foreign policy goal, not just a means; process trumps outcomes.

Such confusion of goals and means is understandable, perhaps. EU members rarely agree on foreign policy, leaving ambitious EU politicians with little to do except to push for process and more process. Multilateralism—endless paeans to dialogue and understanding—thus becomes the objective that few oppose because at first sight it does not appear to be noxious. What can be so bad about a slew of meetings in Geneva or Paris among global partners working to enhance multilateralism?

In reality, when promoted with the relentless enthusiasm that Mogherini has displayed over the past several years, a foreign policy that sets multilateralism as a purpose rather than a means produces a worldview dangerously detached from reality. Any agreement is good simply because it is a multilateral agreement, and any opposition to such an agreement is greeted as the ultimate sin. Hence, the JCPOA (the “Iran deal”) and the Paris climate accords are to be protected at all costs, even if rival powers like Iran or China disproportionately benefit from them. The U.S. withdrawal from both arrangements was perceived by EU elites as a greater menace than Chinese predatory economics or Iranian military expansion in the Middle East because it undermined multilateralism.

This inability to think in competitive geopolitical terms has also weakened the EU’s frontiers. This is particularly, and tragically, visible in Ukraine, where the EU demonstrated that it was both unwilling and unable to fight for its enlargement. The EU could promise the extension of its borders, raising the hopes of Ukrainians eager to shed the corrupt ways of their leaders, but it could not admit that it had to compete for it. The Western conceit was that the benefits of an integrated market, of a borderless area, of diluted sovereignty, and of transnational rules were self-evident and equally appealing to all. All the EU had to do was to ensure that the rules were followed by the aspirant nations, and even the most recalcitrant opponents in and out of Ukraine (including in Moscow) would sooner or later realize the futility of their opposition. Enlargement of the European project required patience and managerial stamina, but not a martial spirit and willingness to compete.

The outcome was tragic. In 2014 Russia sent “little green men” and artillery; the EU, its progressive narrative of History. And so far Foucault is losing to Kutuzov.

As the Iranian threat grows and an arc of instability from North Africa to the Middle East continues to burn, Europe is discovering it also has volatile frontiers to its south. But beyond calls for more multilateralism, the EU does not offer much succor to its most affected members in the Mediterranean basin. Individual EU members, notably France and Italy, are left to pursue their own, often contradictory and clashing, policies in North Africa seeking to mitigate the security problems emanating from there.

Finally, the EU also served the purpose of freeing Europe from the historical vagaries of its internal balances of power. Its member states were, of course, never going to be equal in power, but EU institutional mechanisms would, so to speak, transcend power imbalances. In blunter terms, the EU promise was that Germany would not dominate Europe; Germany would become Europeanized rather than Europe Prussianized.

The EU has succeeded, but only to a degree. Berlin is not Europe’s capital—indeed, Europe has no capital. But German power is not containable by the EU and, after Brexit, will be even less so. The 2008 financial crisis showed the debt-ridden Southern European states that German power is decisive and that opposing it in financial matters is futile. More disturbingly, Berlin has no qualms about pursuing policies that undermine the security of other EU members. Two policies in particular are worth recalling. In 2015 Chancellor Merkel opened Germany’s borders to what turned out to be hundreds of thousands of refugees. While she won widespread international admiration from Bono and the UN, her own electorate began to have serious doubts. And other European countries, at the forefront of the migration crisis, resented the unilateral German decision, which immediately affected them and over which they had no say.

The second decision was to strike a dubious deal with Russia to build a second gas pipeline (Nord Stream 2) that has no economic value but enormous geopolitical consequences: by not having to cross Ukraine and Central Europe, Russian gas can be delivered directly to German industries while Moscow can threaten to cut off supplies to states deemed by it to be part of its sphere of influence. This German decision abets Russian imperial aspirations toward Ukraine as well as toward EU member states in Central Europe. Berlin may speak highly of the EU, but wields its power as it wishes, in open disdain of other EU members.

The harsh reality is that the European Union was a project that may have had a chance in a benign geopolitical environment. In a competitive world, with antagonistic external powers and growing internal imbalances, the EU is failing. European states have to figure out on their own and through alliances (especially with the United States) how to keep rivals out and Germany more cooperative. The EU won’t do it for them.


The post The EU Can’t Fulfill Its Purpose appeared first on The American Interest.

Published on December 30, 2018 06:00
