Chris Hedges's Blog
July 22, 2019
McDonald’s Loves Exploiting Our Public Schools
Corporate America is looming larger and larger in U.S. public schools. That’s not a good thing for educators, students, or workers.
Nowhere is this clearer than in the case of McDonald’s, whose founder once scouted locations for new stores by flying over communities and looking for schools. The fast food giant pioneered methods of attracting schoolchildren to its stores — from Happy Meals to marketing schemes like McTeacher’s Nights.
McTeacher’s Nights have become almost commonplace in many parts of the country. Here’s how they work.
Teachers and other public school employees prompt students and parents to eat at their local McDonald’s on an otherwise slow night. The teachers then volunteer their time behind the cash register, serving students and their families junk food, while McDonald’s workers are often told not to come in for their shifts that night.
A small amount of the proceeds — about $1 to $2 per student — then goes back to the school.
Many students have grown up with these seemingly innocuous fundraisers. Hundreds, if not thousands, happen across the U.S. each year, according to Corporate Accountability and the Campaign for a Commercial-Free Childhood.
Meanwhile, thanks to gross underfunding of public schools, such fundraisers get less scrutiny than they should. Beyond the obvious problem of enlisting teachers — the people children trust most, next to their parents — to serve young people junk food, there’s also the issue of labor rights.
Teachers are already woefully underpaid for the service they provide our communities. McTeacher’s Nights engage these teachers to volunteer additional hours, often displacing low-income McDonald’s workers in the process.
What results is what one former McDonald’s CEO described as philanthropy that’s “99 percent commercial” in nature. What do we call it? Exploitation.
Teachers need to be standing in solidarity with McDonald’s employees, not at cross-purposes with them. Those workers are our students, our family members and our neighbors. For their long hours working on their feet, they are often paid poverty wages.
And as a recent report from the National Employment Law Project finds, the corporation is failing in its legal duty to provide employees a safe work environment. Dozens of women from California to Florida have filed complaints alleging sexual harassment by supervisors and co-workers in McDonald’s stores and franchises. And thousands of workers in 10 cities walked off the job to protest these abuses.
In the education field, we know the importance of a strong union to prevent abuses like these. Yet McDonald’s has been accused of union-busting, and even firing employees for attending Fight for $15 rallies to raise the minimum wage.
That’s why more than 50 state and local teachers unions have signed an open letter challenging McDonald’s CEO Steve Easterbrook to end McTeacher’s Nights. And this year, the American Federation of Teachers (AFT), representing 1.7 million members and 3,000 local affiliates, adopted a resolution rejecting all corporate-sponsored fundraisers for schools.
It’s time for McDonald’s and other corporations to stop exploiting our schools, children, and their own workforce. Until they do, we will continue to stand with McDonald’s workers in their fight for a living wage and a safe workplace — and for teachers fighting for the funding their local schools need.

Breaking Stories About a Tech Giant May Have Cost This Reporter His Job
Three years after tech reporter Christopher Calnan was terminated by the Austin Business Journal, he received legal threats from its parent company over the very skill he had been hired and praised for in the first place: his ability to break news about the powerful Austin-based Dell Technologies Inc.
“They recruited me because they were having a hard time getting any real, breaking news at all,” Calnan said. “Even in the interview, they asked me if I could break anything on the company, because Dell was the big company in Austin.”
Calnan had worked for the American City Business Journals (ACBJ)—a company that runs 43 local business news outlets across the country—for 11 years, and was recruited to the Austin Business Journal (ABJ) after three years of covering technology at ACBJ’s Boston-based Mass High-Tech.
ACBJ is owned by Advance Publications, Inc., perhaps best known as the parent company of Condé Nast, which owns 18 publications, including the New Yorker, Wired and Vogue. Advance is also the majority shareholder of the social media/news aggregator site Reddit. Advance was founded by the Newhouse family, which funds the S.I. Newhouse School of Public Communications at Syracuse University, one of the most renowned journalism schools in the country.
Michael Dell (Hartmann Studios/Oracle)
Calnan’s issues with Advance and Dell began in 2015, when he wrote an article for ABJ about Dell CEO Michael Dell accepting an award from the environmental group Keep America Beautiful. The article pointed out that Dell had paid $75,000 to sponsor tables at the luncheon where he was presented with the award.
Calnan said that during his reporting process, he reached out to Dell for comment and received no objections to the story. But after returning from a vacation, Calnan found the story removed from ABJ’s site. He was asked to delete the accompanying tweets.
“When I came back into the office, my publisher told me Michael Dell personally called our corporate office, and two weeks after, I was called in for a surprise performance review and I was told that my job was in jeopardy,” Calnan said.
Calnan said he believes the performance review served to cover up the true reason behind his ultimate termination: corporate censorship by ABJ for fear of Dell’s wrath.
“It was all handled behind closed doors, and they tried to cover it with the annual review,” Calnan said. “I hadn’t had an annual review in over three years.”
In fact, prior to this incident, Calnan had received glowing reviews from his higher-ups and the business news community. In 2010, he was awarded ACBJ’s Eagle Award for Excellence for 2009, recognizing his reporting prowess and work ethic. In a 2012 performance review, he was described as “one of the most professional reporters we know,” whose ethics were “above reproach.”
But in 2016, Calnan said, Dell denied him press access to the company’s annual conference, DellWorld, with a spokesperson even telling his editors that the company would credential anyone else from ABJ.
“They wouldn’t give me press credentials to DellWorld, and my editors didn’t even object to that,” Calnan said.
Calnan was terminated that same year, after he wrote an article about Dell’s plan to move the 2017 conference from Austin to Las Vegas and highlighted that the move would cost the Texas city millions in annual revenue.
Upon termination, Calnan was offered an $8,500 nondisclosure agreement, which he refused. He moved back to Boston and later published articles on his website, ChristopherCalnan.com, continuing to write about Dell’s censorship tactics and his own experience with them.
Paul Sweeney, an Austin-based tech reporter of 30 years, spent months researching and reporting on Calnan’s story and Dell’s censorship tactics for the Texas Observer. Two years later, the article’s draft remains unpublished.
“The editor at the Texas Observer didn’t feel it was strong enough,” Sweeney said.
“From what I recall, it was that it was too circumstantial. … I stand by the story. Everything I wrote about was documented, or I talked to people. There’s no question on the information. … Maybe he wanted more of a smoking gun than I gave him, but the theme of the story is this is what happens to a good reporter, this is the way they do it. … The story isn’t that Michael Dell showed up in the office and walked up and pointed at him and said, ‘Off with his head.’ … It’s very subtle. There are phone calls made, there’s a paper trail created … a period of ostracization.”
However, since Sweeney’s story, Advance has taken even more steps against Calnan. This past March, Calnan received a letter from Advance’s attorney, threatening legal action on two claims that Calnan and his attorney easily refuted.
The first was an allegation that Calnan was illegally accessing one of ABJ’s Twitter accounts to post his own content. “Please be advised that any future attempt by you to improperly use an ACBJ social media account may result in ACBJ pursuing all available legal remedies against you, including the civil and criminal remedies available under the Computer Fraud and Abuse Act,” the lawyers wrote.
However, the account in question, @ABJTech, was formerly Calnan’s own, @abjcalnan. After his termination, the company took over the account and changed the name. The articles shared on @ABJTech were posted via bit.ly because Calnan’s usurped Twitter account and his bit.ly account were still linked.
The second claim was that Calnan was infringing on copyright by creating a website called Business Journal Network with the domain name austinbizjournal.com, names confusingly similar to that of American City Business Journals. One article posted on Calnan’s own website had allegedly appeared on that site, but Calnan said he had never heard of it and says he still has not visited the page. The site at austinbizjournal.com no longer exists.
When Calnan and his attorney responded to Advance, refuting its claims, they received no reply.
Neither Advance nor ABJ responded to FAIR’s requests for comment.
Steve Gilmore, a member of Dell’s global communications team, responded to FAIR via email: “Our only comment on this matter is that any suggestion of Dell involvement in ABJ’s HR matters, at any level, is simply baseless.”
However, as Sweeney said, Michael Dell need not walk into newsrooms and demand reporters be fired to have an effect on personnel matters. Calnan said he had heard that his pieces prompted Dell to threaten legal action.
“One of my editors told me that the company threatened them with a lawsuit,” Calnan said.
But why would a subsidiary of Advance, a multi-billion-dollar conglomerate, fear a libel suit over fact-based stories about a public figure who would face an extremely high burden of proof to win? Calnan said he was never able to find out, but he knows Dell is a big advertiser for many outlets.
In Austin, Dell has just about as much influence as money can buy. Michael Dell is worth $35 billion, and his philanthropy has placed his name on the Dell Children’s Medical Center, Dell Seton Medical Center, Dell Diamond baseball field, Dell Medical School at the University of Texas, Michael and Susan Dell Hall and more.
So when it comes to unfavorable press, Sweeney said, “He just has the ability, apparently, to call the top executives and put out any kind of brush fire.”
Post-Calnan coverage of Dell in the Austin Business Journal (7/19/17)
Sweeney also said tech outlets like ABJ often work as mouthpieces for powerful companies, rather than upholding the journalistic ethic of being critical of power.
“Something like these business journals are designed to be the booster press,” Sweeney said.
After Calnan left ABJ, the company certainly made efforts to boost Dell. Calnan’s successor, Mike Cronin, penned an article (7/19/17) about Dell’s clever product placement in the Spider-Man: Homecoming film.
“It was uncritical,” Sweeney said.
Sweeney told FAIR that corporate headquarters are gaining more power over the content their outlets publish.
“I want readers to realize that their news is being sanitized,” Calnan said. “It’s being company-approved.”

The Demobilization of Americans Is a Surreal Tragedy
As I turn 75, there’s no simpler way to put it than this: I’m an old man on a new planet — and, in case it isn’t instantly obvious, that’s not good news on either score.
I still have a memory of being a camp counselor in upstate New York more than half a century ago. I was perhaps 20 years old and in charge of a cabin of — if I remember rightly — nine-year-old campers. In other words, young as they were, they were barely less than half my age. And here’s what I remember most vividly: when asked how old they thought I was, they guessed anything from 30 to 60 or beyond. I found it amusing largely because, I suspect, I couldn’t faintly imagine being 60 years old myself. (My grandmother was then in her late sixties.) My present age would have been off the charts not just for those nine-year-olds, but for me, too. At that point, I doubt I even knew anyone as old as I am now.
Yet here I am, so many decades later, with grandchildren of my own. And I find myself looking at a world that, had you described it to me in the worst moments of the Vietnam War years when I was regularly in the streets protesting, I would never have believed possible. I probably would have thought you stark raving mad. Here I am in an America not just with all the weirdness of Donald Trump, but with a media that feeds on his every bizarre word, tweet, and act as if nothing else were happening on the face of the Earth. If only.
A Demobilizing World
In those Vietnam years, when a remarkable range of people (even inside the military) were involved in antiwar protests, if you had told me that, in the next century, we would be fighting unending wars from Afghanistan to Somalia and beyond, I would have been shocked. If you had added that, though even veterans of those wars largely believe they shouldn’t have been fought, just about no one would be out in the streets protesting, I would have thought you were nuts. Post-Vietnam, how was such a thing possible?
If you had told me that, in those years to come, the American military would be an “all-volunteer” one, essentially a kind of foreign legion, and that those who chose not to be part of it would endlessly “thank” the volunteers for their service while otherwise continuing their lives as if nothing were going on, I wouldn’t have believed you. If you had also pointed out that economic inequality in America would reach levels that might have staggered denizens of the Gilded Age, that three Americans would possess the same wealth as the bottom half of society, that a CEO would, on average, make at least 361 times the income of a worker, and that for years there would be no genuine protest around any of this, I would have considered it un-American.
If, in those same years, you had assured me that, in our future, thanks to a crucial Supreme Court decision, so much of the money that had gushed up to the wealthiest 1%, or even .01%, of Americans would be funneled back, big time, into what still passed for American democracy, I would have been stunned. That a 1% version of politics would essentially pave the way for a billionaire to enter the White House, and that, until the arrival of Bernie Sanders in 2016, protest over all this would barely be discernible, I certainly wouldn’t have believed you.
In sum, I would have been amazed at the way, whatever the subject, Americans had essentially been demobilized (or perhaps demobilized themselves) in the twenty-first century, somehow convinced that there was nothing to be done that would change anything. There was no antiwar movement in the streets, unions had been largely defanged, and even the supposed “fascist” in the White House would have no interest in launching a true movement of his own. If anything, his much-discussed “base” would actually be a set of “fans” wearing red MAGA hats and waiting to fill stadiums for the Trump Show, the same way you’d wait for a program to come on TV.
And none of this would have staggered me faintly as much as one thing I haven’t even mentioned yet. Had I been told then that, by this century, there would be a striking scientific consensus on how the burning of fossil fuels was heating and changing the planet, almost certainly creating the basis for a future civilizational crisis, what would I have expected? Had I been told that I lived in the country historically most responsible for putting those carbon emissions into the atmosphere and warming the planet egregiously, how would I have reacted? Had I been informed that, facing a crisis of an order never before imagined (except perhaps in religious apocalyptic thinking), humanity would largely demobilize itself, what would I have said? Had I learned then that, in response to this looming crisis, Americans would elect as president a man who denied that global warming was even occurring, a man who was, if anything, focused on increasing its future intensity, what in the world would I have thought? Or how would I have reacted if you had told me that from Brazil to Poland, the Philippines to England, people across the planet were choosing their own Donald Trumps to lead them into that world in crisis?
Where’s the Manhattan Project for Climate Change?
Here, let me leap the almost half-century from that younger self to the aging creature that’s me today and point out that you don’t have to be a scientist anymore to grasp the nature of the new planet we’re on. Here, for instance, is just part of what I — no scientist at all — noticed in the news in the last few weeks. The planet experienced its hottest June on record. The temperature in Anchorage, Alaska, hit 90 degrees for the first time in history, mimicking Miami, Florida, which was itself experiencing record highs. (Consider this a footnote, but in March, Alaska had, on average, temperatures 20 degrees warmer than usual.) According to figures compiled by the National Oceanic and Atmospheric Administration (NOAA), not just that state but every state in the union has been steadily warming, compared to twentieth-century averages, with Rhode Island leading the way. Europe also just experienced a fierce heat wave — they’re coming ever more often — in which one town in southern France hit a record 115 degrees. India’s sixth-largest city, under its own heat emergency, essentially ran out of water. The sea ice in Antarctica has experienced a “precipitous” fall in recent years that shocked scientists, while a glacier the size of Florida there seems to be destabilizing (bad news for the future rise of global sea levels). As a NOAA study showed, thanks to sea-level rise, flooding in coastal American cities like Charleston, South Carolina, is happening ever more often, even on perfectly sunny days. Meanwhile, the intensity of rainfall in storms is increasing, as in the one that dumped a month’s worth of water on Washington, D.C., one recent Monday morning, turning “streets into rivers and basements into wading pools” and even dampening the basement of the White House — and such storms are growing more frequent. Oh yes, and the world’s five hottest years on record have all occurred since 2014, with 2019 more or less a surefire add-on to that list on a planet on which the last 406 consecutive months have been warmer than the twentieth-century average. (In the 31 days of January 2019 alone, that same planet had already set 35 records for heat and only two for cold.) And that’s just to start down a longer list of news about climate change or global warming or, as the Guardian has taken to calling it recently, the “climate emergency” or “climate breakdown.”
In response to such a world, sometimes — an exaggeration but not too much of one — it seems as if only the children, mainly high-school students inspired by a remarkable 16-year-old Swedish girl with Asperger syndrome, have truly been mobilizing. With their Friday school strikes, they are at least trying to face the oncoming crisis that is increasingly our world. In a way the adults of that same world generally don’t, they seem to grasp that, by not mobilizing to deal with climate change, we are potentially robbing them of their future.
In that sense, of course, I have no future, which is just the normal way of things. Our lives all end and, at 75, I (kind of) understand that I’m ever closer to stepping off this planet of ours. The question for me is what kind of a planet I’ll be leaving behind for those very children (and their future children). I understand, too, that when it comes to climate change, we face the wealthiest, most powerful industry on the planet, the fossil-fuel giants whose CEOs, in their urge to keep the oil, coal, and natural gas flowing forever and a day, will assuredly prove to be the greatest criminals and arsonists in a history that doesn’t lack for great crimes — and that’s no small thing. (In those never-ending wars of ours, of course, we Americans face some of the next most powerful corporate entities on the planet and the money and 1% politics that go with them.)
Still, I can’t help but wonder: Was the Paris climate accord really the best the planet could do (even before Donald Trump announced that the U.S. would pull out of it)? I mean, at 75, I think to myself: Where, when it comes to climate change, is an updated version of the Manhattan Project, the massive government research effort that produced (god save us) the atomic bomb? Or the Cold War version of the same that so effectively got Americans onto the moon and back? It was possible to mobilize at a massive level then; why not now? Imagine what might be done in terms of renewable energy or global projects to mitigate climate change if the governments of Planet Earth were somehow to truly face the greatest crisis ever to hit human life.
Imagine being the Chinese government and knowing that, by 2100, parts of one of your most populous regions, the North China Plain, will likely be too hot to be habitable. Grasping that, wouldn’t you start to mobilize your resources in a new way to save your own people’s future rather than building yet more coal-fired power plants and exporting hundreds of them abroad as well? Honestly, from Washington to Beijing, New Delhi to London, the efforts — even the best of them — couldn’t be more pathetic given what’s at stake.
The children are right. We’re effectively robbing them of their future. It’s a shame and a crime. It’s what no parents or grandparents should ever do to their progeny. We know that, as in World War II, mobilization on a grand scale is possible. The United States proved that in 1941 and thereafter.
Perhaps, like most war mobilizations, that worked so effectively because it had a tribal component to it, being against other human beings. We have little enough experience mobilizing not against but with other human beings to face a danger that threatens us all. And yet, in a sense, doesn’t climate change represent another kind of “world war” situation, though it’s not yet thought of that way?
So why, I continue to wonder, in such a moment of true crisis are we still largely living on such a demobilized planet? Why is it increasingly a Trumpian planet of the surreal, not a planet of the all-too-real?

Greta Thunberg Holds Up D-Day Veteran’s Call to Avert Collapse of Civilization
Climate activist Greta Thunberg on Sunday urged people to recognize “the link between climate and ecological emergency and mass migration, famine, and war” as she was given the first “Freedom Prize” from France’s Normandy region for her ongoing school strikes for climate and her role in catalyzing the Fridays for Future climate movement.
The 16-year-old received the award before a crowd of roughly 2,000 people in the city of Caen. She shared the stage with D-Day veterans and prize sponsors Léon Gautier of France and Charles Norman Shay of the U.S.
“I think the least we can do to honor them,” said Thunberg, “is to stop destroying that same world that Charles, Leon, and their friends and colleagues fought so hard to save.”
Thunberg spent the day before the award ceremony with Shay:
I had the honour of spending the day with Charles Norman Shay at Omaha Beach. Charles was in the first wave that spearheaded D-Day. He is also Native American and member of the Penobscot tribe of Maine. He’s a hero in a way that is almost impossible to understand. #prixliberte pic.twitter.com/7KnrowF96d
— Greta Thunberg (@GretaThunberg) July 20, 2019
On Twitter, Thunberg also highlighted some of Shay’s remarks during Sunday’s ceremony, calling them “the most powerful words on the climate and ecological emergency I’ve ever heard.”
“All these many damages on Mother Nature make me sad,” said Shay. “As a soldier I fought for freedom to liberate Europe [and the] world [from] Nazism 75 years ago, but this [makes] no sense if Mother Nature is deeply wounded, and if our civilization collapses due to inappropriate human behaviors.”
These are the most powerful words on the climate and ecological emergency I’ve ever heard.
Listen to Charles Norman Shay, a Native American D-Day veteran who was in the first wave and landed on Omaha Beach 06.30h June 6th 1944.#PrixLiberte #ClimateBreakdown #EcologicalBreakdown pic.twitter.com/M28dxeWvu0
— Greta Thunberg (@GretaThunberg) July 21, 2019
Agence France-Presse reported on Thunberg’s remarks at the ceremony:
“This is a silent war going on. We are currently on track for a world that could displace billions of people from their homes, taking away even the most basic living conditions from countless people, making areas of the world uninhabitable for parts of the year.
“The fact that this will create huge conflicts and unspoken suffering is far from secret.”
“And yet the link between climate and ecological emergency and mass migration, famine, and war is still not clear to many people. This must change.”
Thunberg beat out two other finalists, Saudi blogger and dissident Raif Badawi and Chinese photojournalist Lu Guang, to become the winner.
The Freedom Prize website offers this background of the new award:
Focused on the meaning and values of the Allied landings, the Freedom Prize gives young people all over the world the opportunity to choose an exemplary person or organization, committed to the fight for freedom. Just like those who risked their lives when they landed on the Normandy beaches on 6 June 1944.
On 6 June 1944, it was in the name of the ideal of freedom that 130,000 soldiers, including a significant number of young volunteers, risked their lives and that several thousand died on these unfamiliar beaches. 17 nations were involved in Operation Overlord to open up “Liberty Road” on which nearly 3 million soldiers traveled in their bid to save the world from the barbarity of the Nazis. The Allied landings remind us that freedom is a universal demand.
Today, many situations around the world testify to its fragility. The Freedom Prize pays homage to all those who fought and continue to fight for this ideal.
Thunberg, responding to a recent question from one of the readers of the U.K.’s Observer, made clear that her commitment to the fight for urgent climate action is unwavering.
“We must never give up,” she said. “I have made up my mind and decided to never, ever give up.”

Scapegoating Iran
A little more than a year ago, Chris Hedges interviewed the Iranian ambassador to the United Nations and wrote about the situation between the United States and Iran. Now, amid even greater tension and threats of war between the two nations, Truthdig reposts that June 10, 2018, column. Hedges is on vacation and will return with a new article Aug. 12.
NEW YORK—Seventeen years of war in the Middle East and what do we have to show for it? Iraq after our 2003 invasion and occupation is no longer a unified country. Its once modern infrastructure is largely destroyed, and the nation has fractured into warring enclaves. We have lost the war in Afghanistan. The Taliban is resurgent and has a presence in over 70 percent of the country. Libya is a failed state. Yemen after three years of relentless airstrikes and a blockade is enduring one of the world’s worst humanitarian disasters. The 500 “moderate” rebels we funded and armed in Syria at a cost of $500 million are in retreat after instigating a lawless reign of terror. The military adventurism has cost a staggering $5.6 trillion as our infrastructure crumbles, austerity guts basic services and half the population of the United States lives at or near poverty levels. The endless wars in the Middle East are the biggest strategic blunder in American history and herald the death of the empire.
Someone has to be blamed for debacles that have resulted in hundreds of thousands of dead, including at least 200,000 civilians, and millions driven from their homes. Someone has to be blamed for the proliferation of radical jihadist groups throughout the Middle East, the continued worldwide terrorist attacks, the wholesale destruction of cities and towns under relentless airstrikes and the abject failure of U.S. and U.S.-backed forces to stanch the insurgencies. You can be sure it won’t be the generals, the politicians such as George W. Bush, Barack Obama and Hillary Clinton, the rabid neocons such as Dick Cheney, Paul Wolfowitz and John Bolton who sold us the wars, the Central Intelligence Agency, the arms contractors who profit from perpetual war or the celebrity pundits on the airwaves and in newspapers who serve as cheerleaders for the mayhem.
“The failed policies, or lack of policies, of the United States, which violate international law, have left the Middle East in total chaos,” the Iranian ambassador to the United Nations, Gholamali Khoshroo, told me when we met in New York City. “The United States, to cover up these aggressive, reckless and costly policies, blames Iran. Iran is blamed for their failures in Yemen, Iraq, Afghanistan, Syria and Lebanon.”
The Trump administration “is very naive about the Middle East and Iran,” the ambassador said. “It can only speak in the language of threats—pressure, sanctions, intervention. These policies have failed in the region. They are very risky and costly. Let the Americans deal with the problems of the countries they have already invaded and attacked. America lacks constructive power in the Middle East. It is unable to govern even a village in Iraq, Afghanistan, Yemen or Syria. All it can do is use force and destructive power. This U.S. administration wants the Middle East and the whole world to bow to it. This is not a policy conducive to sound relationships with sovereign states, especially those countries that have resisted American influence.”
“The plan to arm ‘moderate’ rebels in Syria was a cover to topple [Syrian President] Bashar al-Assad,” the ambassador went on. “The Americans knew there were no ‘moderate’ rebels. They knew these weapons would get into the hands of terrorist groups like Daesh [Islamic State], Al-Nusra and their affiliates. Once again, the American policy failed. The Americans succeeded in destroying a country. They succeeded in creating bloodbaths. They succeeded in displacing millions of people. But they gained nothing. The sovereignty of Syria is expanding by the day. It is hard to imagine what President Trump is offering as a strategy in Syria. One day, he says, ‘I will move out of Syria very soon, very quickly.’ The next day he says, ‘If Iran is there, we should stay.’ I wonder if the American taxpayers know how much of their money has been wasted in Iraq, Syria and Yemen.”
Donald Trump’s unilateral decision to withdraw from the Iran nuclear deal, although Iran was in compliance with the agreement, was the first salvo in this effort to divert attention from these failures to Iran. Bolton, the new national security adviser, and Secretary of State Mike Pompeo, along with Trump lawyer Rudy Giuliani, advocate the overthrow of the Iranian government, with Giuliani saying last month that Trump is “as committed to regime change as we [an inner circle of presidential advisers] are.”
“The Iran nuclear deal was possible following several letters by President Barack Obama assuring the Iranian leadership that America had no intention of violating Iranian sovereignty,” Ambassador Khoshroo said. “America said it wanted to engage in a serious dialogue on equal footing and mutual interests and concerns. These assurances led to the negotiations that concluded with the JCPOA [Joint Comprehensive Plan of Action]. From the beginning, however, America was not forthcoming in its dealings with us on the JCPOA. President Obama wanted the agreement to be implemented, but he did not want it implemented in its full capacity. Congress, on the day JCPOA was implemented, passed a law warning Europeans that were doing business with Iran. The staffs of companies had to apply for a visa to the United States if they had traveled to Iran for business purposes. This began on the first day. The Americans were not always very forthcoming. OFAC [Office of Foreign Assets Control] gave ambiguous answers to many of the questions that companies had about sanctions, but at least in words the Obama administration supported the JCPOA and saw the agreement as the basis for our interactions.”
“President Trump, however, even as a candidate, called the agreement ‘the worst deal America ever made,’ ” the ambassador said. “He called this deal a source of embarrassment for America. Indeed, it was not the deal but America’s unilateral decision to walk away from an agreement that was supported by the United Nations Security Council, and in fact co-sponsored and drafted by the United States, that is the source of embarrassment for America. To walk away from an international agreement and then threaten a sovereign country is the real source of embarrassment since Iran was in full compliance while the U.S. never was.”
“In 2008, the Israelis told the world that Iran was only some days away from acquiring an atomic bomb,” he said. “The Israelis said there had to be a military strike to prevent Iran from acquiring a nuclear weapon. What has happened since? During the last two years, there have been 11 reports by the IAEA [International Atomic Energy Agency] clearly confirming and demonstrating Iran’s full compliance with the JCPOA. All of the accusations [about] Iran using nuclear facilities for military purposes were refuted by the IAEA as well as by Europe, Russia, China, along with many other countries in Asia, Latin America, Africa. America is concerned about Iranian influence in the region and seeks to contain Iran because the U.S. administration realizes that America’s policies in the Middle East have failed. Their own statements about Iran repeatedly contradict each other. One day they say, ‘Iran is so weak it will collapse,’ and the next day they say, ‘Iran is governing several Arab capitals in the Middle East.’ ”
Iran announced recently that it has tentative plans to produce the feedstock for centrifuges, the machines that enrich uranium, if the nuclear deal is not salvaged by European members of the JCPOA. European countries, dismayed by Trump’s decision to withdraw from the agreement, are attempting to renegotiate the deal, which imposes restrictions on Iran’s nuclear development in exchange for the lifting of international sanctions.
Why go to war with a country that abides by an agreement it has signed with the United States? Why attack a government that is the mortal enemy of the Taliban, along with other jihadist groups, including al-Qaida and Islamic State, that now threaten us after we created and armed them? Why shatter the de facto alliance we have with Iran in Iraq and Afghanistan? Why further destabilize a region already dangerously volatile?
The architects of these wars are in trouble. They have watched helplessly as the instability and political vacuum they caused, especially in Iraq, left Iran as the dominant power in the region. Washington, in essence, elevated its nemesis. It has no idea how to reverse its mistake, beyond attacking Iran. Those both in the U.S. and abroad who began or promoted these wars see a conflict with Iran as a solution to their foreign and increasingly domestic dilemmas.
For example, Israeli Prime Minister Benjamin Netanyahu, mired in corruption scandals, hopes that by fostering a conflict with Iran he can divert attention away from investigations into his abuse of power and the massacres Israel carries out against Palestinians, along with Israel’s accelerated seizure of Palestinian land.
“The most brutal regime is now in power in Israel,” the Iranian ambassador said. “It has no regard for international law or humanitarian law. It violates Security Council resolutions regarding settlements, its capital and occupation. Look at what Israel has done in Gaza in the last 30 days. On the same day America was unlawfully transferring its embassy to Jerusalem, 60 unarmed Palestinian protesters were killed by Israeli snipers. [Israelis] were dancing in Jerusalem while the blood of unarmed Palestinians was running in Gaza. The Trump administration gives total support and impunity to Israel. This angers many people in the Middle East, including many in Saudi Arabia. It is a Zionist project to portray Iran as the main threat to peace in the Middle East. Israel introducing Iran as a threat is an attempt to divert attention from the crimes this regime is committing, but these too are failed policies that will backfire. They are policies designed to cover weakness.”
Saudi Crown Prince Mohammed bin Salman, facing internal unrest, launched the war in Yemen as a vanity project to bolster his credentials as a military leader. Now he desperately needs to deflect attention from the quagmire and humanitarian disaster he created.
“Saudi Arabia, as part of [the civil war in Yemen], has a tactical and strategic cooperation with Israel against Iran,” the ambassador said. “But the Saudi regime is defying the sentiments of its own people. How long will this be possible? For three years now, Saudi Arabia, assisted by the United States, has bombed the Yemeni people and imposed a total blockade that includes food and medicine. Nothing has been resolved. Once again, Iran is blamed for this failure by Saudi Arabia and the United States in Yemen. Even if Iran wanted to help the Yemenis, it is not possible due to the total blockade. The Yemeni people asked for peace negotiations from the first day of the war. But Saudi military adventurism and its desire to test its military resolve made any peaceful solution impossible. The U.S. and the U.K. provide military and logistical support, including cluster bombs to be used by the Saudis in Yemen. The Emiratis are bombing Yemen. All such actions are doomed to failure since there is no military solution in Yemen. There is only a political solution. Look at the targets of Saudi airstrikes in Yemen: funerals. Wedding ceremonies. Agricultural fields. Houses. Civilians. How do the Saudis expect the Yemeni people to greet those who bomb them? With hugs? The war has cost a lot of money, and Trump responds by saying [to Saudi Arabia], ‘Oh you have money. [Paraphrasing here.] Please buy our beautiful weapons.’ They are killing beautiful children with these ‘beautiful’ weapons. It is a disaster. It is tragic.”
And then there is President Trump, desperate for a global crusade he can use to mask his ineptitude, the rampant corruption of his administration and his status as an international pariah when he runs for re-election in 2020.
“Of course, blaming and threatening Iran is not new,” the ambassador said. “This has been going on for 40 years. The Iranian people and the Iranian government are accustomed to this nonsense. United States intervention in the internal affairs of Iran goes back a long time, including the [Iranian] war with Iraq, when the United States supported Saddam Hussein. Then America invaded Iraq in 2003 in their so-called ‘intervention for democracy and elimination of WMDs.’ Iran has always resisted and will always resist U.S. threats.”
“America was in Iran 40 years ago,” the ambassador said. “About 100,000 U.S. advisers were in Iran during the rule of the shah, who was among the closest allies of America. America was unable to keep this regime in power because the Iranian people revolted against such dependency and suppression. Since the fall of the shah in 1979, for 40 years, America continued to violate international law, especially the Algeria agreements it signed with Iran in 1981.”
The Algeria Declaration was a set of agreements between the United States and Iran that resolved the Iranian hostage crisis. It was brokered by the Algerian government. The U.S. committed itself in the Algeria Declaration to refrain from interference in Iranian internal affairs and to lift trade sanctions on Iran and a freeze on Iranian assets.
The warmongers have no more of a plan for “regime change” in Iran than they had in Afghanistan, Iraq, Libya or Syria. European allies, whom Trump alienated when he walked away from the Iranian nuclear agreement, are in no mood to cooperate with Washington. The Pentagon, even if it wanted to, does not have the hundreds of thousands of troops it would need to attack and occupy Iran. And the idea—pushed by lunatic-fringe figures like Bolton and Giuliani—that the marginal and discredited Iranian resistance group Mujahedeen-e-Khalq (MEK), which fought alongside Saddam Hussein in the war against Iran and is viewed by most Iranians as composed of traitors, is a viable counterforce to the Iranian government is ludicrous. In all these equations the 80 million people in Iran are ignored just as the people of Afghanistan, Iraq, Libya and Syria were ignored. Perhaps they would not welcome a war with the United States. Perhaps if attacked they would resist. Perhaps they don’t want to be occupied. Perhaps a war with Iran would be interpreted throughout the region as a war against Shiism. But these are calculations that the ideologues, who know little about the instrument of war and even less about the cultures or peoples they seek to dominate, are unable to fathom.
“The Middle East has many problems: insecurity, instability, problems with natural resources such as water, etc.,” Khoshroo said. “All of these problems have been made worse by foreign intervention as well as Israel’s lawlessness. The issue of Palestine is at the heart of turmoil in the Middle East for Muslims. Any delay in finding solutions to these wounds in the Middle East exposes this region to more dangerous threats. Americans say they want the Middle East to be free from violent extremism, but this will only happen when the Middle East is free from occupation and foreign intervention. The Americans are selling their weapons throughout the Middle East. They calculate how much money they can earn from destruction. They don’t care about human beings. They don’t care about security or democratic process or political process. This is worrisome.”
“What are the results of American policies in the Middle East?” he asked. “All of the American allies in the region are in turmoil. Only Iran is secure and stable. Why is this the case? Why, during the last 40 years, has Iran been stable? Is it because Iran has no relationship with America? Why is there hostility between Iran and America? Can’t the Americans see that Iran’s stability is important for the region? We are surrounded by Pakistan, Afghanistan, Iraq, Syria, Yemen. What good would come from destabilizing Iran? What would America get out of that?”

July 21, 2019
Trump Is All But Provoking al-Qaeda With His Latest Move
Trump is sending US troops to be based in Saudi Arabia, and King Salman has signed off on the paperwork.
Several hundred US military personnel will be at Prince Sultan Air Base, accompanying fighter jets and Patriot missile defense systems.
US troops were first based in Saudi Arabia during the Gulf War of 1990-91, when George H. W. Bush assembled a coalition to force Iraq back out of Kuwait, which it had brutally occupied and tried to annex in August 1990.
As Courtney Kube at NBC notes, in 1996 a terrorist bombing at Khobar Towers killed 19 US Air Force personnel and wounded 400 others, and as a result US servicemen were thereafter kept at Prince Sultan Air Base, which the US leased from the Saudis. The then no-fly zone over Iraq was policed from Prince Sultan.
Apparently everyone has forgotten that al-Qaeda made hay over the US military presence in Saudi Arabia. That country includes the twin holy cities of Mecca and Medina, and traditionalist Muslims believe that the second commander of the faithful, `Umar ibn al-Khattab, forbade non-Muslims to live in the holy land.
Usama Bin Laden characterized the US military presence in Saudi Arabia, the monarch of which is the “Guardian of the Two Holy Shrines,” as a military occupation by foreign Christians and Jews (“crusaders and Zionists”) of the Muslim holy land. Al-Qaeda used this allegation as a recruiting tool, and Bin Laden gave it as one of three major motivations for the September 11, 2001, attacks (the others were the sanctions on Iraq that killed thousands of children and the Israeli occupation of Jerusalem).
After the 2003 US invasion and occupation of Iraq, the US had lots of bases there and felt no further need for Prince Sultan Air Base, from which it withdrew. Qatar also offered its facilities, and the US put 10,000 troops into al-Udeid Air Base.
The impetus for the return of US troops to Prince Sultan Air Base in Saudi Arabia is the renewed tension with Iran, which Trump caused by breaching the 2015 nuclear deal with Tehran.
Saudi Arabia’s King Salman may also have lobbied for this move because Riyadh put Qatar under blockade, and feels at a disadvantage in this struggle because the US base al-Udeid makes Washington tilt toward Doha. But the US should be telling King Salman to make nice with Qatar; it shouldn’t be enabling his blockade by coddling Riyadh.
But it is a very, very bad idea.
First, al-Qaeda and kindred movements such as ISIL have not gone away. They will use this move very effectively for propaganda and for radicalization.
Second, Saudi Arabia in 2015 launched a massive and debilitating air war on little Yemen, in which Riyadh is still embroiled. The Houthis who control northern Yemen have managed to get hold of Silkworm missiles and have occasionally struck well inside Saudi Arabia. The US is inserting itself further into this struggle, and Prince Sultan base could be targeted by the Houthis, though their technology and resources are limited.
Third, given the murder of dissident journalist Jamal Khashoggi, it is very bad optics for the US to be opening a base in Saudi Arabia. It looks like an endorsement of the assassination of dissidents.
Fourth, the bigger the US footprint in the region, the more exposed US personnel are to attacks, and hawks like Secretary of State Mike Pompeo and national security adviser John Bolton will manipulate any such attacks into a reason for the US to go to war with Iran.

Hawaii’s Mauna Kea Protectors Fight for ‘Our Right to Exist’
Thirteen thousand feet above the planet’s surface, forces of the sacred and the secular are locked in an epic struggle. At stake are the rights of an indigenous people and the fate of a crown jewel of world science.
The battlefield is the summit of Mauna Kea, a long-dormant volcanic mountain on Hawaii’s Big Island. Native Hawaiians are putting their bodies on the line to stop construction there of the Thirty Meter Telescope (TMT), a massive instrument that would give astronomers unprecedented access to the mysteries of the cosmos.
Many Hawaiians consider Mauna Kea to be sacred ground. They believe that building the world’s largest visible-light telescope atop the 13,803-foot mountain would be a desecration, as well as illegal under state law. Astronomers see the site as ideal for observation of the heavens: Hawaii is the most isolated population center on earth, situated in the mid-Pacific Ocean 2,300 miles from any other significant land mass. The skies are clear, the air is as clean as anywhere in the world and the Big Island’s dark sky law keeps light pollution to a minimum.
The battle to defeat the TMT began as soon as scientists picked the Mauna Kea site in 2009. Past protests have sometimes attracted only a small band of native Hawaiian elders. But, fueled by a cultural renaissance that has helped Hawaiians better understand and cherish their heritage, the movement has grown, blunting the ambitions of an international scientific consortium.
The confrontation escalated last week as the state gave TMT construction a green light. Protesters chained themselves to a cattle grate to block the access road. Thirty-four people, most of them elders, were arrested. Hawaii Gov. David Ige (EE’-gay) declared a state of emergency and called in the National Guard. The controversial move had the effect of galvanizing demonstrators statewide.
By week’s end, a thousand people were rallying at the base of the mountain. Other protests sprang up on the neighboring islands of Oahu, Maui and Kauai. And 3,000 miles away in Utah, more than 150 people gathered in solidarity with the Mauna Kea protectors at a site ceded to Hawaiians by the Mormon church.

The trail to the Mauna Kea summit (Max Pixel / Creative Commons)
The protesters also found some unexpected allies: In an open letter, 200 astronomers and astronomy students affiliated with TMT partner institutions denounced the criminalization of the protectors of Mauna Kea.
By Friday, Ige was sounding conciliatory, but the standoff continued. A TMT spokesman said Sunday the consortium had no plans to back out of the $1.4 billion project.
The conflict has even touched the 2020 presidential campaign. Democratic contender and Hawaii Congresswoman Tulsi Gabbard called on Ige to cancel the emergency order and delay construction.
Jennifer Sinco Kelleher of The Associated Press in Honolulu writes that Mauna Kea, the world’s tallest volcanic mountain, is symbolic of a wider, historical struggle for the rights of native peoples.
[T]he long-running telescope fight encapsulates critical issues to Native Hawaiians: the 1893 overthrow of the Hawaiian kingdom, clashes over land and water rights, frustration over tourism, attempts to curb development and questions about how the islands should be governed. It’s an example of battles by Native Americans to preserve ancestral lands, with high-profile protests like [the] Dakota Access pipeline leading to arrests in southern North Dakota in 2016 and 2017.
“The TMT and Mauna Kea is just the focal point,” said Hinaleimoana Wong-Kalu, a teacher and cultural practitioner. “For me it’s just a galvanizing element. … It goes back to the role that foreigners played and continue to play in Hawaii.”
Wong-Kalu said the exploitation of Hawaiian culture dates all the way back to the 1700s and Capt. James Cook. “They capitalize and commercialize our culture. … They prostitute the elements that make us Hawaiian. They make it look pretty and make it look alluring in an effort to bring more money into this state.”
“This is about our right to exist,” said Kaho’okahi Kanuha, a protest leader. “We fight and resist and we stand, or we disappear forever.”
Glen Kila of the Marae Ha’a Koa cultural center called Mauna Kea a living, life-giving entity.
“So that’s a different philosophy from the scientific world, that it’s just a mountain that can be used for an observatory. It can be developed,” Kila said. “For us, that’s sacrilegious.”

July 20, 2019
American History for Truthdiggers: Bill Clinton, the ‘New Democrat’
Editor’s note: The past is prologue. The stories we tell about ourselves and our forebears inform the sort of country we think we are and help determine public policy. As our current president promises to “make America great again,” this moment is an appropriate time to reconsider our past, look back at various eras of United States history and re-evaluate America’s origins. When, exactly, were we “great”?
Below is the 35th installment of the “American History for Truthdiggers” series, a pull-no-punches appraisal of our shared, if flawed, past. The author of the series, Danny Sjursen, who retired recently as a major in the U.S. Army, served military tours in Iraq and Afghanistan and taught the nation’s checkered, often inspiring past when he was an assistant professor of history at West Point. His war experiences, his scholarship, his skill as a writer and his patriotism illuminate these Truthdig posts.
Part 35 of “American History for Truthdiggers.”
See: Part 1; Part 2; Part 3; Part 4; Part 5; Part 6; Part 7; Part 8; Part 9; Part 10; Part 11; Part 12; Part 13; Part 14; Part 15; Part 16; Part 17; Part 18; Part 19; Part 20; Part 21; Part 22; Part 23; Part 24; Part 25; Part 26; Part 27; Part 28; Part 29; Part 30; Part 31; Part 32; Part 33; Part 34.
* * *
He was bright, he knew the details of domestic policy in and out, and he was a natural politician. William Jefferson Clinton, the “man from Hope,” Ark., grew up poor and rose to spectacular and unexpected heights. But he was also deeply insecure and obsessively needed to be liked, and, ultimately, it was unclear just what, if anything, the man believed in. Although Bill Clinton dreamed of being a great president, in the vein (he thought) of John F. Kennedy and Franklin D. Roosevelt, his abundant ambition was not enough to produce that result. But whatever his failures as a leader and a person, he reached voters, “felt their pain” and, on the surface at least, seemed to possess a common touch, an everyman empathy that drew multitudes to him. Having grown up among black people in Arkansas, he seemed particularly comfortable around African Americans, leading the famed novelist Toni Morrison to dub him “the first black president.” In time, she, and many of her fellow African Americans, undoubtedly came to regret those words as Clinton’s rather conservative, “New Democrat” policies proved to be disastrous for most blacks in the United States.
Clinton loved politics; he was known to talk endlessly into the night about the intricacies of policy. However, that mastery did not extend to his abilities as a boss. He had an awful temper, often lashed out at staff and seemed ill-disciplined and out of his depth early in his first term. Many staff members found him self-pitying, narcissistic and laser-focused not on values or issues but on opinion polls, his own political standing and re-election. Clinton was perhaps America’s first permanent-campaigner president, setting the tone for his copycat successors in an age of tribal political partisanship. He was good at it, too. A master of media “spin” and carefully crafted talking points, he was soon nicknamed “Slick Willie,” and not by accident.
Perhaps the depth of his character flaws was no more severe than that of presidents before him (one thinks of JFK’s scandals with medication and women), but Clinton occupied the Oval Office at a time when the media was far less likely than in earlier generations to defer to authority figures and look the other way. He was under a microscope for eight straight years, and what the public found was often disturbing. His personal foibles and dubious character traits were perhaps best summarized by Michiko Kakutani of The New York Times at the end of his second term. “In his adolescent craving to be liked …,” Kakutani wrote, “and the tacky spectacle of his womanizing, Mr. Clinton gave us a presidency that straddled the news pages and the tabloid gossip columns.” How accurate that assessment was, and how well it could be applied to the current chief executive!
The ascendant political right—in Congress, the conservative media and the pulpits of evangelical mega-churches—absolutely loathed the man, and even more so his ambitious wife, Hillary. Republicans exaggerated, cried wolf, one might say, depicting the Clintons as extreme leftists out of touch with middle (“real”) America and, indeed, even unpatriotic. They stifled the president at nearly every turn, at least when he attempted even marginally liberal policies. Yet just as often the conservatives sided with him, forming alliances of convenience, as Clinton showed his true colors, which were centrist and even right-leaning. Indeed, it would be apparent, in hindsight, that the 42nd president was the first outright neoliberal chief executive, tacking right time and again and paving the way for the rise of neoconservative Republican power.
Perhaps it should have been little surprise that Clinton never lived up to his idols, JFK and FDR, or managed to bolster the standing of the flailing Democratic Party. After all, in a three-way race, Clinton earned only 43 percent of the popular vote in winning his first presidential term in 1992. This meant that 57 percent of Americans voted for the Republican George H.W. Bush or the fiscally austere deficit hawk Ross Perot, who ran as a third-party candidate.
Clinton was truly a new breed of Democrat. More interested in winning than in liberal dogma, he read the tea leaves of the “Reagan Revolution” and decided to undercut his conservative opponents by taking right-facing positions. Did he really believe that such once-Republican policies were in the nation’s best interest? Or did he just do what was necessary to win? The question will undoubtedly be argued forever. What’s certain is that Clinton did not occupy the liberal ground once traveled by George McGovern, or even Jimmy Carter. He was a corporate Democrat, one who deftly convinced all parties in the waning Democratic coalition that he was “on their side,” while often selling them out soon afterward. This went for blacks, Hispanics, gays, union workers and the very poor. Clinton even spoke differently depending on his audience, managing, despite his inconsistent policies, to win the adoration of many of those he would abandon once in the White House.
Whereas Republicans were once seen as the “big money” party, Bill Clinton carried 13 of the 17 most affluent congressional districts in the 1996 election. Was liberalism dead? Maybe. It was, no doubt, at least in remission. The canny Clinton could stymie the Republicans, to their utter frustration, but he was more often than not the slave of the conservative majority on Capitol Hill and the (white) conservatives in an increasingly right-leaning populace. The least among the American people, theoretically the beneficiaries of the political left, did not gain from the Clinton years. The rich became richer, the poor poorer, and through a slew of neoliberal policies, Clinton managed to cede victory to the prevalent ideas of the conservative right. The consequences were often severe.
Clinton at Home: Neoliberalism Ascendant
Clinton ran as a centrist in 1992 but depicted himself as more fiscally conservative than GOP rival George H.W. Bush and promised to “end welfare as we know it.” He sometimes sounded as though he had co-opted the right’s message and its favorite talking points. Still, early in his first term, Clinton attempted two relatively liberal actions: health care reform and ending the military’s ban on gay members. He was rebuked on both counts. In the arena of health care, Republicans, more-conservative Democrats and powerful insurance industry lobbyists counterattacked, ran millions of dollars’ worth of negative ads and ultimately killed the liberal dream of universal medical insurance that had existed at least since the administration of Harry Truman. The defeat was a political embarrassment to the first lady, whom Clinton had tapped to lead the endeavor. More worrisome were the stark outcomes. Some 30 million Americans had no health care coverage at all, and U.S. infant death rates and other key health indices were at deplorable levels. Nearly all other industrialized countries had full coverage for their citizens and superior health outcomes.
When Clinton sought to end the ban on gays serving in the military, once again he ran up against a congressional coalition, this time one joined by senior military officers including Chairman of the Joint Chiefs Colin Powell. Their fierce opposition spooked the president, who agreed to a compromise policy known as “don’t ask, don’t tell.” Under a spectacularly vague statute, service members were not to reveal their sexual preference and military leaders were not to inquire. Yet an admission of homosexuality or discovery of a gay sexual act would still lead to dismissal from the military. In following years, more than 10,000 gays were discharged from the service, a disproportionate number of whom filled the paltry, but vital, ranks of linguists and foreign area experts. In the wake of the terror attacks of Sept. 11, 2001, the military would desperately miss those troops. Then, in a final abandonment of gays, Clinton signed the Defense of Marriage Act, a federal law that allowed states not to recognize out-of-state same-sex marriages and defined (from a federal perspective) the institution as a union between “one man and one woman.” The number of Democrats who supported or acquiesced to such discriminatory legislation demonstrated the prevailing and persistent social conservatism of the times.
After the debacles involving gays in the military and health care, and after Republicans swept to victory in the 1994 congressional midterm elections, Clinton went fully neoliberal, abandoning almost every traditional liberal cause. Over the complaints of unions and manufacturing workers, he would sign the North American Free Trade Agreement (NAFTA), which Bush had negotiated. Indeed, as pro-labor elements had feared, many companies would move to the cheap-labor environments of Mexico and elsewhere. Additionally, Clinton supported the racially charged crime bill that Democratic Sen. Joe Biden had shepherded through Congress. Defended with academically debunked charges—which Hillary Clinton repeated—about the existence of young “super-predators” in the inner cities, the crime bill was a disaster for the urban poor, especially blacks. The legislation delivered a blow to opponents of the death penalty (it limited appeals and decreased the time between conviction and execution), eliminated federal parole, encouraged states to take similar steps, and instituted “three strikes” programs (whereby a third felony conviction automatically led to life imprisonment), harsh mandatory minimum sentences and even new disparate sentencing guidelines for certain classes of narcotics. Specifically, possession of crystallized, or crack, cocaine (associated with users in poor, black urban areas) carried 100 times the criminal sentence of comparable possession of powder cocaine (associated with affluent white users). The results were disastrous. About a million more Americans entered prison, a majority of them black and Hispanic, and the U.S. incarceration rate soon led the entire world by far.
Clinton would also follow through on his promise to gut welfare. In the insultingly titled 1996 Personal Responsibility and Work Opportunity Reconciliation Act, Clinton ended the core poverty-reduction program of the already sparse social safety net, Aid to Families with Dependent Children (AFDC). New limits were imposed on the working or unemployed poor. Benefits were cut after two years, lifetime benefits were capped at five total years, and the impoverished without children could receive only three months’ worth of food stamps. The ultimate result was to help increase income inequality, keep wages stagnant and limit opportunities for the poor.
The president is also remembered as an economic wizard, and, indeed, the economy did improve during the Clinton years as the U.S. emerged from the early 1990s recession that had begun under Bush 41. The reasons for this were varied. First, a tech boom in Silicon Valley jump-started markets at the high end. The Dow Jones tripled from 3,600 to 11,000 during Clinton’s eight-year tenure. Then, Clinton raised, ever so slightly, the top federal tax bracket to 39.6 percent (still much lower than the Eisenhower-era rate of 90 percent and the Nixon administration rate of 70 percent) and made modest cuts to various social programs. Though the tax change was moderate, the Republicans, as a bloc, balked. Clinton’s budget and corresponding tax legislation passed by only two votes in the House, and Vice President Al Gore had to break a 50-50 tie in the Senate to win approval for the package. Not a single Republican crossed the aisle. The long-term economic results, though unclear at the time, were mostly positive. Increased revenue and slight spending cuts created a budget surplus for the first time in many years, and during Clinton’s tenure the federal debt decreased by some $500 billion. Had they not hated the president so viscerally, and irrationally, the (ostensible) deficit hawks of the GOP might have been pleased.
Still, amid low unemployment, a stock market boom and seeming prosperity, the gains of the “roaring ’90s” were highly uneven. In keeping with a trend since the early 1970s, working wages were stagnant. The vast majority of the income gains went to the super rich. That shouldn’t have surprised classical liberals. Indeed, Clinton, in pursuing a strategy for victory in an increasingly fiscally conservative America, had long sought to distance himself from “tax-and-spend” liberals. He had even caustically needled his staff with the declaration, “I hope you’re all aware we’re the Eisenhower Republicans. We stand for lower deficits and free trade and the bond market. Isn’t that great?” It was true. Not only was Clinton not the fiscal leftist that the conservatives accused him of being, but his economic policy would most certainly have been squarely Republican in nature just a generation earlier.
The statistics of the economic “boom” were disturbing to those concerned with income disparity and inequality. Though dot-com stocks soared and unemployment dropped to 4.1 percent by 2001, Americans without a college degree or without a high school diploma suffered from stagnant or falling wages in this period. Furthermore, middle-class insecurity seemed to be on the rise. The new jobs created were generally nonunion, low-wage, service-sector jobs that lacked benefits. In many families, both parents now had to have jobs or one or both of the adults had to hold more than one job just to maintain a previous standard of living. The average American worked longer hours in the 1990s yet earned the same or less in real wages than he or she had in the 1970s. This was by no means an equally shared economic “recovery.”
As the rich got ever richer, especially in the tech and financial (non-manufacturing) sectors, a cultural worship of the very rich became pervasive. This tendency has been common throughout U.S. history: American citizens—at least when compared with their more socialist and labor-centric Western European cousins—have been inclined to admire, rather than resent, the rich. New York City provided a telling case study. From 1992 to 1997, half of the increase in income went to those working in the financial sector, even though those employees accounted for just 5 percent of workers in the city. The number of city residents classified as middle class, meanwhile, steadily decreased. The trends of the Clinton years, in a very real sense, were little more than a continuation of those in the “Reaganomics” era. From 1980 to 2005, the top 1 percent of earners received over 80 percent of all income increases, doubling their share of the national wealth.
CEO wages, meanwhile, skyrocketed. In 1965, the average CEO earned 24 times as much as a standard worker. By 1999, the average CEO made more than 300 times as much as his average worker (and these executives usually were men). As for the working class and the chronically poor, they languished, earning less and, in many cases, seeing their benefits fade or vanish as Clinton strangled welfare. Between 1990 and 1998, the number of Americans who filed for bankruptcy increased by 95 percent (a high percentage of the cases due to unaffordable medical bills and related out-of-pocket expenses). And when Clinton began his second term amid a sense of economic exuberance, there were still 36.5 million Americans below the poverty line. Many in the underclass—perhaps “caste” is a more appropriate word—were black or Hispanic urban residents. National recoveries and new jobs rarely touched their inner-city neighborhoods.
Such racial disparities remained a standard facet of American life, 129 years after the Emancipation Proclamation and nearly three decades after the Civil Rights Act. Unemployment rates for Clinton-era urban minority youths were five times higher than those of white youths. In 1998, black unemployment held firm at 9.9 percent, more than twice the national average. All this income inequality was hardly unavoidable. It was a conscious choice, a product of deliberate economic policies delivered by dogmatic elite Republicans and their corporate “New Democrat” allies. Clinton, supposedly the “first black president,” sold out minorities and the very poor for a simple reason: They voted at lower rates than affluent Americans, had no disposable income to contribute to re-election campaigns and had—truth be told—few alternatives to voting Democrat. The conservatives, after all, weren’t selling any financial policies that the indigent would buy. When it came to the poor, then, Clinton, while professing to “feel their pain,” could aggravate their condition without suffering any political penalty. He and his inflexible, angry Republican opposition (and sometime allies) just about completed the Reagan Revolution.
The Hazards of Liberal Interventionism: Clinton and the World
Clinton’s first, and perhaps only, love was domestic policy. “Foreign policy is not what I came here to do,” he complained in response to the global affairs that had the nasty habit of embroiling him. The president, having been the governor of a small Southern state and never having worked at the federal level in any significant way, had little experience with the world at large. He was most definitely far less qualified for making foreign policy than his 1992 opponent, George H.W. Bush. Clinton never articulated a coherent doctrine or model for international affairs, usually reacting to world events rather than anticipating them. With the Cold War over and no clear superpower rival, Clinton, and certainly more liberal doves, hoped to reduce defense spending and reallocate the “peace dividend” to social programs or middle-class tax cuts. It was not to be. The president, spurred on by global and domestic pressure, and afraid to suffer the Democratic political disease of “looking weak,” eventually submitted to “mission creep” and a form of unipolar liberal interventionism. As a consequence, Clinton set the stage for more aggressive neoconservative successors who held bolder, more flagrant plans for American power projection.
Throughout his eight-year term, in foreign affairs Clinton always seemed to do too little, too late, or to overreach, overpromise and get bogged down in indefinite missions. More often than not he simply failed. In 1993-94, the president tried to mimic Jimmy Carter and broker Mideast peace. With help from Norwegian negotiators in Oslo, Clinton helped negotiate an end to the Palestinian uprising known as the First Intifada (1987-93), convinced the Palestine Liberation Organization’s Yasser Arafat to recognize the state of Israel, and influenced Israeli Prime Minister Yitzhak Rabin to grant a measure of (but not much) autonomy to Palestinians in the isolated, occupied West Bank and Gaza Strip. It was all supposed to be a starting point for later negotiations and a two-state solution, but this never unfolded—mainly due to Israeli intransigence and resulting Palestinian terror attacks. So, despite his place in the well-publicized photo of Rabin and Arafat shaking hands on the White House lawn in 1993, Clinton accomplished very little in the way of achieving lasting Arab-Israeli peace. Perhaps it was all over soon after the ceremony when a Jewish extremist assassinated the (relatively) moderate Rabin.
In Somalia, where the U.S. was conducting a “humanitarian” military mission ordered in the last days of the Bush administration, Clinton fell victim once again to mission creep, along with bad luck. When he agreed to widen the mission from famine relief to commando operations against warlords (there were plenty in Somalia), he greatly increased the potential for disaster. It struck on Oct. 3, 1993, when two U.S. Army Black Hawk helicopters were shot down by Somali militiamen. Eighteen American troops were killed, and video images of an American special operator’s body being dragged through the streets of Mogadishu flashed across global media. (There were few comments, of course, on the perhaps thousands of Somalis, including many civilians, who had been killed.) Clinton was torn about what to do next and ultimately hedged. In the short term, he said the right things, and he sent in extra troop support, but—like Ronald Reagan before him in Lebanon—quietly withdrew the troops soon after. A burgeoning Saudi Islamist jihadi, and one-time U.S. ally in the Soviet-Afghan war, Osama bin Laden, took notice and claimed a victory of sorts. Kill a few American troops, he surmised, and the U.S. military would turn tail and run.
In Bosnia, Clinton—fearful of another Vietnam, or another Somalia—dithered for two years while Serbs conducted a brutal ethnic cleansing campaign against poorly armed Muslims, often civilians. After a bloodbath at the U.N.-declared “safe area” of Srebrenica, and a Serbian shelling of a Sarajevo market, Clinton intervened along with NATO allies. U.S. and NATO warplanes bombed Bosnian Serb positions—maintaining high altitudes to avoid pilot casualties—killing thousands, including civilians, and forcing the Serbs to the negotiating table. An uneasy truce held and thousands of U.S. and allied troops flooded into Bosnia, but, contrary to Clinton’s assertion that the deployment would last only a year, American troops remained on the ground. Moreover, many Bosnian Muslims would resent the U.S. hesitance to come to their aid and the resultant deaths of tens of thousands of their people.
Then, in 1999, in the same Balkan region, when the Serbian army began forcible removals and killings of the Albanian Muslim majority in the province of Kosovo, the U.S. and NATO again unleashed a bombing campaign, this time not only on the Serbian army but on the capital, Belgrade. Thousands died, including hundreds of innocents, and an errant bomb hit the Chinese Embassy, killing several staff members and straining Sino-American relations. The Kosovars won a degree of autonomy, but many Americans doubted that the military intervention had been prudent. Indeed, the insurgent Kosovo Liberation Army was far from an innocent party and exacted bloody postwar retribution on many Serbs in the province. And, once again, U.S. troops became ensconced in the tiny province. Perhaps most vitally, this second, more aggressive, intervention in the Balkans alienated the Russians, who had long been allied with the Serbs and saw the region as being in their own sphere of influence. By underestimating Russian ambitions and failing to sufficiently negotiate with Moscow ahead of time, the Americans overreached in Kosovo, adding to Russian grievances and later tensions between the U.S. and the (still nuclear-armed) former superpower.
Once again, however, the most influential and meaningful Clinton-era foreign policy crises unfolded in the tense Middle East, specifically around the Persian Gulf. Islamist terrorist attacks on American targets at home and in the region increased. Two embassies in Africa, a U.S. naval vessel in Yemen and the World Trade Center in New York City all were bombed between 1993 and 2000. Most attacks were claimed by Osama bin Laden’s al-Qaida terror network, filled with veterans of the Afghanistan War who had once been on the CIA payroll. The term “blowback” aptly characterized the situation. Bin Laden’s attacks should hardly have come as a surprise. He had literally declared war, in writing, on the United States, citing three grievances that he said justified attacks on American targets: the continued U.S. military presence near the Saudi holy sites of Mecca and Medina; reflexive, one-sided U.S. support for Israeli military occupation of Palestine; and strict U.S. economic sanctions on Iraq, still in place after the Persian Gulf War of 1991 and which, he accurately said, had resulted in the deaths of half a million children. His assertions would find agreement among many across the world. Clinton’s team, headed by Secretary of State Madeleine Albright, didn’t even deny bin Laden’s final charge. In a moment of (admittedly callous) honesty, Albright, when asked whether the deaths of all those Iraqi children were “worth it,” replied, on air, that “yes, the price, we think the price is worth it.” In addition to the crippling sanctions, Clinton continued, even escalated, the Bush policy of regularly bombing Iraqi antiaircraft positions, intelligence headquarters and other targets in response to supposed Iraqi development of weapons of mass destruction and alleged involvement in a 1993 assassination attempt against George H.W. Bush in Kuwait. Hundreds of civilians were killed in these frequent attacks.
By 2001, Bush I and Clinton had locked Saddam Hussein’s regime in a box. Saddam presented no serious threat to U.S. interests, his forces were regularly bombed—especially in 1998-2000—and, whether or not Clinton intended it, the stage was set for a later U.S. military regime-change invasion of Iraq. Some critics at the time charged that military strikes on a (supposedly) bin Laden-linked Sudanese pharmaceutical plant, al-Qaida training sites in Afghanistan, Iraqi positions and Serbian troops in Kosovo represented little more than attempts by Clinton to distract the American public from his ongoing (mainly sexual) personal scandals of the same period. No doubt his affairs (and general domestic policy) did intersect to some extent with matters of foreign war and peace. Such intersections have not been rare in American politics.
A Question of Character: Clinton’s Scandals
Much about Clinton’s soap opera-like scandals was overblown. They nevertheless dominated media coverage of his administration. This obsessive, almost perverse, focus on Clinton’s private life proved, in time, to be both farcical and absurd. Republicans, mimicking the liberal-led investigation of Richard Nixon’s Watergate scandal, kept Clinton under nearly perpetual legal scrutiny during his two terms by using a new law that allowed the appointment of special prosecutors to investigate presidents. For the most part, Kenneth Starr, eventually a celebrity as special prosecutor, would uncover very little. Moreover, he probably should never have been appointed. A partisan solicitor general under George H.W. Bush, Starr in private practice had worked on the legal team of one of Clinton’s accusers, Paula Jones, and this clearly represented a conflict of interest in terms of his appointment as special prosecutor.
Republican watchdogs first went after a former Clinton real estate deal, dubbed Whitewater, but found mostly smoke and no fire. Regrouping, Starr investigated allegations of sexual harassment brought against the president by a former Arkansas state employee, Jones. After a federal investigation lasting four years and costing $40 million, a judge dismissed the Jones case as a “nuisance.” Flailing, Starr shifted attention to allegations that Clinton had, while president, engaged in an extramarital affair with a young White House intern, Monica Lewinsky. The Lewinsky case would dominate the last phase of Clinton’s presidency. It turned out that the charges had merit. The president had received oral sex on numerous occasions, even in the Oval Office, from Lewinsky, then proceeded to lie about it under oath; eventually, he would publicly dispute the definition of “is” and equivocate about what constituted “sex.” It was a remarkably juvenile reaction from a sitting president.
Nonetheless, both sides retreated to their familiar battle stations. The Republican majority in Congress filed articles of impeachment. The House would vote to impeach, but in a mainly partisan vote the Senate refused to remove Clinton. The notion that lies about an adulterous affair rose to the level of the scandal and illegality of Reagan’s Iran-Contra affair was laughable, but Republican legislators and right-wing Fox News cared not and proceeded with a remarkable lack of self-awareness. Nor were liberals consistent. In fact, many acted hypocritically and embarrassingly. Avowed feminists who had rallied to Anita Hill and fully believed her allegations of sexual harassment against Clarence Thomas now offered full-throated defenses of Clinton. A leading feminist of the day, Betty Friedan, claimed that the president’s “enemies are attempting to bring him down through allegations about some dalliance with an intern,” adding, “Whether it’s fantasy, a set-up, or true, I simply don’t care.” The whole spectacle was symptomatic of a dangerously partisan era as both liberals and conservatives reflexively assembled along party lines regardless of ideological consistency or proper context.
Whatever the negative effects on the republic and the office of the presidency, the Lewinsky affair proved to be a media boon. Fox News had its best ratings to date, and more centrist (CNN) and patently liberal (MSNBC) cable news channels also flourished and raked in profits. However, when all was said and done, the Republicans had overreached. Most Americans (63 percent) disapproved of the attempt to remove the president, and the week after Clinton’s impeachment his approval ratings hit a new high of 73 percent. (Years later, Newt Gingrich, House speaker from 1995 to 1999 and a ringleader of the impeachment faction, admitted he was having an extramarital affair with a congressional aide 23 years his junior while he was pressing for the impeachment of Clinton.) If the Moral Majority had seemed to gain traction and influence in preceding years, the public response—ultimately a lack of concern—regarding Clinton’s affair and lies demonstrated that a more permissive cultural environment had now arisen.
Was Clinton, as the first lady would later claim, the victim of a “vast right-wing conspiracy”? Yes and no. Republican legislators and pundits no doubt exaggerated Clinton’s crimes and the severity of his character failures. They had shown no such concern when President Reagan’s staff illegally and secretly sold arms to Iran, laundered the profits through Israel and funded Contra death squads in Nicaragua. Still, perhaps Americans should expect their president to be truthful under oath and demonstrate mature judgment. Clinton’s scandals were neither as serious as the right clamored nor as slight as the left insisted. More than simply a “right-wing conspiracy,” the obsessive coverage of Clinton’s personal life reflected the times—and not in a flattering way.
The wide exposure, to the detriment of policy and international matters, reflected Americans’ growing obsession with celebrity, scandal and sex. The media statistics were genuinely embarrassing. By one count in March 1994, the big three TV networks had run 126 stories about Whitewater and other alleged scandals over the previous three months, compared with 107 on bloodshed in Bosnia, 56 on tensions in the Mideast and 42 on health care reform! Media had, once and for all, morphed into an institution geared toward entertainment over information. As for the populace of the United States, Americans had allowed themselves to be overtaken by voyeurism fed by the indefensible actions of the president and the self-serving ways of his conservative detractors. When it came to the Whitewater, Paula Jones and Monica Lewinsky affairs, there were no adults in the room.
* * *
Bill Clinton was many things, his administration an exercise in cognitive dissonance. He was (notionally) a Democrat but oversaw a shrinking federal debt and several years of budget surplus after his Republican predecessors had paradoxically produced the opposite. The economy of the late 1990s boomed, but its beneficiaries were hardly representative of the nation at large. The president cut welfare, imprisoned hundreds of thousands of (often black) citizens and helped send manufacturing jobs overseas or across the southern border. The candidate who had promised universal health care and the right of gays to serve in the military failed miserably on the former and abandoned the latter, even signing legislation that denied federal recognition to marriages between loving same-sex adults. Perhaps, in part, because of the economic downturn and the international turmoil that occurred under his successor, George W. Bush, many Americans later would pine for the simpler, affluent (for some) era of Clinton, sex scandals and all; however, much of this happy memory was illusory.
Clinton left office a popular president, but he had hardly empowered the Democrats writ large. His anointed successor, Vice President Al Gore, unlike George H.W. Bush after Reagan, would not win the presidency (though he won the popular vote), and Bush the Elder’s more conservative son ascended to power. Nevertheless, Clintonism lived on in the spirit of the many congressional copycat centrists then dominant on the Hill, and, most distinctly, in his wife, the ambitious, jaded and relentless Hillary Rodham Clinton. Her own rise, first to the U.S. Senate and then grasping at the very height of American power, would, in many ways, define the “liberalism” of the era to come. The tribal, partisan culture wars hardly abated when Bill left power, and the Clintons, peculiarly, remained the favorite scapegoat of the right. The American people as a whole possessed only the illusion of improvement after eight years of Bill and Hill. The aspirations and dreams of liberalism—once the consensus force in American politics—were ditched by the Clintons in the name of power and money. It would take many more years of domestic and international disaster, and a new, insurgent, progressive generation to strike back and contest ownership of the true mantle of liberalism. The outcome of that story remains uncertain.
* * *
To learn more about this topic, consider the following scholarly works:
• Gary Gerstle, “American Crucible: Race and Nation in the 20th Century” (2001).
• Kevin M. Kruse and Julian E. Zelizer, “Fault Lines: A History of the United States Since 1974” (2019).
• Jill Lepore, “These Truths: A History of the United States” (2018).
• James T. Patterson, “Restless Giant: The United States From Watergate to Bush v. Gore” (2005).
• Howard Zinn, “The Twentieth Century” (1980).
Danny Sjursen, a regular contributor to Truthdig, is a retired U.S. Army officer and former history instructor at West Point. He served tours with reconnaissance units in Iraq and Afghanistan. He has written a memoir and critical analysis of the Iraq War, “Ghost Riders of Baghdad: Soldiers, Civilians, and the Myth of the Surge.” He lives in Lawrence, Kan. Follow him on Twitter at @SkepticalVet and check out his podcast, “Fortress on a Hill,” co-hosted with fellow vet Chris “Henri” Henrikson.

July 19, 2019
Our Last Shot at Democracy

“Democracy May Not Exist, but We’ll Miss It When It’s Gone”
A book by Astra Taylor
In Anna Burns’s Man Booker Prize-winning novel, “Milkman,” the main character—“middle sister”—learns about two-thirds of the way through the book that she has been defined as a “beyond-the-pale” in her community. This status comes not as a result of her attempts to avoid the sexual stalking of a local paramilitary leader (“Milkman”), but instead as a result of her habit of walking while reading. “Longest friend” tells her that her behavior is “not natural”; it is “disturbing,” “deviant,” “[n]ot public-spirited,” “[n]ot self-preservation.” Walking while reading, her friend tells her, is an activity that is “incapable of being mentally grasped, of being understood.” She is “[going] around in a political scene” with her “head switched off.”
Of course, middle sister’s head is very much switched on, but not in the way her community demands. Rather than using her head to defer to what Astra Taylor calls “constraining common sense,” middle sister goes about in public letting her body and mind inform each other, opening up new possibilities for thought and movement.
Burns’s book is my example, not Taylor’s, but “walking while reading” is as good a description as any for the kind of democratic citizenship Taylor advocates in her magnificent, paradigm-shifting new book, “Democracy May Not Exist, but We’ll Miss It When It’s Gone.” Taylor’s book challenges the very old idea that the demos is the “belly” of the polity, which depends on the “head” of elites to direct and guide it. This idea, which goes back at least to Plato, continues to inhabit the majority of recent books on democracy, which tend to look to institutions, experts, and “norms” to address the rise of right-wing leaders, rather than to the people. Taylor does something very different: she looks to democratic actors as democratic thinkers. Democratic movements, Taylor notes, have always had “strong pedagogical components,” from the Knights of Labor’s emphasis on self-education, to the Freedom Schools of the Civil Rights movement, to the volunteer-run libraries of the Occupy movement, among many other examples. Even as the people in these movements “walked”—protesting, congregating, occupying, striking—they also read and thought.
In our society, as in the quasi-mythical (para)military society of “Milkman,” walking-while-reading is threatening to the powerful (and to those who identify with them) because both groups rely on hierarchies of knowledge to constrain our actions and imagination. In Burns’s “Milkman,” “longest friend” uses the dominant notion of “public spirited” to discipline middle sister, and walking while reading—as an individual practice—turns out to have little power: middle sister is ultimately reintegrated into patriarchal and social norms. Similarly, in our society, a dominant narrow idea of democracy—as a set of electoral institutions, norms, and rights—teaches us to see popular participation, action, and movements as signaling a “crisis” in democracy rather than democracy itself. Yet if these dominant ideas help economic and political elites keep their outsize role in the political process, then Taylor’s book invites us to consider how learning to read while walking together may be one way of starting to democratize political power.
¤
Many recent books chart a narrative of democratic decline, then call for the reinstatement of norms or institutions that might re-democratize us. By contrast, Taylor describes democracy as perpetually changing and unfolding; this is “as much a cause for jubilation as despair,” she argues, because it “remind[s] us that we are part of a long, complex, and still-unfolding chronicle, whatever the day’s headlines might be or whoever governs the country.” This idea of democracy powerfully draws on, even as it innovates, a strain of democratic thinking within the academy (by figures like Bonnie Honig, Wendy Brown, and Jason Frank). This strain sees democracy as always imperfect, and constituted by participatory contestation—not for its own sake, but because contestation is how the marginalized are able to challenge an always exclusive consensus and remake their institutions to be more equitable and empowering.
The problem right now, as Taylor shows us in lucid and deep detail, is that we live in a society designed to disempower most people and to render their lives precarious and meaningless. We live, in other words, in a society that works to keep the people from being able to negotiate democratic dilemmas in a democratic way. Our institutions are built, Taylor argues, to segregate and isolate, to enable voter suppression and limit political participation, to train most people for the “servile arts” instead of the liberal arts, to ransack the planet for the sake of the enrichment of a few, and to keep most people in precarious economic situations while some elites become wildly wealthy. In such a de-democratized society, most people are kept from having the tools to think freely, live equally, and exercise power in public affairs. In other words, most people are deprived of the status of citizen, understood as a category that must be enacted, taken, and demanded, not legally conferred. Due to the paucity of opportunities to act democratically, Taylor argues that we also have trouble reflecting on what democracy is or should be.
For Taylor, democratizing our society is thus not just about saving certain liberal procedures, norms, and institutions—indeed, some of those procedures and norms are likely part of the “constraining common sense” that limits our ability to act and think more democratically. For her, democratizing our society demands that we save democracy from capitalism by creating greater economic equality, making public goods like college and health care free, aggressively regulating and taxing fossil fuel companies, and radically reducing popular indebtedness. Looking to the writings, speech, and actions of democratizing movements, she shows that they consistently demand economic equality as necessary to the exercise of free democratic governance. Without it, the economically powerful will inevitably hold disproportionate political power and use it—as history has shown—to favor their narrow interest.
In making this argument, Taylor joins a chorus of recent intellectuals (including Naomi Klein, Keeanga-Yamahtta Taylor, and Corey Robin) who insist that economic equality is vital to freedom, social justice, and democracy. Writers like these are doing important work in this political moment, reinvigorating the leftist intellectual imagination. As Taylor and others show (Nancy MacLean’s “Democracy in Chains” comes to mind here), the right wing recognized the power of big ideas to shape public life just as the left was turning away from those big ideas toward the end of the ’70s. This was, as Taylor rightly says, a deeply consequential mistake that left a huge void, where the only ideas of “freedom,” “equality,” and “democracy” on offer in the public realm were narrowly defined in terms of the market, parties, and electoral institutions. Taylor helps to fill that void by offering a major left rethinking of democracy for a general audience. Her book is complex but also deeply accessible and lucidly written. It is, in other words, a democratic book about democracy.
¤
But Taylor doesn’t just seek to explain democracy to the public; instead, she reimagines democratic thinking from the ground up, looking to social movements and everyday political actors for their philosophical insights. The perennial fear of elites is that if the people really hold power, we will end up with total disorder, but Taylor shows that this fear has always been a product of the elite imaginary, not reality. In “The Republic,” Plato articulated this fear through the parable of the mutinous ship; Taylor counters by showing that the most democratic of ships, the pirate ship, was actually extremely equitable and well governed. Taking us through histories of democratic movements, and putting the voices of political thinkers on equal footing with democratic citizens, Taylor shows that when we enact democracy democratically, the result is not disorder, but an ongoing practice of “living in tensions.” Democracy, she argues, will always be a mix of inclusion and exclusion, expertise and popular action. For her, the practice of democracy is not about designing ideal democratic institutions (a tendency of most democratic theory), but instead about becoming comfortable with looking to our fellow citizens as we wrestle with political dilemmas and battle for the most democratic society we can.
Taylor’s emphasis on thinking democracy—on democratizing public philosophy—is perhaps the book’s most exciting intervention. Her argument calls to mind the political theorist James Tully’s case for democratizing philosophy in his two-volume “Public Philosophy in a New Key.” But Taylor is not primarily interested in remaking academic political theory and philosophy (charged, as Tully argues it should be, with the task of creating “toolkits” for activists). Rather, she is concerned with making philosophy part of democracy, pulling it well out of the academy and into the streets (where, she shows, it has always been anyway). Taylor, in other words, wants us to stop seeing thinking as the realm of experts—and thus opposed to democracy—and instead to see democracy as itself a practice of action and thinking, of active reflection and reflective action.
Taylor’s position could appear too close for comfort to those she criticizes. She could be seen as saying that intelligence is necessary to democratic governance, a position that comes close to the argument for a meritocracy that always elevates certain kinds of intelligence over others (e.g., Pete Buttigieg). But Taylor is carving out a distinct position: that thinking and philosophy (not “intelligence,” a 20th-century category that correlates with the sorting techniques of standardized testing and machine learning) are important to democracy. And they are important not because they offer solutions to our problems (although sometimes they might), but because they help us listen to each other, chart new paths, constitute power, and make hard judgments and decisions.
Consider, for example, the ideas of freedom that black children offered during “Freedom Summer” (that Taylor quotes from Eric Foner’s account): “‘going to public libraries’; ‘standing up for your rights’; ‘having power in the system’; a ‘state of mind.’” This more “multidimensional conception of freedom” depends on political power and economic equality, emerges out of democratic reflection, and challenges powerful interests. What might happen if we create economic and political conditions where these views of freedom can be taken as seriously as the voices of professors and pundits? Expert knowledge is always important, but we need, Taylor says, to democratize our concept of expertise, to see everyone as equally capable thinkers and actors, who have different kinds of knowledge and experience to bring to the table.
¤
Yet how can individuals who are segregated, isolated, and disempowered feel empowered to begin this project, to change the economic and political conditions of our society, to begin walking and reading together? Perhaps I am wishing that Taylor had added one more chapter to address this problem, which is, in my view, also endemic to democracy. When people are systematically isolated and disempowered, when the dominant ideas of their society teach them that coming together will endanger or further isolate them rather than empower them, how can they forge associations, collective groups, consciousness-raising groups, or something new to democratize their society? How do disempowered individuals become empowered collectives?
This brings me back to middle sister, who never gets there. Toward the end of the book, Milkman is killed in an explosion (this is not a spoiler, you learn this at the very outset of the novel). Middle sister starts walking again—but not walking while reading—and she is reintegrated into her community. In the final pages, she goes running with “third brother-in-law” who had given her a long lecture on the problems with walking while reading earlier in the novel, and “almost laughs.” But this is not a happy ending. Rather, middle sister’s almost laughter reflects that she has been reabsorbed into communal norms of what feminist thinker Sara Ahmed would call “happiness,” that feeling of almost contentment at having achieved, or being seen as at work achieving, what white bourgeois society portrays as its ideal: being married, having children, owning a home, earning a stable income, et cetera. For Ahmed, the ideal of happiness is deeply depoliticizing. It pulls people away from public life and into the domestic realm, denigrates racialized forms of intimate and public life that do not conform to bourgeois ideals, and teaches us to see any problems we have as signs of a disordered or unhappy domestic life rather than as signs of political problems that need to be democratically addressed. In such a depoliticized society, moving from this isolated disempowerment to collective empowerment can feel dangerous, unpleasant, and sometimes (as Lori Jo Marso argues) perverse.
One way that disempowered people can begin to form collectivity is by noticing that they have already started, that they are actually already in a more collective situation—with more power—than they realized. For example, they might notice that their intimate or workplace conversations are actually about political problems that could be collectively resisted. Taylor’s book moves us toward this condition by writing to us as readers who are already reading while walking. Taylor’s deep and wide examination of democratic movements, conversations, and grassroots institutions makes the reader feel as if they are already part of this conversation, as if by reading the book, they have already started to engage in the practice of being a democratic citizen. She speaks to the reader on the level of an equal, and the reader starts to feel as if they could speak back. To put it differently, reading Taylor’s book makes one think democratically, but this thinking also invigorates one’s everyday movements, and makes one start to feel democracy as a pleasure of thinking and acting.
Taylor’s closing lines offer an image of how we might understand this pleasurable, philosophical practice of democracy. “Instead of founding fathers,” she says, “let us aspire to be perennial midwives, helping always to deliver democracy anew.” In invoking midwives, Taylor draws on a fertile symbol of democratized knowledge and explicitly draws out a feminist impulse that orients the book as a whole. Historically, midwives’ expertise was pushed aside by the rise of professionalized medicine (dominated by men). But Socrates also described himself as a midwife, helping to deliver truth. Taylor’s invocation of midwives at the end is thus nicely fitting: she suggests that democracy is something that demands everyone’s expertise, knowledge, and thinking—even and especially the knowledge of those who have been historically excluded.
Yet ultimately, Taylor does not ask us to—like Socrates—deliver “truth,” but, instead, to create the conditions for a more democratic society and to prepare ourselves to negotiate its inevitable dilemmas. Taylor’s book calls us to read like citizens, but it also shows us that we are already learning this practice from each other. “Democracy may not exist and yet it still might.”
This review originally appeared on the Los Angeles Review of Books

Besieged Puerto Rico Governor Goes Quiet Amid Protests
SAN JUAN, Puerto Rico—In the Spanish colonial fortress that serves as his official residence, Puerto Rico Gov. Ricardo Rosselló is under siege.
Motorcyclists, celebrities, horse enthusiasts and hundreds of thousands of ordinary Puerto Ricans have swarmed outside La Fortaleza (The Fort) in Old San Juan this week, demanding Rosselló resign over a series of leaked online chats insulting women, political opponents and even victims of Hurricane Maria.
Rosselló, the telegenic 40-year-old son of a former governor, has dropped his normally intense rhythm of public appearances and gone into relatively long periods of near-total media silence, intensifying questions about his future.
For much of his 2 1/2 years in office, Rosselló has given three or four lengthy news conferences a week, comfortably fielding question after question in Spanish and English from the local and international press. And that’s on top of public appearances, one-on-one interviews and televised meetings with visiting politicians and members of his administration.
But since July 11, when Rosselló cut short a family vacation in France and returned home to face the first signs of what has become an island-wide movement to oust him, the governor has made four appearances, all but one in highly controlled situations.
New protests began Friday afternoon, with unionized workers organizing a march to La Fortaleza from the nearby waterfront. Horseback riders joined them with a self-declared cavalry march, while hundreds of other people came from around the city and surrounding areas. A string of smaller events was on the agenda across the island over the weekend, followed by what many expected to be a massive protest on Monday.
The chorus calling for Rosselló’s resignation was joined Friday by Puerto Rico’s non-voting member of Congress, Jenniffer González; U.S. Sen. Rick Scott of Florida; and New York congresswomen Nydia Velázquez and Alexandria Ocasio-Cortez.
The crisis has even cut back Rosselló’s affable online presence. The governor normally started every day by tweeting “Good morning!” to his followers around 5 a.m. The last such bright-and-early message came on July 8. The tweets from his account have dwindled to a trickle since then, and each one is met by a flood of often-abusive responses from Puerto Ricans demanding he resign.
Rosselló’s secretary of public affairs, Anthony Maceira, told reporters Friday that the governor was in La Fortaleza working on signing laws and filling posts emptied by the resignations of fellow members of the leaked chat group.
The head of Rosselló’s pro-statehood political party said a meeting of its directors had been convened for coming days, although the agenda was not disclosed beyond “addressing every one of the complaints of our colleagues.”
Rosselló held a press conference on July 11 to address the arrest of two of his former department heads on federal corruption charges. He also asked the people of Puerto Rico to forgive him for a profanity-laced and at times misogynistic online chat with nine other male members of his administration, short selections of which had leaked to local media. Two days later, at least 889 pages of the chat were published by Puerto Rico’s award-winning Center for Investigative Journalism, and things got much, much worse for Rosselló.
In the chats on the encrypted messaging app Telegram, Rosselló calls one New York female politician of Puerto Rican background a “whore,” describes another as a “daughter of a bitch” and makes fun of an obese man he posed with in a photo. The chat also contains vulgar references to Puerto Rican star Ricky Martin’s homosexuality and a series of emojis of a raised middle finger directed at a federal control board overseeing the island’s finances.
The next day, Sunday, Rosselló appeared in a San Juan church and asked the congregation for forgiveness, without informing the press. The church broadcasts its services online, however, and his remarks became public. On Monday, July 15, Rosselló gave a notably non-confrontational interview to a salsa music radio station. The governor’s spokesman said the questions had been “negotiated” between Rosselló’s press team and the station. That night, thousands swarmed Old San Juan to demand his resignation.
On July 16, Rosselló held a press conference and faced aggressive questioning about the chat scandal and the corruption arrests. Later that day, an ally tweeted a photo of Rosselló embracing Wilfredo Santiago, an obese man whom the governor had mocked in one of the most infamous sections of the chat.
Since then, it’s been silence. There has been a handful of tweets, press releases and statements, some saying he won’t resign but mostly about purportedly routine meetings of administration officials.
His official spokespeople aren’t answering many questions, and even his whereabouts are mostly unknown.
Rosselló was raised in the public eye, as the youngest son of Pedro Rosselló, who served as governor from 1993 to 2001. One of Puerto Rico’s most charismatic and controversial governors, the elder Rosselló launched a string of large-scale infrastructure projects that swelled the public debt and led to the bankruptcy that his son has inherited.
Known widely as Ricky, the younger Rosselló started his political career in his father’s pro-statehood New Progressive Party. Trained in biomechanical engineering at the Massachusetts Institute of Technology, University of Michigan and Duke, he launched his campaign for governor in 2015 with little previous history of public service.
Deflecting questions about whether he owed his success to his connections, Rosselló portrayed himself as an affable technocrat with solutions to Puerto Rico’s debt and crumbling infrastructure. He defeated David Bernier of the Popular Democratic Party, which advocates greater Puerto Rican autonomy from the mainland United States, by less than 3% of the total votes cast.
Until now, Rosselló’s greatest challenge was Hurricane Maria, a Category 4 storm that struck Puerto Rico on Sept. 20, 2017, destroying the island’s power and communications systems. Rosselló came under heavy criticism for mismanaging the crisis, particularly for understating the deaths from the storm. While some of his deputies were vilified, Rosselló seemed to emerge relatively unscathed, perhaps due to his friendly and non-confrontational manner with critics, opponents and journalists alike.
The father of two young children, he often posts their photos online, along with images of his wife and their two rescue dogs, a Siberian Husky and a Yorkshire Terrier. Rosselló once halted a press conference to help local journalists move their equipment out of the rain.
Among the greatest shocks of the leaked chats for many Puerto Ricans was the puncturing of that image of low-key charm by the gross misogyny of the online conversations.
“He was making an effort, carrying out his governor’s role,” said Jessica Castro, a 38-year-old San Juan resident attending a Friday evening protest with her family. “He was mocking everyone behind their backs, the people who believed in him. People are really disillusioned. He’s got to go.”
