Lily Salter's Blog, page 953

November 15, 2015

America is still paying for its wars: The enduring catastrophes of Iraq and Afghanistan

Let’s begin with the $12 billion in shrink-wrapped $100 bills, Iraqi oil money held in the U.S. The Bush administration began flying it into Baghdad on C-130s soon after U.S. troops entered that city in April 2003. Essentially dumped into the void that had once been the Iraqi state, at least $1.2 to $1.6 billion of it was stolen and ended up years later in a mysterious bunker in Lebanon. And that’s just what happened as the starting gun went off. It’s never ended.

In 2011, the final report of the congressionally mandated Commission on Wartime Contracting estimated that somewhere between $31 billion and $60 billion in taxpayer dollars had been lost to fraud and waste in the American “reconstruction” of Iraq and Afghanistan. In Iraq, for instance, there was that $75 million police academy, initially hailed “as crucial to U.S. efforts to prepare Iraqis to take control of the country's security.” It was, however, so poorly constructed that it proved a health hazard. In 2006, “feces and urine rained from the ceilings in [its] student barracks” and that was only the beginning of its problems. When the bad press started, Parsons Corporation, the private contractor that built it, agreed to fix it for nothing more than the princely sum already paid. A year later, a New York Times reporter visited and found that “the ceilings are still stained with excrement, parts of the structures are crumbling, and sections of the buildings are unusable because the toilets are filthy and nonfunctioning.” This seems to have been par for the course. Typically enough, the Khan Bani Saad Correctional Facility, a $40 million prison Parsons also contracted to build, was never even finished.

And these were hardly isolated cases or problems specific to Iraq. Consider, for instance, those police stations in Afghanistan believed to be crucial to “standing up” a new security force in that country. Despite the money poured into them and endless cost overruns, many were either never completed or never built, leaving new Afghan police recruits camping out. And the police were hardly alone. Take the $3.4 million unfinished teacher-training center in Sheberghan, Afghanistan, that an Iraqi company was contracted to build (using, of course, American dollars) and from which it walked away, money in hand.

And why stick to buildings, when there were those Iraqi roads to nowhere paid for by American dollars? At least one of them did prove useful to insurgent groups moving their guerrillas around (like the $37 million bridge the U.S. Army Corps of Engineers built between Afghanistan and Tajikistan that helped facilitate the region's booming drug trade in opium and heroin). In Afghanistan, Highway 1 between the capital Kabul and the southern city of Kandahar, unofficially dubbed the “highway to nowhere,” was so poorly constructed that it began crumbling in its first Afghan winter.

And don’t think that this was an aberration. The U.S. Agency for International Development (USAID) hired an American nonprofit, International Relief and Development (IRD), to oversee an ambitious road-building program meant to gain the support of rural villagers. Almost $300 million later, it could point to “less than 100 miles of gravel road completed.” Each mile of road had, by then, cost U.S. taxpayers $2.8 million, instead of the expected $290,000, while a quarter of the road-building funds reportedly went directly to IRD for administrative and staff costs. Needless to say, as the road program failed, USAID hired IRD to oversee other non-transportation projects.

In these years, the cost of reconstruction never stopped growing. In 2011, McClatchy News reported that “U.S. government funding for at least 15 large-scale programs and projects grew from just over $1 billion to nearly $3 billion despite the government's questions about their effectiveness or cost.”

The Gas Station to Nowhere

So much construction and reconstruction -- and so many failures. There was the chicken-processing plant built in Iraq for $2.58 million that, except in a few Potemkin-Village-like moments, never plucked a chicken and sent it to market. There was the sparkling new, 64,000-square-foot, state-of-the-art, $25 million headquarters for the U.S. military in Helmand Province, Afghanistan, that doubled in cost as it was being built and that three generals tried to stop. They were overruled because Congress had already allotted the money for it, so why not spend it, even though it would never be used? And don’t forget the $20 million that went into constructing roads and utilities for the base that was to hold it, or the $8.4 billion that went into Afghan opium-poppy-suppression and anti-drug programs and resulted in... bumper poppy crops and record opium yields, or the aid funds that somehow made their way directly into the hands of the Taliban (reputedly its second-largest funding source after those poppies).

There were the billions of dollars in aid that no one could account for, and a significant percentage of the 465,000 small arms (rifles, machine guns, grenade launchers, and the like) that the U.S. shipped to Afghanistan and simply lost track of. Most recently, there was the Task Force for Business Stability Operations, an $800-million Pentagon project to help jump-start the Afghan economy. It was shut down only six months ago and yet, in response to requests from the Special Inspector General for Afghanistan Reconstruction, the Pentagon swears that there are “no Defense Department personnel who can answer questions about” what the task force did with its money. As ProPublica’s Megan McCloskey writes, “The Pentagon’s claims are particularly surprising since Joseph Catalino, the former acting director of the task force who was with the program for two years, is still employed by the Pentagon as Senior Advisor for Special Operations and Combating Terrorism."

Still, from that pile of unaccountable taxpayer dollars, one nearly $43 million chunk did prove traceable to a single project: the building of a compressed natural gas station. (The cost of constructing a similar gas station in neighboring Pakistan: $300,000.) Located in an area that seems to have had no infrastructure for delivering natural gas and no cars converted for the use of such fuel, it represented the only example on record in those years of a gas station to nowhere.

All of this just scratches the surface when it comes to the piles of money that were poured into an increasingly privatized version of the American way of war and, in the form of overcharges and abuses of every sort, often simply disappeared into the pockets of the warrior corporations that entered America’s war zones. In a sense, a surprising amount of the money that the Pentagon and U.S. civilian agencies “invested” in Iraq and Afghanistan never left the United States, since it went directly into the coffers of those companies.

Clearly, Washington had gone to war like a drunk on a bender, while the domestic infrastructure began to fray. At $109 billion by 2014, the American reconstruction program in Afghanistan was already, in today's dollars, larger than the Marshall Plan (which helped put all of devastated Western Europe back on its feet after World War II) and still the country was a shambles. In Iraq, a mere $60 billion was squandered on the failed rebuilding of the country. Keep in mind that none of this takes into account the staggering billions spent by the Pentagon in both countries to build strings of bases, ranging in size from American towns (with all the amenities of home) to tiny outposts. There would be 505 of them in Iraq and at least 550 in Afghanistan. Most were, in the end, abandoned, dismantled, or sometimes simply looted. And don’t forget the vast quantities of fuel imported into Afghanistan to run the U.S. military machine in those years, some of which was siphoned off by American soldiers, to the tune of at least $15 million, and sold to local Afghans on the sly.

In other words, in the post-9/11 years, “reconstruction” and “war” have really been euphemisms for what, in other countries, we would recognize as a massive system of corruption. And let’s not forget another kind of “reconstruction” then underway. In both countries, the U.S. was creating enormous militaries and police forces essentially from scratch to the tune of at least $25 billion in Iraq and $65 billion in Afghanistan. What’s striking about both of these security forces, once constructed, is how similar they turned out to be to those police academies, the unfinished schools, and that natural gas station. It can’t be purely coincidental that both of the forces Americans proudly “stood up” have turned out to be the definition of corrupt: that is, they were filled not just with genuine recruits but with serried ranks of “ghost personnel.”

In June 2014, after whole divisions of the Iraqi army collapsed and fled before modest numbers of Islamic State militants, abandoning much of their weaponry and equipment, it became clear that they had been significantly smaller in reality than on paper. And no wonder, as that army had enlisted 50,000 “ghost soldiers” (who existed only on paper and whose salaries were lining the pockets of commanders and others). In Afghanistan, the U.S. is still evidently helping to pay for similarly stunning numbers of phantom personnel, though no specific figures are available. (In 2009, an estimated more than 25% of the police force consisted of such ghosts.) As John Sopko, the U.S. inspector general for Afghanistan, warned last June: "We are paying a lot of money for ghosts in Afghanistan... whether they are ghost teachers, ghost doctors or ghost policeman or ghost soldiers."

And lest you imagine that the U.S. military has learned its lesson, rest assured that it’s still quite capable of producing nonexistent proxy forces. Take the Pentagon-CIA program to train thousands of carefully vetted “moderate” Syrian rebels, equip them, arm them, and put them in the field to fight the Islamic State. Congress ponied up $500 million for it, $384 million of which was spent before that project was shut down as an abject failure. By then, fewer than 200 American-backed rebels had been trained and even fewer put into the field in Syria -- and they were almost instantly kidnapped or killed, or they simply handed over their equipment to the al-Qaeda-linked al-Nusra Front. At one point, according to the congressional testimony of the top American commander in the Middle East, only four or five American-produced rebels were left “in the field.” The cost per rebel sent into Syria, by the way, is now estimated at approximately $2 million. A final footnote: the general who oversaw this program is, according to the New York Times, still a “rising star” in the Pentagon and in line for a promotion.

Profli-gate

You’ve just revisited the privatized, twenty-first-century version of the American way of war, which proved to be a smorgasbord of scandal, mismanagement, and corruption as far as the eye could see. In the tradition of Watergate, perhaps the whole system could be dubbed Profli-gate, since American war making across the Greater Middle East has represented perhaps the most profligate and least effective use of funds in the history of modern warfare. In fact, here’s a word not usually associated with the U.S. military: the war system of this era seems to function remarkably like a monumental scam, a swindle, a fraud.

The evidence is in: the U.S. military can win battles, but not a war, not even against minimally armed minority insurgencies; it can “stand up” foreign militaries, but only if they are filled with phantom feet and if the forces themselves are as hollow as tombs; it can pour funds into the reconstruction of countries, a process guaranteed to leave them more prostrate than before; it can bomb, missile, and drone-kill significant numbers of terrorists and other enemies, even as their terror outfits and insurgent movements continue to grow stronger under the shadow of American air power. Fourteen years and five failed states later in the Greater Middle East, all of that seems irrefutable.

And here’s something else irrefutable: amid the defeats, corruption, and disappointments, there lurks a kind of success. After all, every disaster in which the U.S. military takes part only brings more bounty to the Pentagon. Domestically, every failure results in calls for yet more military interventions around the world. As a result, the military is so much bigger and better funded than it was on September 10, 2001. The commanders who led our forces into such failures have repeatedly been rewarded, and much of the top brass, civilian and military, though they should have retired in shame, have taken ever more golden parachutes into the lucrative worlds of defense contractors, lobbyists, and consultancies.

All of this couldn’t be more obvious, though it’s seldom said. In short, there turns out to be much good fortune in the disaster business, a fact which gives the whole process the look of a classic swindle in which the patsies lose their shirts but the scam artists make out like bandits. Add in one more thing: these days, the only part of the state held in great esteem by conservatives and the present batch of Republican presidential candidates is the U.S. military. All of them, with the exception of Rand Paul, swear that on entering the Oval Office they will let that military loose, sending in more troops, or special ops forces, or air power, and funding the various services even more lavishly; all of this despite overwhelming evidence that the U.S. military is incapable of spending a dollar responsibly or effectively monitoring what it's done with the taxpayer funds in its possession. (If you don’t believe me, forget everything in this piece and just check out the finances of the most expensive weapons system in history, the F-35 Lightning II, which should really be redubbed the F-35 Overrun for its madly spiraling costs.)

But no matter. If a system works (particularly for those in it), why change it? And by the way, in case you’re looking for a genuine steal, I have a fabulous gas station in Afghanistan to sell you...

Published on November 15, 2015 10:00

Let’s listen to Bill Maher: On Paris, religion and race, Maher walks a fascinating and tricky line

Bill Maher has made his mark as the comedian who refuses to toe the party line—any party’s line. He has come under attack by both the right and the left for his positions. This week’s show exemplifies his unflinching desire to muddy the waters of extremist thinking and get viewers to ask tough questions and refuse pre-packaged scripts.

He hit the spotlight after September 11 when he rejected the idea that the 9/11 attackers were cowards. Talking with conservative pundit Dinesh D’Souza, Maher stated: "We have been the cowards. Lobbing cruise missiles from 2,000 miles away. That's cowardly. Staying in the airplane when it hits the building. Say what you want about it. Not cowardly.” The comment cost him his ABC show. But he soon landed back on his feet with HBO for “Real Time with Bill Maher.”

This week’s show, which tackled both the Paris attacks and campus protests over racial discrimination, reminds us why Maher is a comedian we need to watch. In the wake of the crises on the campuses of the University of Missouri and Yale and on the heels of the Paris attacks, Maher rejected the fundamentalist thinking that often tends to frame these issues. With regard to the student protests, he attacks racism, but defends free speech. And in connection to the Paris attacks, he asks why liberals refuse to condemn the oppressive fundamentalism connected to the version of Islam practiced by terrorists.

While we might disagree with his positions, Maher makes some provocative points. Even more important, he asks viewers to resist intellectual extremism and dogmatic ideologies. This means that we can condemn Islamic extremism without condemning all Islamic people. And it means that we can fight structural racism while also wondering if the student protesters’ demands are reasonable.

Let’s be clear. Bill Maher can say some outrageous things. He once compared his dogs to “retarded children.” But it would be a mistake to dismiss his interventions because they come from a comedian known for being caustic and controversial. Again and again Maher is willing to ask the questions no one wants to ask. And one of his key themes is frustration over simple-minded responses to complex issues.

After opening with a sign of solidarity with the French people, he asked: How can we respond in a way that allows us to forcefully condemn the attackers while avoiding a full-scale condemnation of Islam? In an interview with Asra Nomani, Maher wonders why liberals “will not stand up against Sharia Law, which is the law in so many Muslim countries, which is the law of oppression?” Discussing extremist thinking with Nomani, he states, “I am absolutely sure that ISIS thinks that everything they do—every horrific crime, every atrocity—is an act of justice, and an act for god.”

Maher’s point is that there is an Islamic extremism that is real and the left has lost the vocabulary for speaking about it meaningfully. In an effort to avoid demonizing an entire religion, he argues, there has been silence on the very real threats of Islamic extremism. It is an issue that drives Maher nuts and it’s one that will immediately get him called out as an Islamophobe. Listening to Maher rant about liberals who are soft on terror, it might even seem like he is on the side of right-wing nut jobs like Ann Coulter. But he’s not on her side at all. The trouble is that in moments of crisis such nuance almost inevitably gets lost. He has an unfailing ability to stick his finger in our wounds and ask us why we are surprised that it hurts.

Speaking about the University of Missouri, Maher reminded viewers that the university’s president was “a clueless white guy” but “not a war criminal.” “The question I’m asking is, do we purge clueless people from their jobs. Is that where we are with the battle against racism? Maybe the answer’s yes.” For what it’s worth, the panel concluded that, yes, the firing was the right outcome.

He then turned to the Yale case. After quoting from an op-ed that said students were losing sleep, not going to class, skipping meals, and not doing homework as a result of the controversy, Maher characteristically asked, “About an email over a Halloween costume that doesn’t even exist? Over an email? Who raised these little monsters?”

When Maher takes the free-speech position on instances of hate speech, racism, and intolerance he always excites the right. Right-wing outlets like Breitbart will cite him as evidence that the struggles for social equality are a cover for intolerance. But they will only cite part of what he says. They will omit mentioning the part of his show where Maher explicitly goes after the idea of the white male as a victim. In a rant on rising suicide rates for white males, Maher stated: “It’s hard out there for a wimp, and that’s why tonight I’d like to remind white people of something very important they may have forgotten: You’re white, cheer the fuck up. Jesus, look at history. It’s always a great time to be white.” He went on to list examples of white privilege: “Cops don’t shoot you for having your hands in your pockets. When people follow you around a store, it’s because they want to help you find something. Major party presidential candidates aren’t proposing to deport you. You can walk through an entire wedding reception without anyone trying to order a drink from you. And how about this perk? If you’re white, you’re much more likely to be not in prison.” That’s the sort of thinking we will never see from Coulter.

In the same show Maher criticized some student protesters, praised others, and called out white privilege. In the same show he called liberals extremists for not going after Islamic extremism. It’s tricky terrain for comedy and it’s likely to get misunderstood. But Maher doesn’t care. If there is one ongoing passion in his work, it is that he won’t back down and he won’t make things easy. Maher’s trademark comedy refuses to be channeled easily into ideological silos. And whether we agree with him or not, his desire to ask tough questions and derail fundamentalist positions is a welcome intervention in a media landscape dominated by extremes.

Published on November 15, 2015 09:10

Jim Jones, deadly white savior: The tragic legacy of the Jonestown massacre

Religion Dispatches

White Nights, Black Paradise, a new novel available from Infidel Books this week, shares a vision of the Jonestown massacre that is rarely emphasized in mainstream literary narratives. One of the largest murder-suicides in modern history has lived on in the popular imagination as a uniquely American tragedy. But the legacy of the People’s Temple is, more particularly, a story of race and religion: 75 percent of the 918 people who perished at Jonestown in 1978 were African-American, and black people—especially black women—were the foot soldiers of that movement.

I recently met with author Sikivu Hutchinson on the campus of the University of Southern California in Los Angeles, where she’s currently a visiting scholar at the USC Center for Feminist Research. We talked about the founding of the People’s Temple, why so many African-Americans decided to follow Jim Jones, the “ultimate white savior,” until the bitter, unspeakable end, and what reverberations the tragedy holds for black religiosity today. This is not only Hutchinson’s first novel, but also the first to be written by an African-American woman on the topic. Hutchinson has long examined the intersections of race, gender and religion, and her other books include Moral Combat: Black Atheists, Gender Politics and the Value Wars and Godless Americana: Race and Religious Rebels. She has previously written for RD on the failed promise of Jonestown’s mixed-race utopia. The interview has been edited for length and clarity.

What was your motivation for writing a novel on the People’s Temple and the Jonestown massacre?

I wanted to frame the People’s Temple, its social justice trajectory and the initial involvement of African-American women, all the way from its beginnings in the Midwest. I wanted to explore what compelled them to get involved, stay involved and in some instances, go down with the ship. I see Jonestown as a cautionary tale in terms of why black women are so invested in and indebted to organized religion. I hadn’t seen that in any of the existing literature. I wanted to convey the complexity of their political alignment and the fact that you had African-Americans who were more Civil Rights-oriented, who were coming from the black church, social justice organizing perspective and the Temple represented that aspect for them. And you also have a revolutionary element of secularist, nonbelievers who were disillusioned by the fade-out of the Black Power movement. I wanted to know how the People’s Temple became validated within black people’s perspectives. Because that’s really what anchored the movement. It was the investment of everyday working-class and middle-class African-Americans but also the investment of black politicians, of black power brokers, of black activists.

You have said that Jim Jones fashioned himself as the “ultimate white savior.” Why do you think he felt this draw to proselytizing African-Americans, black women especially?

Part of his personal lore was that his father was a klansman, and he would tout that as his motivation for trying to align himself with African-Americans—and ultimately even identifying as African-American. He was the first white person in the state of Indiana to adopt a black child. He always had people of color around him in the early days, and he was quite vocal about pushing back against Jim Crow ideology. He was an orderly in a hospital that refused to treat black patients, and he protested against that. He also desegregated several theaters in Indianapolis. He was also very much opposed by the white Pentecostal power structure for bringing in African-Americans. All those elements were integral to how African-Americans were brought into the movement and why they stayed—because they saw this white man going to the barricades for them, identifying with them, having family members who looked like them and constantly making racial inclusion a part of the superficial cultural propaganda of the Temple.

When the People’s Temple moved to San Francisco in the 1970s, black communities were being disempowered and displaced. A lot of parasitic development was going on, and people were being pushed out of the city. So the Temple was at the forefront of organizing around those disruptions and pushing back, providing programs and social services to the community, needs that African-Americans felt black churches in the Bay Area weren’t meeting. The People’s Temple not only fills the breach created by the decline of the black church—as far as it being a movement organizer in social justice and a provider of social welfare—but also the breach caused by the decline of the Black Power movement and Civil Rights movement. There was also the family dynamic that the People’s Temple fostered where there would be several generations of a family connected to the Temple. You had all of these elements pushing African-Americans into the movement and anchoring them there—and that made it more difficult for people to leave the movement when all the abuse, harassment, and terrorism was occurring.

Despite the fact that the majority of the People’s Temple membership was black, Jim Jones chose mostly white women for leadership roles. Why did this happen?

Had there been prominent African-Americans who exercised real power in the church, then there would have been a lot more questions raised about the funding apparatus—in addition to the cult of personality that he had developed. Jones was a predator who preyed on people sexually but also mentally and emotionally. To reel them in there would be the evocation of the benevolent white man bucking the power structure of white supremacy, wholly invested in and identifying with blackness. He’s telling them we’re going to take all these Social Security payments, the property, the welfare benefits that our members are providing as part of their tithing and say we’re building this black nation, that that’s the ultimate goal. This was something that was internalized and accepted by a lot of the African-American members.

You’re careful to never describe the People’s Temple as a cult. Why is that?

It’s a demonizing term. It strips religious ideology of its complexity. It’s a term that relies on a dichotomy where the established religious orthodoxy is correct, and then there’s a certain set of religious beliefs, practices or movements that are somehow corrupt and debased because they are not coming from an Abrahamic tradition. And being an atheist, I question the Abrahamic traditions. Just because those are thousands of years old, just because they’ve been validated by scholars and religious leaders, that does not give them infallibility.

Why has it been so easy for society to erase the memory of race with regard to the People’s Temple, to think of it as more or less a white movement?

The truth of Jonestown has been lost because it’s been demonized to a certain extent. The impression was that you had these “zombie-esque acolytes running over there and bowing down to this white man.” It just creates a very sordid and unsavory picture. Also, it took place in San Francisco, a city that has always been mythicized as a white space. You had a white pastor, so many white bohemian liberal-to-radical members involved quite prominently and that becomes a big part of the ethos. That this is a “free love” church, that it was a new age-y kind of confab, when it’s really coming from the heart of Pentecostal, charismatic spirituality.

Black women were then and still are one of the most religious segments of the population. How is the white Jesus paternal figure they’re presented with in mainstream Christianity different from how Jim Jones presented himself?

The picture was different because Jones was very conscious about articulating his opposition to biblical dogma and rhetoric around the inhumanity of African-Americans and African bodies. He was up front in saying that the Bible is problematic when it comes to enfranchising and humanizing Africans. He was aware of the terrorist past of race relations in the United States, that Judeo-Christian reign within the United States has always undermined black agency and self-determination. So that was the key difference. He was also very vocal about his atheism, although that atheism was qualified over time. He began to say he was the true god, that he was the real deity and all those others were false deities that were racist and white supremacist. So he presented himself as an antidote to that, saying he loved the black people, he worshiped blackness, he had a black son, he was rooted in the black community with all these black politicians validating him. So he used all this as armor against the claim that he was just another white savior trying to hoodwink these downtrodden African-Americans.

On the other side of the coin, you have the increased visibility of the black atheist community. Is that growth we’re seeing today influenced by the way the Bible was historically used to justify the degradation of black people?

I absolutely think that is the motivation for many African-American nonbelievers. That the whole Curse of Ham lore, the justification of slavery, the justification of rape and commodification of black women’s bodies—all of that plays a big role in the embrace of atheism and secular humanism by African-American women. I believe that black atheism is expanding and getting more visible, certainly in terms of the virtual sphere. What remains to be seen is whether that is going to turn into anything broad or movement-based as far as African-American nonbelievers and secularists interfacing with other political movements.

Do you feel this moving away from churches as authority figures is represented in the Black Lives Matter movement?

Absolutely, you can see some of that in the “nones” demographic. There’s been an emergence of those folks who are not explicitly aligned with organized religion, aren’t involved in a church, aren’t coming from that traditional background of black church dogma—but who still explicitly identify as spiritual.

Could a tragedy of Jonestown’s scale still happen today?

The seeds of Jonestown are still there underneath the surface and given the grip that religiosity and charismatic spirituality have on scores of people, it could absolutely happen again. The severity of economic injustice and political disenfranchisement that really motivated black folks to get involved, stay involved and in some instances, decide to die, those conditions still exist and could still facilitate that kind of tragedy.

Published on November 15, 2015 08:00

November 14, 2015

Here’s what I’ve got for you, Kid: Lucky for my daughter, she’s half made up of her dad so the bad knees and slow metabolism aren’t a sure thing

A few months ago I paid $99 to spit into a cup. I sent the cup to a lab, and a few weeks later I got an email detailing my ancestral composition (mostly British, Irish and German. Absolutely no surprises there; I look, much to my dismay, like a less-murderous Ilse Koch). It was nice, having a tidy chart of my ethnic provenance, a map of my little store of DNA. I mean, it’s a dead boring chart, ethnicity-wise, but still.

I’ve passed this genetic bundle on to a new human. My daughter is, delightfully, a real 50-50 split between her father (Italian, Irish) and myself (Bitch of Buchenwald). She has my hair and eyes, and his full lips and upturned nose. Making a next step in a genetic through-line leads one to meditate on the nature of that through-line – of heritage, heritability, what it means to be of a people or from a place. In other words, what else is in this bundle I’m giving her?

Where I’m from is a non-starter in terms of my own identity, and certainly of hers. I grew up two towns east of Hartford, Connecticut (or, two units of dullness away from Dulltown, USA, if you prefer). I have no familial ties to the area – my parents both hail from Houston, and nobody in my family lives in Connecticut anymore. I left my town as quickly as I could and never looked back. I feel a pang, a sense of having missed out on something, when I hear people describe a sense of being from a place – like, you have a connection to the place where you were born? The sights and smells and rhythms of your home are a part of you, and not something to shed as quickly and ruthlessly as possible? That sounds nice. It is my hope, raising our child in Brooklyn, that she will feel an enthusiasm for her natal lands that I did not. It is entirely possible that she will disappear into the wilds of Montana to live out her days with only a Husky for companionship and curse us for raising her in the mass of humanity that is New York.

And what about ethnicity, anyway? My connections to the Old Countries have long been stretched past the breaking point; both sides of my family have been in Texas so long that, my parents having moved North, Texas itself is the “old country.” Like a lot of suburban white people, my folkways consisted of, more or less, going to the mall and watching television. Go back far enough in parts of my family, and you get into some business I am truly not looking forward to addressing with my daughter. I do not, in all seriousness, know how to tell her what kind of people she’s from. Boring, shading back into monsters? Will that do?

Other items in the grab bag: Bad knees. Slow metabolism. Stubby hands. Skin problems (already old hat over here; try Vanicream!). If she winds up with my frame, she’ll look pregnant her whole life. Oh, and a funhouse’s worth of mental disorders from schizophrenia to psychotic breaks to plain ol’ depression, BUT EVERYTHING IS GOING TO BE JUST FINE DON’T YOU WORRY. I mean, it skipped me, so far, but the game’s not over quite yet – there’s still time for me to be bundled off to the cuckoo’s nest. I hope if it happens when she’s an adult, she puts me somewhere nice.

What’s not in the bag? I recently bought Martha Stewart’s handbook of how to do all the things around one’s house. I bought this book because I know how to do none of the things. This is not because we had any kind of housekeeper (we did not), but because things were more or less left undone or done wrong – not in a hoarders-level kind of wrong, mind you, but with the exception of my father showing me how to sew on a button, the level of Important Life Skills being passed along was minimal at best. As a result I admire – worship, maybe – competence and skill in others; nothing pleases me more than watching somebody who knows what the hell they’re doing do a thing, whether it’s fixing a leak, hanging wallpaper or just playing pool. I crave this competence; I lack it sorely. But then, I’m only about a quarter of the way through the Martha book.

My daughter is, as I write this, only 3 – certainly nowhere near formed, but aspects of who she might be as a person are shining through. She loves arranging things in tidy rows, a neatnik (or possibly OCD) thing she definitely didn’t get from me – but I see the pride shining in her neatnik (OCD?) father’s eyes. We brought her recently to a book event I did at a bookstore that has a bar in it, and she was gloriously in her element. Hugging strangers at a bar? That’s my kid, all right.

I’m trying to up my game in hopes of having some skills – besides dumb arts-related stuff – to pass on to my daughter. We’re unlikely to become Doomsday Prepper-level DIYers, but I can make sure she has as full a complement of Basic Grown-up Life Skills as possible. We have a whole big beautiful city and a whole big beautiful world to give her, and my hope for her is that she will navigate it with grace and aplomb – especially the ugly parts. I can’t do much about the past, but I can raise her to be a kind and conscious person, aware of her privilege in the world and her duty to refit society in a way that levels the playing field. I can and will teach her to sew on a button. And lucky for her, she’s half made up of her daddy. I mean, he’s Canadian, after all.

Emily Flake is the author of "Mama Tried: Dispatches From the Seamy Underbelly of Modern Parenting."
The sights and smells and rhythms of your home are a part of you, and not something to shed as quickly and ruthlessly as possible? That sounds nice. It is my hope, raising our child in Brooklyn, that she will feel an enthusiasm for her natal lands that I did not. It is entirely possible that she will disappear into the wilds of Montana to live out her days with only a Husky for companionship and curse us for raising her in the mass of humanity that is New York.  And what about ethnicity, anyway? My connections to the Old Countries have long been stretched past the breaking point; both sides of my family have been in Texas so long that, my parents having moved North, Texas itself is the “old country.” Like a lot of suburban white people, my folkways consisted of, more or less, going to the mall and watching television. Go back far enough in parts of my family, and you get into some business I am truly not looking forward to addressing with my daughter.  I do not, in all seriousness, know how to tell her what kind of people she’s from. Boring, shading back into monsters? Will that do?  Other items in the grab bag: Bad knees. Slow metabolism. Stubby hands. Skin problems (already old hat over here; try Vanicream!). If she winds up with my frame, she’ll look pregnant her whole life. Oh and a funhouse’s worth of mental disorders from schizophrenia to psychotic breaks to plain ol’ depression, BUT EVERYTHING IS GOING TO BE JUST FINE DON’T YOU WORRY. I mean, it skipped me, so far, but the game’s not over quite yet – there’s still time for me to be bundled off to the cuckoo’s nest. I hope if it happens when she’s an adult, she puts me somewhere nice.  What’s not in the bag? I recently bought Martha Stewart’s handbook of how to do all the things around one’s house.  I bought this book because I know how to do none of the things. This is not because we had any kind of housekeeper (we did not), but because things were more or less left undone or done wrong – not in a hoarders-level kind of wrong, mind you, but with the exception of my father showing me how to sew on a button, the level of Important Life Skills being passed along was minimal at best. As a result I admire – worship, maybe – competence and skill in others; nothing pleases me more than watching somebody who knows what the hell they’re doing do a thing, whether it’s fixing a leak, hanging wallpaper or just playing pool. I crave this competence; I lack it sorely. But then, I’m only about a quarter of the way through the Martha book.  My daughter is, as I write this, only 3 – certainly nowhere near formed, but aspects of who she might be as a person are shining through. She loves arranging things in tidy rows, a neatnik (or possibly OCD) thing she definitely didn’t get from me – but I see the pride shining in her neatnik (OCD?) father’s eyes. We brought her recently to a book event I did at a bookstore that has a bar in it, and she was gloriously in her element. Hugging strangers at a bar? That’s my kid, all right.  I’m trying to up my game in hopes of having some skills – besides dumb arts-related stuff – to pass on to my daughter. We’re unlikely to become Doomsday Prepper-level DIYers, but I can make sure she has as full a complement of Basic Grown-up Life Skills as possible.  We have a whole big beautiful city and a whole big beautiful world to give her, and my hope for her is that she will navigate it with grace and aplomb – especially the ugly parts. 
I can’t do much about the past, but I can raise her to be a kind and conscious person, aware of her privilege in the world and her duty to refit society in a way that levels the playing field. I can and will teach her to sew on a button. And lucky for her, she’s half made-up of her daddy. I mean, he’s Canadian, after all. Emily Flake is the author of "Mama Tried: Dispatches From the Seamy Underbelly of Modern Parenting."

Published on November 14, 2015 15:30

“Vagina voter”: Witnessing the sexism hurled at Hillary in ’08 — and the assumptions made about her supporters — changed my life

Hillary Clinton and her myriad personal and political experiences have made me a braver person than I used to be. Looking back on the 2008 primary campaign for the Democratic presidential nomination, I can connect the events that ultimately changed how I see myself as a political person. As a volunteer on Hillary’s campaign, for eight months, I made numerous phone calls to super-delegates and members of the National Organization for Women. Until then, I’d never followed Hillary’s life or career too closely. I only became a volunteer after I saw the sexism that was thrown her way by fellow Democrats as she campaigned; that sexism prompted me to learn more about her as a person and as a candidate. I didn’t realize at the time how strongly sexism would ultimately shape Hillary’s campaign and its outcome, as well as my view of the political world. At the time, I was working on a documentary film about the women’s liberation movement and had a full-time job in the film industry as a visual effects editor. My daughter was five years old; the time demands of motherhood were still new for me. Nevertheless, I managed to schedule my campaign volunteer work during lunch breaks and in the few hours I had after I got home. It was the first time I ever volunteered for a presidential political campaign. Was it hard? Yes. But it was transformational. It was explained to me once in the early days of my visual effects work on feature films that although the work seemed difficult and abstract, one day I would understand its technical complexity in a simple way. The metaphor used was the light bulb being turned on in a darkened room. One day the switch would flip to “on” and I would fully understand. My growing awareness of how sexism impacted Hillary’s presidential campaign wasn’t achieved as easily as a switch turning on; it was a slow rotation, as with a dimmer switch. With each incremental turn, a cultural undercurrent that involved women and presidential politics was illuminated. It changed me profoundly. The first turn of that political dimmer switch happened in my kitchen in 2007. I was watching Hillary’s online announcement to the nation that she was running for president. In the video, sitting on a couch and looking into the camera, she said, “I’m beginning a conversation with you, with America.” I had never heard of anyone announcing a presidential run in such an understated way, and it felt awkward. There was no man on a stage with his wife dutifully standing next to him to project an image of family. There was no mention of God. It was just her, alone, and she wanted to talk about our country. She wanted to “chat.” The image of her looking back at me was unique. That metaphorical dimmer switch turned up one millimeter. The next event was in January of 2008. It was the evening of the first primary election in the much-anticipated Iowa caucus. (Iowa, at this point, was one of only four states that had never elected a woman to any national office.) I was watching Chris Matthews on the MSNBC news show Hardball, as I always did. Hillary had come in third in the caucus, and it was a shocker, as she had been expected to win. Matthews was full of bravado as he questioned whether Hillary should stay in the race. That hit me in my gut. Why would a pundit suggest she should quit at the start of the primaries? After all, both Bill Clinton and George W. Bush lost Iowa’s coveted first place win when they ran for the presidency. I didn’t remember people calling for them to quit the race. Another millimeter. 
Then on the heels of the Iowa loss was her New Hampshire win. That night, I got my daughter out of bed, and we watched Hillary’s victory speech together. I said to her, “She is going to be our next president.” That was odd. I had never said those words before. I could feel that dimmer switch turning up another millimeter. But something happened the next day on Hardball that would continue to happen on many shows after every state Hillary won. Chris Matthews looked glum on the news panel of reporters and commentators who were discussing Hillary’s New Hampshire win, when he remarked, “the reason she may be a front-runner is her husband messed around. That’s how she got to be senator from New York.” He said it confidently, as if he were declaring a fact, somehow sure that a woman could never win a presidential primary or a U.S. Senate race on her own merit without voter sympathy about a cheating husband. This dislike and, often times, hatred for her by liberal progressives was something for which I was naively unprepared. I expected the venom to be delivered by Republicans and conservative pundits, but not from her supposed allies. The sexualization of women who dare enter the male halls of authority is a common tactic to suppress female ambitions for the White House. And men aren’t the only ones who resort to it. When the liberal Air America radio show host Randi Rhodes called Hillary a “big fucking whore” and another Air America host, Stephanie Miller, continually referred to Hillary as “Mrs. Clinton” instead of Senator Clinton, these subtle and not-so- subtle attacks succeeded in doing two things: They relegated Hillary to a mere sexual being, and they erased her substantive political experience, both as United States senator and as First Lady. Plenty of examples of that abound. Conservative MSNBC host Tucker Carlson said in July of 2007, “She scares me. I cross my legs every time she talks.” When discussing Hillary’s performance at one of the debates, MSNBC commentator Mike Barnicle said Hillary’s attitude made her “... [look] like everyone’s first wife standing outside a probate court.” This created a bonding moment for the all male panel, as they laughed at that image comparing a serious presidential candidate to that of an annoying spouse. Another millimeter. The Hillary hate was also marketed. When the Hillary Clinton nutcracker came along in 2008, it was advertised with the feature of “serrated stainless steel thighs that, well, crack nuts.” That, coupled with Tucker’s revelation, should have clued me in to the pairing of men and Oval Office politics as a clubhouse that had a “No Girls Allowed” sign hanging on the door. At the time, I was a daily listener to National Public Radio and a reader of progressive blogs, but their common use of sports analogies awakened my senses to a different way of interpreting political coverage. NPR correspondents would open news shows with biased lines about Obama inching closer to Hillary’s delegate count with phrases such as, “He’s within striking distance.” Daily Kos bloggers wrote articles about the state primary dates with themes that reflected a boxing game “... the DNC put a stop to these contests, thwarting her ability to land a knockout punch here.” Why care about sports analogies? Because they are common in our traditionally male presidential campaign history. Reporters use sports symbolism to cover a race more easily; policy issues are complex, and contests are simple. 
When women enter races, they are expected to be one of the guys and participate in the language of sports, but that is a man’s game. Men have been building campaign traditions since the founding of this country. By boiling down the primary race into a sports contest, of sorts, it repudiated Hillary’s solid experience—a major experience difference between her and Obama and a suggestion that to win, she needed to “be one of the boys.” I could see that a powerful male context was flowing through American presidential politics. The dimmer switch about how male-oriented our political system is was turning up full force but wasn’t yet on all the way. However, the building blocks of our nation’s politics were in greater view for me. When my progressive friends offered that their line in the sand with Hillary was her Senate vote for the Iraq War Resolution, I noted that there had been no similar line in the sand for John Kerry in 2004 or for Senator Joe Biden when he was nominated as Obama’s vice presidential running mate in 2008. Both of these men voted for the Iraq War. This revelation was usually met with silence. The dimmer switch was a millimeter away from full. I also came to realize that a presidential nominee has to be likable and, alternately, an aggressor. This is easier for men to portray than for women because of historical archetypes. For the first time we saw a First Lady—the most traditional of political mother images—run for president, and people had to take the Norman Rockwell ideal of a fatherly leader of our nation who sometimes reluctantly declares war and replace it with a motherly image of a woman. In Hillary’s case, she was both a mother and senator who voted for the Iraq War Resolution. This is a combination of two archetypes, caregiver and ruler. For many, this created chaos. With all the gendered criticism of her, as well as a reluctance to acknowledge her experience and qualifications, I stopped listening to Air America, NPR, and Chris Matthews. I un-bookmarked the Daily Kos. Within a few months I even discontinued my cable service. I had to re-think everything I believed about partisan politics. I found the hatred for Hillary interesting because I thought that progressives would have been proud of Clinton’s experience as First Lady, notably her speech to the Fourth World Conference on Women in 1995 in Beijing. At that now-famous event, Hillary made a high-profile speech about global women’s rights that included the now oft-quoted line, “Human rights are women’s rights and women’s rights are human rights.” She spoke passionately about the fact that even though there are many people who would try to silence the words of women on issues concerning the human rights of women and girls, that freedom of speech on these issues was extremely important. Prior to her trip, the White House administration was nervous about China’s reaction to her speech, especially as she singled out that country’s silencing of women. She ignored their fear and proceeded with her plan. Her speech sent positive shock waves across the globe and was met with tremendous applause by both liberals and conservatives. Looking back now we can see how forward-thinking her speech was on policy issues for women. It was the perfect blend of two images: leader and mother. A few people angrily asked me why Clinton didn’t quit the race for president, as they thought she was standing in the way of Obama. I think what they really wanted to know was why I wasn’t quitting Hillary. 
Now the dimmer switch was turned up to full because I had to answer this for myself. And, this is where my shift in understanding took place. It was June of 2008, and it was the end of the primary race. My old hallmarks of progressivism that formed part of my identity had dissolved, and I saw partisan politics more objectively. I had developed a new way of looking at the political world, especially with regard to women. All the writers, anchors, and politicians just looked like a deck of cards to me. In my mind I took the deck, placed it between my thumb and fore- finger, and jettisoned the cards into the air. I didn’t look to see where the cards fell because I had already walked through the political looking glass. I was in the wilderness. I felt alone, and it was just a little bit cold. But in reality, I was with eighteen million voters who stayed with her, those cracks in the male ceiling of the Oval Office. We could all see differently now after having experienced that campaign. Hillary’s presence and words were a powerful message to girls and women, and that is why she didn’t quit. And as a mother of a daughter, that is one of the reasons I didn’t quit her. To be sure, Hillary was fighting to win, but she also knew that we needed the memory and the images to move forward for those who came after her. Like her Beijing speech, she was ahead of the curve, but this was a rougher road. The accusations that were hurled at Hillary were hurled at all of us. We were all called “bitter clingers,” “vagina voters,” “working class,” “old,” and “bitches” right along with her and suffered the same dismissal as she did. And some of the people who threw out those slurs were feminists. Hillary’s 2008 campaign is now a snapshot that is a part of our collective cultural memory from past events that we all share. These memories help us form our identities as individuals and as citizens. Boys and men have the totality of presidential cultural memory reflected to them in the United States: Franklin D. Roosevelt holding up his hat, Dwight Eisenhower with arms held aloft, and John F. Kennedy with Marilyn Monroe. The history of male presidents is the gendered bedrock of power upon which we form our national identity, and women, without similar memories, sense their lack of power. I grieved when Hillary lost the nomination. The night after she won in South Dakota, one of the last primary states in a campaign already lost, I dreamt about her. In the dream, Hillary was in the White House. I was with many women in a room waiting to meet with her. When it was my turn, Hillary stood in front of me. I held out my arms as if to receive something. She placed several Middle Eastern shawls and fabrics into my empty arms. I knew women had made them. I took them. Several years later after the election, I finished my film about the women’s liberation movement and started to speak about how important it is to remember that movement and include it in our cultural memory. If we had had a cultural memory about female leaders in 2008, Hillary may not have been seen as an interloper in the male Oval Office. As soon as I released my film in 2013, I received an invitation to screen it in Islamabad, Pakistan, as a guest of the International Islamic University. Was I afraid to go? Yes. But after having lived through the 2008 presidential campaign, complete with its gendered rhetoric and undercurrent of dismissiveness of women, I knew how important it was for me to go. Hillary gave me the strength, and I traveled alone. 
I screened my film and spoke to an amazing group of Pakistani women who were in the midst of shaping feminism in their own country and wanted to learn more about American feminism. Later, with several of these Pakistani women, I went shopping in one of their open markets. I bought some beautiful fabrics, and as I took them in my arms, I remembered the dream and remembered that Hillary also had been to Pakistan. That dream wasn’t about me personally. Nor was the 2008 campaign just about the loss of my preferred candidate. There was a bigger picture developing here, and it was an image of Americans connecting with women in faraway places from the symbolic power of a woman in the Oval Office in a way that can’t happen if we elect another man to the White House. I know Hillary is a big part of this picture. I can see that clearly now because the lights are turned up brightly. Excerpted from "Love Her, Love Her Not: The Hillary Paradox," edited by Joanne Cronrath Bamberger. Copyright © 2015. Reprinted by permission of She Writes Press.

Published on November 14, 2015 14:30

“This is the first direct hit on music”: Bono reacts to Paris terror attack on Eagles of Death Metal concert

Saturday and Sunday evening U2 concerts in Paris have been canceled in the wake of Friday's terror attacks, in which 129 people were killed and more than 350 injured by Islamic State attackers across several sites in the city. The band was scheduled to perform at Bercy Arena this weekend, about three miles from the Bataclan, the venue where 118 people were killed by the end of a deadly hostage siege that took place during an Eagles of Death Metal concert. Rolling Stone reports that U2 frontman Bono spoke to Irish radio DJ Dave Fanning by phone today and said that the band had been rehearsing when news of the attacks broke, and that while the decision to cancel the shows was not the band's, they support it. "It's up to the French authorities and the city to decide when we can go back." Bono said the band's first thoughts were with the victims, especially those at the Bataclan for the concert. "If you think about it, the majority of victims last night are music fans. This is the first direct hit on music that we've had in this so-called War on Terror or whatever it's called," Bono told Fanning. "It's very upsetting. These are our people. This could be me at a show. You at a show, in that venue." Read more at Rolling Stone.

Published on November 14, 2015 13:39

Donald Trump’s callous response to Paris terror attack: Victims should have been armed

Leave it to blowhard presidential contender Donald Trump to deliver one of the most vile responses to yesterday's horrific terror attacks in Paris, which killed 129 people and injured hundreds more. Speaking in Beaumont, Texas, on Saturday, the GOP frontrunner blamed gun control for the scale of the casualties, echoing comments he made following the Charlie Hebdo attack in January. CNN reports:

Donald Trump said Saturday that the terrorist attacks in Paris "would've been a much different situation" if the city had looser gun laws.

"When you look at Paris -- you know the toughest gun laws in the world, Paris -- nobody had guns but the bad guys. Nobody had guns. Nobody," Trump said at a rally here. "They were just shooting them one by one and then they (security forces) broke in and had a big shootout and ultimately killed the terrorists."

"You can say what you want, but if they had guns, if our people had guns, if they were allowed to carry --" Trump said, pausing as the crowd erupted into raucous applause, "-- it would've been a much, much different situation."

Watch Trump's remarks below, via CNN:

Published on November 14, 2015 13:34

Blame it on the baby boomers: Yes, pretty much everything

Manners are the happy ways of doing things . . . ’tis the very beginning of civility,—to make us, I mean, endurable to each other. —Ralph Waldo Emerson, The Conduct of Life

As civilization began, it became apparent that the human race was imperfectly suited for it. Living together took some work. All these years later, there are innumerable examples to show that we’re still not terribly good at being civilized, from the student commuter who dashes in front of an elderly woman to claim a seat on the subway to how societies treat their poor and unemployed. The unchanging reality is that people tend to think of themselves first, yet the task of coexistence is made easier if they don’t. And so efforts arose many years ago to teach folks how to get along. The concept of grace—refined ease of movement and manner, as a way of pleasing, assisting, and honoring others—wove through this endeavor. Indeed, the term getting along itself, in the sense of being on harmonious terms, implies graceful behavior. It carries a hint of a dance, a peaceable duet, or the falling-in-step impulse that horses have with one another, which helps make them manageable. Grace and manners, the general principles of social behavior, have historically been entwined; each adds luster to the other. To trace the development of grace through time, where grace isn’t specifically mentioned, I’ve looked for an emphasis on the art of getting along. By that I mean manners that are aimed at harmonious interactions and creating a climate of warmth and appreciation, as opposed to formalities about fish forks and introductions, which are in the more detail-oriented domain of etiquette. Some of the world’s most influential books have been instruction manuals on the art of getting along, or what we’ve come to know as the social graces. These include the oldest writings of the ancient era, the runaway best sellers of the Renaissance, and the must-reads of American colonists, revolutionaries, and early twentieth-century strivers with an eye for elegance and civilized living. Yet instruction in grace mysteriously dropped out of our lives a few decades ago. Well, “mysteriously” isn’t quite right. There is a pendulum swing in the history of manners, when one era comes up with rules and they grow more and more strict until another generation says, oh, just forget about it—this is ridiculous. And grace gets thrown out for being an act, insincere, phony. “We have the residue now, with well-meaning parents who say to their children, ‘Just be yourself,’” said Judith Martin, when I asked her why the social graces were in decline. Martin is the author of the internationally syndicated Miss Manners newspaper column and many books on etiquette. “What does that mean? Who would they be if they weren’t themselves? Parents don’t teach their children how to act out being glad for a present, or how to seem pleased to see someone they may not want to see. “Etiquette has long struggled with the opposing ideas of grace and naturalness, of appearing natural and being natural, which are two entirely different things,” she continued. This inherent paradox, of feeling one thing and saying another, leaves etiquette open to the charge of insincerity. “There is a disconnect in what you feel and what you ought to project, which is the opposite of sincerity. For example, the hostess who says, ‘Oh, don’t worry about it,’ when you’ve just broken her favorite lamp. Of course she cares about it, but the primary goal is putting the other person at ease. 
“People say etiquette is artificial. But what they really object to is the obviously artificial,” Martin said. “Yes, it is artificial and it’s often better than the raw expression of natural desires. Look at dance: Is human movement better when it’s totally untutored or is it better when you put thought and work into it?” Social grace, just like physical grace, requires work. That was the point of the conduct books from centuries past: to make it plain that correct behavior required effort and discipline. Being with people is an art like any other art, or a practice, if you will, just like cooking or riding a bicycle. The more you realize what smooths things over, what pleases people, and the more you want to be graceful and practice being graceful, the better and more convincing you will become. Grace will cease to be something you “act out.” But as with any learned activity, there are different degrees of polish here. There is the hostess who reacts to her broken lamp by saying, “Oh, don’t worry about it” through clenched teeth, making you feel terrible. And then there is one who reacts with grace, putting on a better act, perhaps. Maybe she’s a Meryl Streep, imperceptibly masking her true feelings with an Oscar-worthy portrayal of nonchalance. Or maybe she really hated that lamp and is glad it’s headed for the trash. Or maybe she is really and truly a happy-go-lucky angel on earth whose every impulse is upright and pure. It makes no difference to the embarrassed guest who just wants to be forgiven. He’s grateful for grace any way it comes. Grace lies in the manner in which the rules are followed, Martin says. “Do you follow etiquette rules to the letter, or do you make it seem as if they arise naturally from good feelings and it’s easy for you to say, ‘Oh, never mind, don’t worry about it’? It’s not easy for a dancer to leap into the air either, and we don’t see the bloody toes and the sweat from a distance. And in the same way, if she’s being graceful, we don’t see the hostess thinking, ‘Oh my gosh, this is going to cost me a fortune to fix.’” Let’s face it, if we all exposed our true feelings all the time, the world would be unbearable. Grace, as Martin put it, “is that covering through which we make the world pleasant.” And yet we’re in one of those extremes of the pendulum swing where honesty is overvalued and the brilliant act, the self-discipline, the training that produces grace has faded away. An accumulation of blows has led to its downfall, but they stem from a reaction against the overcomplication of everyday life that picked up strength in the 1950s and ’60s. The modern means of self-improvement turned from building up one’s character (a rather slow, internal, and never-ending process) to the far easier focus on things we can buy. Buying our way into the good life. With the surge in department stores and shopping malls, with ever-present advertising, with our voyeurism via television into the lives and possessions of others, shopping became the modern means of self-betterment. This was a 180-degree turn from the previous idea. America’s Founding Fathers, for example, were obsessed with inner self-improvement. Striving for “moral perfection,” a twenty-year-old Benjamin Franklin worked methodically to acquire a list of virtues, from silence and sincerity to tranquility and humility. He assessed himself each evening and tracked his progress on charts. 
John Adams, in a typical diary entry, resolved to become more conscientious and socially pleasant: “I find my self very much inclin’d to an unreasonable absence of mind, and to a morose, unsociable disposition. Let it therefore be my constant endeavor to reform these great faults.” But two hundred years on, such vestiges of a Puritan past had been swept aside by a greater interest in cars, appliances, and shiny hair. The spread of the suburbs after World War II, with their backyard weenie roasts, patios, and cheese dips, was also a way of escaping an overcomplicated, formal life. It encouraged a sportier, more casual lifestyle for a middle class newly freed from decades of deprivation. Add to that the great wave of Baby Boomers, born into prosperity and surrounded by products, a Me Generation showered with attention, not inclined to modesty, and little interested in the artifice of social graces and their required self-control. In them, the age-old tendency of the young to rebel against their elders attained an unprecedented critical mass. And with that came even more informality, more “be yourself” free rein. The courtesies of their parents’ era were a drag. Child-rearing practices were also changing. In the new, less formal times, manners instruction for children simply went out of style, and the subtleties of grace were deemed passé, or worse: elitist. Anything implying snobbery was swept aside by a growing middle class, the youth counterculture, and a surging progressive tide. Change was sorely needed, as the civil rights, antiwar, and women’s movements demonstrated. But it wasn’t only social institutions that were rocked. So was the cradle. A nation crawling with babies was hungry for advice, the simpler the better. The easygoing child-centered approach advocated by Benjamin Spock in his enormously influential, best-selling Common Sense Book of Baby and Child Care, which first came out in 1946, gave parents permission to forgo the feeding schedules and strict discipline of former times and simply enjoy their kids. Hugs were in, spankings were out. But if you’re tracking the demise of grace, you can find a few nicks and cuts in his pages. Since people like children with “sensibly good” manners, Spock writes, “parents owe it to their children to make them likable.” But he also put forth the view that “good manners come naturally” if a child feels good about himself. Yet self-esteem is not the answer to everything. In fact, some researchers blame the self-esteem movement of the 1980s for the rise in narcissism among college students today as compared with those of thirty years ago. Narcissists have a grandiose view of themselves but care little about others; the argument is that parents who fill their children’s ears with how special they are (as opposed to, say, how hard they work or how kind they are) create adults with little patience for those who don’t recognize their superiority. We’ve all encountered plenty of people, young and old, with high opinions of themselves and precious little grace. It is one thing to empower a child with self-worth and confidence and to guide her in becoming a good person. But children who are not taught to behave with consideration for others and to respect other people’s feelings will not develop empathy and compassion. While likable is a perfectly fine quality, it’s a low bar to set for parents. It refers only to how others view the child, and in a bland way at that. Being likable means you’re receiving something—someone’s approval. 
Compare it with agreeable, which is about giving. It’s other-directed, referring to getting along, being warm, supportive, and helpful, while diminishing the focus on yourself. “Be pretty if you can, be witty if you must, but be agreeable if it kills you!” declared the 1930s Home Institute booklet Charm. Interestingly, Spock’s view of the primacy of likability flips the long-standing Anglo-American notion, prevalent among the Puritans and up through the nineteenth and early decades of the twentieth centuries, that one builds character through service to others, whether God or your fellow man. In this older view, the less you fixate on yourself the better, apart from controlling unruly impulses. Putting priority on others is the right—and graceful—thing to do.

A Culture of Coarseness

What has most threatened grace is what I can only describe as a culture of coarseness. We’re insensitive to our effect on other people. We don’t think about how others feel when we shoot down their ideas in a meeting, when we court laughs at their expense, when we criticize them in front of colleagues. Or when we make it known how little they matter once someone more interesting comes along. I was having lunch with a colleague once when she saw a man she knew passing by on the sidewalk. Waving vigorously through the window to get his attention, she urged him to join us. But the moment he got to our table, before she’d had a chance to introduce us (I’m choosing to believe that was her plan), her cell phone rang. She’d placed it on the table in case this should happen, so of course she took the call, having long forgotten the conversation she’d interrupted by inviting in a guy off the street, and leaving me and a stranger in awkward silence while she also forgot about us. Our devices are draining us of grace. “We need to e-mail!” a friend I haven’t seen in a while calls over her shoulder, because there’s no time to talk. E-mail and texting are convenient, but they also crumple us up physically and make us unaware socially, closed off from those around us. Riding the subway can be like nursery school, what with the manspreaders who don’t want to share the bench they’re sprawling on with wide-open knees and a slump, and the woman who takes up two seats with all her bags and doesn’t much care if you have to stand. Or maybe she doesn’t notice you because she’s very busy texting, like the toy store owner sitting behind the counter who couldn’t be moved to help me find a birthday present for my nephew. Silly me, I thought that she was entering important data on her tablet; it was my savvier preteen daughter who detected instantly the gestures of a stealth texter. With the hours spent hunched over keyboards, no wonder we’re awkward when we get up. Hips tighten, necks droop, our backs round. I watch people walking and standing. Most of us sag in the front, with shoulders pitched forward and chests caving, probably from too much sitting and driving and not enough walking, or walking incorrectly. Our footfalls are heavy; we gaze at the ground or at what’s in our hands. We’ve lost the ability to carry ourselves with upright buoyancy and ease. Grace is not only the furthest thing from our minds, it’s beyond the reach of our bodies. Instead, we’re drawn to disgrace. No teaser is bigger Internet click bait than the one that promises bad behavior: “Mogul Throws Fit Over Spilt Champagne”; Lindsay Lohan gets kicked out of a hotel; Justin Bieber moons his fans on Instagram. Reality TV thrives on disgrace. Fans watch it for the awkward moments, for people to be told they’re fired, they suck, they’re the weakest link. The appeal of American Idol used to be Simon Cowell bullying a contestant who had volunteered himself for public shaming. Would we ever be so stupid? Of course not. Survivor competitors drag one another through the dirt, physically and verbally; the mothers on Dance Moms put the toddler antics of subway riders to shame. Viewers can puff themselves up in comparison, engage in some vicarious ribbing without responsibility. The glee of disgrace, of course, exists beyond TV. In May 2014, Evan Spiegel, CEO and founder of Snapchat, the ephemeral photo-sharing app, issued an apology after the release of e-mails he’d written to his frat brothers while attending Stanford. 
Those missives had cheerfully chronicled getting sorority girls (“sororisluts”) drunk and musing about whether he’d peed on his date. Typical frat boy fun, some said. Are we too easily outraged? Or are we numb to what is truly outrageous (torture, for starters), because we’re overoutraged? Internet outrage has become a fact of life, a ritual of righteous indignation practiced after the inappropriate tweet. Outrage is such a satisfying cycle: First there is a celebrity faux pas; then the offended take to Twitter, the defenders counterattack, the bloggers repost, a Facebook fight erupts, and after all the time invested in following this trail—trust me, even your respected local newspaper is following this trail—why, there’s a new dumb thing to get mad about. We’re in an environment of grabbing and taking: taking advantage, taking control, taking for oneself. Grace, by contrast, is associated with giving. The three Charites of Greek mythology, you’ll recall, are the givers of charm, beauty, and ease. In so many fields of activity—sports, entertainment, business—-success isn’t just winning, it’s crushing. Total domination is the desired image to project. Power is valued over grace; taking is celebrated. Giving is considered a lesser quality, even a weakness. These are the days of category-killing control and sensory bombardments by any means necessary. It’s as if society at large has been captivated by the steroid aesthetic of today’s sports. Asked by business analysts if he was going to retire at sixty-five, Boeing CEO Jim McNerney said no, despite it being company custom, and by way of explanation—offered to people he wanted to impress, no less—he chose to depict himself as a monster. “The heart will still be beating, the employees will still be cowering,” he said. “I’ll be working hard. There’s no end in sight.” This prompted another memorable public apology. Yet McNerney’s original phrasing was telling, right up to his last words. There’s no end in sight. Perpetual power: Why give it up if you’re on a roll? Why give up anything if you’re in a position to take? If those down the rungs have anything to relinquish—if they can be made to cower, to give back benefits and raises and job security—then that must be done, because it can be done. Bigger may be better, but gigantic is best, whether it’s profits, or the wedding of Kanye West and Kim Kardashian, or the tech effects of a Hollywood blockbuster. (Just look at how the intimate, human-scale charm of The Wizard of Oz gave way to the massive 3-D spectacle of Oz the Great and Powerful, with its CGI landscape, booming soundtrack, explosions, and strained seriousness.) In all of this, being compassionate and humble, generous and considerate, elegantly restrained rather than a show-off, at ease instead of in-your-face—in short, being graceful—seems rather behind the times. “Go out of your way to do something nice for somebody—it will make your heart warm,” urged a 1935 guide, Personality Preferred! How to Grow Up Gracefully. This book, like others of its era, took a holistic view of grace as a way of being that one acquired through habits of the body, mind, and spirit. “Grace isn’t just a set of behaviors you dust off and display on special occasions,” author Elizabeth Woodward explained to her young readers. “It’s how you carry yourself every day.” Woodward, an editor at Ladies’ Home Journal, wrote her book after getting hundreds of thousands of letters from young women seeking advice. 
Manners are the happy ways of doing things . . . ’tis the very beginning of civility,—to make us, I mean, endurable to each other. —Ralph Waldo Emerson, The Conduct of Life

As civilization began, it became apparent that the human race was imperfectly suited for it. Living together took some work. All these years later, there are innumerable examples to show that we’re still not terribly good at being civilized, from the student commuter who dashes in front of an elderly woman to claim a seat on the subway to how societies treat their poor and unemployed. The unchanging reality is that people tend to think of themselves first, yet the task of coexistence is made easier if they don’t. And so efforts arose many years ago to teach folks how to get along. The concept of grace—refined ease of movement and manner, as a way of pleasing, assisting, and honoring others—wove through this endeavor. Indeed, the term getting along itself, in the sense of being on harmonious terms, implies graceful behavior. It carries a hint of a dance, a peaceable duet, or the falling-in-step impulse that horses have with one another, which helps make them manageable. Grace and manners, the general principles of social behavior, have historically been entwined; each adds luster to the other. To trace the development of grace through time, where grace isn’t specifically mentioned, I’ve looked for an emphasis on the art of getting along. By that I mean manners that are aimed at harmonious interactions and creating a climate of warmth and appreciation, as opposed to formalities about fish forks and introductions, which are in the more detail-oriented domain of etiquette. Some of the world’s most influential books have been instruction manuals on the art of getting along, or what we’ve come to know as the social graces. These include the oldest writings of the ancient era, the runaway best sellers of the Renaissance, and the must-reads of American colonists, revolutionaries, and early twentieth-century strivers with an eye for elegance and civilized living. Yet instruction in grace mysteriously dropped out of our lives a few decades ago. Well, “mysteriously” isn’t quite right. There is a pendulum swing in the history of manners, when one era comes up with rules and they grow more and more strict until another generation says, oh, just forget about it—this is ridiculous. And grace gets thrown out for being an act, insincere, phony. “We have the residue now, with well-meaning parents who say to their children, ‘Just be yourself,’” said Judith Martin, when I asked her why the social graces were in decline. 
Martin is the author of the internationally syndicated Miss Manners newspaper column and many books on etiquette. “What does that mean? Who would they be if they weren’t themselves? Parents don’t teach their children how to act out being glad for a present, or how to seem pleased to see someone they may not want to see. “Etiquette has long struggled with the opposing ideas of grace and naturalness, of appearing natural and being natural, which are two entirely different things,” she continued. This inherent paradox, of feeling one thing and saying another, leaves etiquette open to the charge of insincerity. “There is a disconnect in what you feel and what you ought to project, which is the opposite of sincerity. For example, the hostess who says, ‘Oh, don’t worry about it,’ when you’ve just broken her favorite lamp. Of course she cares about it, but the primary goal is putting the other person at ease. “People say etiquette is artificial. But what they really object to is the obviously artificial,” Martin said. “Yes, it is artificial and it’s often better than the raw expression of natural desires. Look at dance: Is human movement better when it’s totally untutored or is it better when you put thought and work into it?” Social grace, just like physical grace, requires work. That was the point of the conduct books from centuries past: to make it plain that correct behavior required effort and discipline. Being with people is an art like any other art, or a practice, if you will, just like cooking or riding a bicycle. The more you realize what smooths things over, what pleases people, and the more you want to be graceful and practice being graceful, the better and more convincing you will become. Grace will cease to be something you “act out.” But as with any learned activity, there are different degrees of polish here. There is the hostess who reacts to her broken lamp by saying, “Oh, don’t worry about it” through clenched teeth, making you feel terrible. And then there is one who reacts with grace, putting on a better act, perhaps. Maybe she’s a Meryl Streep, imperceptibly masking her true feelings with an Oscar-worthy portrayal of nonchalance. Or maybe she really hated that lamp and is glad it’s headed for the trash. Or maybe she is really and truly a happy-go-lucky angel on earth whose every impulse is upright and pure. It makes no difference to the embarrassed guest who just wants to be forgiven. He’s grateful for grace any way it comes. Grace lies in the manner in which the rules are followed, Martin says. “Do you follow etiquette rules to the letter, or do you make it seem as if they arise naturally from good feelings and it’s easy for you to say, ‘Oh, never mind, don’t worry about it’? It’s not easy for a dancer to leap into the air either, and we don’t see the bloody toes and the sweat from a distance. And in the same way, if she’s being graceful, we don’t see the hostess thinking, ‘Oh my gosh, this is going to cost me a fortune to fix.’” Let’s face it, if we all exposed our true feelings all the time, the world would be unbearable. Grace, as Martin put it, “is that covering through which we make the world pleasant.” And yet we’re in one of those extremes of the pendulum swing where honesty is overvalued and the brilliant act, the self-discipline, the training that produces grace has faded away. An accumulation of blows has led to its downfall, but they stem from a reaction against the overcomplication of everyday life that picked up strength in the 1950s and ’60s. 
The modern means of self-improvement turned from building up one’s character (a rather slow, internal, and never-ending process) to the far easier focus on things we can buy. Buying our way into the good life. With the surge in department stores and shopping malls, with ever-present advertising, with our voyeurism via television into the lives and possessions of others, shopping became the modern means of self-betterment. This was a 180-degree turn from the previous idea. America’s Founding Fathers, for example, were obsessed with inner self-improvement. Striving for “moral perfection,” a twenty-year-old Benjamin Franklin worked methodically to acquire a list of virtues, from silence and sincerity to tranquility and humility. He assessed himself each evening and tracked his progress on charts. John Adams, in a typical diary entry, resolved to become more conscientious and socially pleasant: “I find my self very much inclin’d to an unreasonable absence of mind, and to a morose, unsociable disposition. Let it therefore be my constant endeavor to reform these great faults.” But two hundred years on, such vestiges of a Puritan past had been swept aside by a greater interest in cars, appliances, and shiny hair. The spread of the suburbs after World War II, with their backyard weenie roasts, patios, and cheese dips, was also a way of escaping an overcomplicated, formal life. It encouraged a sportier, more casual lifestyle for a middle class newly freed from decades of deprivation. Add to that the great wave of Baby Boomers, born into prosperity and surrounded by products, a Me Generation showered with attention, not inclined to modesty, and little interested in the artifice of social graces and their required self-control. In them, the age-old tendency of the young to rebel against their elders attained an unprecedented critical mass. And with that came even more informality, more “be yourself” free rein. The courtesies of their parents’ era were a drag. Child-rearing practices were also changing. In the new, less formal times, manners instruction for children simply went out of style, and the subtleties of grace were deemed passé, or worse: elitist. Anything implying snobbery was swept aside by a growing middle class, the youth counterculture, and a surging progressive tide. Change was sorely needed, as the civil rights, antiwar, and women’s movements demonstrated. But it wasn’t only social institutions that were rocked. So was the cradle. A nation crawling with babies was hungry for advice, the simpler the better. The easygoing child-centered approach advocated by Benjamin Spock in his enormously influential, best-selling Common Sense Book of Baby and Child Care, which first came out in 1946, gave parents permission to forgo the feeding schedules and strict discipline of former times and simply enjoy their kids. Hugs were in, spankings were out. But if you’re tracking the demise of grace, you can find a few nicks and cuts in his pages. Since people like children with “sensibly good” manners, Spock writes, “parents owe it to their children to make them likable.” But he also put forth the view that “good manners come naturally” if a child feels good about himself. Yet self-esteem is not the answer to everything. In fact, some researchers blame the self-esteem movement of the 1980s for the rise in narcissism among college students today as compared with those of thirty years ago. 
Narcissists have a grandiose view of themselves but care little about others; the argument is that parents who fill their children’s ears with how special they are (as opposed to, say, how hard they work or how kind they are) create adults with little patience for those who don’t recognize their superiority. We’ve all encountered plenty of people, young and old, with high opinions of themselves and precious little grace. It is one thing to empower a child with self-worth and confidence and to guide her in becoming a good person. But children who are not taught to behave with consideration for others and to respect other people’s feelings will not develop empathy and compassion. While likable is a perfectly fine quality, it’s a low bar to set for parents. It refers only to how others view the child, and in a bland way at that. Being likable means you’re receiving something—someone’s approval. Compare it with agreeable, which is about giving. It’s other-directed, referring to getting along, being warm, supportive, and helpful, while diminishing the focus on yourself. “Be pretty if you can, be witty if you must, but be agreeable if it kills you!” declared the 1930s Home Institute booklet Charm. Interestingly, Spock’s view of the primacy of likability flips the long-standing Anglo-American notion, prevalent among the Puritans and up through the nineteenth and early decades of the twentieth centuries, that one builds character through service to others, whether God or your fellow man. In this older view, the less you fixate on yourself the better, apart from controlling unruly impulses. Putting priority on others is the right—and graceful—thing to do.

A Culture of Coarseness

What has most threatened grace is what I can only describe as a culture of coarseness. We’re insensitive to our effect on other people. We don’t think about how others feel when we shoot down their ideas in a meeting, when we court laughs at their expense, when we criticize them in front of colleagues. Or when we make it known how little they matter once someone more interesting comes along. I was having lunch with a colleague once when she saw a man she knew passing by on the sidewalk. Waving vigorously through the window to get his attention, she urged him to join us. But the moment he got to our table, before she’d had a chance to introduce us (I’m choosing to believe that was her plan), her cell phone rang. She’d placed it on the table in case this should happen, so of course she took the call, having long forgotten the conversation she’d interrupted by inviting in a guy off the street, and leaving me and a stranger in awkward silence while she also forgot about us. Our devices are draining us of grace. “We need to e-mail!” a friend I haven’t seen in a while calls over her shoulder, because there’s no time to talk. E-mail and texting are convenient, but they also crumple us up physically and make us unaware socially, closed off from those around us. Riding the subway can be like nursery school, what with the manspreaders who don’t want to share the bench they’re sprawling on with wide-open knees and a slump, and the woman who takes up two seats with all her bags and doesn’t much care if you have to stand. Or maybe she doesn’t notice you because she’s very busy texting, like the toy store owner sitting behind the counter who couldn’t be moved to help me find a birthday present for my nephew. Silly me, I thought that she was entering important data on her tablet; it was my savvier preteen daughter who detected instantly the gestures of a stealth texter. With the hours spent hunched over keyboards, no wonder we’re awkward when we get up. Hips tighten, necks droop, our backs round. I watch people walking and standing. Most of us sag in the front, with shoulders pitched forward and chests caving, probably from too much sitting and driving and not enough walking, or walking incorrectly. Our footfalls are heavy; we gaze at the ground or at what’s in our hands. We’ve lost the ability to carry ourselves with upright buoyancy and ease. Grace is not only the furthest thing from our minds, it’s beyond the reach of our bodies. Instead, we’re drawn to disgrace. No teaser is bigger Internet click bait than the one that promises bad behavior: “Mogul Throws Fit Over Spilt Champagne”; Lindsay Lohan gets kicked out of a hotel; Justin Bieber moons his fans on Instagram. Reality TV thrives on disgrace. Fans watch it for the awkward moments, for people to be told they’re fired, they suck, they’re the weakest link. The appeal of American Idol used to be Simon Cowell bullying a contestant who had volunteered himself for public shaming. Would we ever be so stupid? Of course not. Survivor competitors drag one another through the dirt, physically and verbally; the mothers on Dance Moms put the toddler antics of subway riders to shame. Viewers can puff themselves up in comparison, engage in some vicarious ribbing without responsibility. The glee of disgrace, of course, exists beyond TV. In May 2014, Evan Spiegel, CEO and founder of Snapchat, the ephemeral photo-sharing app, issued an apology after the release of e-mails he’d written to his frat brothers while attending Stanford. 
Those missives had cheerfully chronicled getting sorority girls (“sororisluts”) drunk and musing about whether he’d peed on his date. Typical frat boy fun, some said. Are we too easily outraged? Or are we numb to what is truly outrageous (torture, for starters), because we’re overoutraged? Internet outrage has become a fact of life, a ritual of righteous indignation practiced after the inappropriate tweet. Outrage is such a satisfying cycle: First there is a celebrity faux pas; then the offended take to Twitter, the defenders counterattack, the bloggers repost, a Facebook fight erupts, and after all the time invested in following this trail—trust me, even your respected local newspaper is following this trail—why, there’s a new dumb thing to get mad about. We’re in an environment of grabbing and taking: taking advantage, taking control, taking for oneself. Grace, by contrast, is associated with giving. The three Charites of Greek mythology, you’ll recall, are the givers of charm, beauty, and ease. In so many fields of activity—sports, entertainment, business—success isn’t just winning, it’s crushing. Total domination is the desired image to project. Power is valued over grace; taking is celebrated. Giving is considered a lesser quality, even a weakness. These are the days of category-killing control and sensory bombardments by any means necessary. It’s as if society at large has been captivated by the steroid aesthetic of today’s sports. Asked by business analysts if he was going to retire at sixty-five, Boeing CEO Jim McNerney said no, despite it being company custom, and by way of explanation—offered to people he wanted to impress, no less—he chose to depict himself as a monster. “The heart will still be beating, the employees will still be cowering,” he said. “I’ll be working hard. There’s no end in sight.” This prompted another memorable public apology. Yet McNerney’s original phrasing was telling, right up to his last words. There’s no end in sight. Perpetual power: Why give it up if you’re on a roll? Why give up anything if you’re in a position to take? If those down the rungs have anything to relinquish—if they can be made to cower, to give back benefits and raises and job security—then that must be done, because it can be done. Bigger may be better, but gigantic is best, whether it’s profits, or the wedding of Kanye West and Kim Kardashian, or the tech effects of a Hollywood blockbuster. (Just look at how the intimate, human-scale charm of The Wizard of Oz gave way to the massive 3-D spectacle of Oz the Great and Powerful, with its CGI landscape, booming soundtrack, explosions, and strained seriousness.) In all of this, being compassionate and humble, generous and considerate, elegantly restrained rather than a show-off, at ease instead of in-your-face—in short, being graceful—seems rather behind the times. “Go out of your way to do something nice for somebody—it will make your heart warm,” urged a 1935 guide, Personality Preferred! How to Grow Up Gracefully. This book, like others of its era, took a holistic view of grace as a way of being that one acquired through habits of the body, mind, and spirit. “Grace isn’t just a set of behaviors you dust off and display on special occasions,” author Elizabeth Woodward explained to her young readers. “It’s how you carry yourself every day.” Woodward, an editor at Ladies’ Home Journal, wrote her book after getting hundreds of thousands of letters from young women seeking advice. 
Before the upheavals in the mid-twentieth century, growing-up advice to young people, such as Woodward’s book, generally followed a course set in antiquity. Making one’s way in the world was seen as an art, something to be practiced and perfected. It was in some ways like a lifelong dance, with rules and steps and choreography, as well as the need for rehearsal. This art of living incorporated not only what people said and how they behaved at dinner or in the parlor, but how they moved in many ways, large and small. Control of the body through posture and proper body language has long been a part of “conduct books.” In How to Grow Up Gracefully and publications like it, for example, such control is essential to the graceful life.

Excerpted from "The Art of Grace: On Moving Well Through Life" by Sarah L. Kaufman. Copyright © 2015 by Sarah Kaufman. Reprinted by permission of W.W. Norton & Co.

Published on November 14, 2015 13:30