Helen H. Moore's Blog, page 911

December 27, 2015

Maybe Jon Stewart can shame them: The right loves guns more than schoolkids

Just one day after 14 people in San Bernardino, California, were murdered by terrorists armed with assault rifles, the United States Senate voted on a bill that would have allowed the attorney general to deny the sale or transfer of a firearm to anyone on the terrorist watch list. If you knew nothing about our nation’s politics on gun ownership, you would be forgiven for assuming that a bill like this would pass on an uncontested, unanimous vote. There is universal condemnation of terrorism, and it would seem utter common sense to try to prevent terrorists from acquiring weapons designed to kill scores of people within seconds. One poll of North Carolina Republicans put support in favor of the measure at 86 percent. A casual observer would have been shocked to learn that not only did the bill fail, it didn’t even garner close to the votes needed for passage. Only 45 senators voted for the measure, with one Democrat and one Republican voting opposite their party’s traditional stances on gun rights.

Even if you think the “deny potential terrorists assault rifles” bill was largely political theater, it’s important to remember that such theater usually ends with Congress going along with the script. Deceptively edited videos of Planned Parenthood workers nearly force a shutdown of the entire federal government, widespread fraud at VA hospitals creates a media firestorm and results in bipartisan reform, and 9/11 rescue workers get healthcare benefits only after Jon Stewart shames politicians into voting for it. Again, you may think that there are policy reasons against using the terrorist watch list to determine who can or cannot own a firearm, but far more flawed legislation has sailed through Congress on the strength of a national tragedy.

This is not the case with mass murder perpetrated by terrorists with assault rifles. Congress will mobilize to create agencies and departments to address the threat of terrorism, it will authorize unfettered mass surveillance of Americans’ emails and cellphone conversations, and it will send hundreds of thousands of Americans across the world to fight and potentially die to prevent any further attacks at home. Congress will even forgo obvious concerns of constitutionality, as was the case with the original Patriot Act. But when Congress is faced with the demand that the individual’s interest in owning a firearm be balanced against society’s interest in protecting itself from continual and routine mass murder, Congress will not bat an eye, even when first-graders are purposely targeted and murdered at their schools.

To blame all of Congress, though, would be misguided. While in the past rural Democrats tended to support gun rights measures, the Democratic Party as a whole has united in emphasizing greater regulation and oversight of gun sales and ownership. Those in Congress who would erect no further barriers to potentially dangerous individuals attempting to buy assault rifles are now near universally Republican.

The question thus becomes, why? The most obvious answer would seem to be the toxic mix of gun lobbies like the NRA and a well-organized base of gun rights supporters. With plenty of funding and logistical support, they exert a level of pressure on Republican members of Congress that few other lobbies can match. But lobbies and activists are not gods, and they are ultimately only as effective as the elected representatives who decide to play along.
Indeed, congressional Republicans know very well that for every poll showing that 80 percent or more of Americans approve of universal background checks, there’s another saying that the majority of Americans oppose stricter gun laws. As a whole, Americans have not made up their minds as to whether we need more or fewer guns in order to reduce gun violence, even if the rest of the developed world figured out the correct answer decades ago. As Adam Gopnik of the New Yorker put it in the aftermath of Sandy Hook:
On gun violence and how to end it, the facts are all in, the evidence is clear, the truth there for all who care to know it—indeed, a global consensus is in place, which, in disbelief and now in disgust, the planet waits for us to join. Those who fight against gun control, actively or passively, with a shrug of helplessness, are dooming more kids to horrible deaths and more parents to unspeakable grief just as surely as are those who fight against pediatric medicine or childhood vaccination. It’s really, and inarguably, just as simple as that.
More important, congressional Republicans know that there is a big difference between what voters tell pollsters and what voters end up doing about it. Even Republican voters who would like to see some kind of tightening of access to assault rifles aren’t likely to organize together to put political pressure on their representatives.

And yet, the curious case of allowing individuals on the terrorist watch list to purchase assault rifles should have produced a different result. I don’t have a hard time believing that 98 percent of congressional Democrats really and truly believe in banning potential terrorists from buying guns. I do have a hard time believing that 98 percent of congressional Republicans believe that a potential terrorist should be able to buy an assault rifle on demand, even when all the extenuating policy and constitutional concerns are taken into account.

This is where the paradigm breaks down. It’s no longer about the NRA or gun activists or Republican voters who might want to make it harder to buy an assault rifle but don’t end up mobilizing. It’s about individual congressional Republicans making an active policy choice to allow individuals deemed so dangerous they cannot be allowed on a plane (even after passing through an airport’s physical security checkpoint) to easily purchase assault rifles and semi-automatic handguns.

Again, there are flaws—perhaps even constitutional concerns—with using the terrorist watch list as the basis for such restrictions, but that is simply a question of due process. There is no doubt that Congress possesses the power to prevent individuals who pose a demonstrable threat of committing mass murder from acquiring an assault rifle. To assert otherwise is to believe in a right so broad that even our easily bypassed nationwide background check system should be struck down as unconstitutional. The right to own an assault rifle would be absolute, a right beyond all other rights.

This is why political observers of all backgrounds do this nation a great disservice when they inaccurately diagnose who ultimately has the power to dramatically reduce the number of mass shootings in America. The immediate reason we endure more mass shootings (even per capita) than other industrialized nations is because we allow mass shooters far easier access to weapons of mass murder. The individuals who ultimately favor such active policy choices are today’s congressional Republicans.

It may be more satisfying to blame the NRA or even those who tacitly agree that instant, universal access to every kind of firearm imaginable is worth 30,000 dead Americans every year. But special interests and activist sentiment run their course through every heated issue in American politics. Guns may reach the apex of the immutable laws of politics, but they do not transcend them. Eventually, it comes down to a single legislator’s decision. There is nowhere else to place responsibility.

States and localities can tinker at the margins with their own forms of gun regulation, but that’s like telling the state of Kentucky to prevent the sale and distribution of cheap heroin all on its own. And while we may be able to out-innovate the problem of gun violence with smart gun technology, government will ultimately have to require such technology to some degree, such as with the adoption of seat belts and air bags.
There is no substitute for concerted national action on the issue of mass murder perpetrated with firearms available upon demand, and such action will only take place when a handful of congressional Republicans deem it fit.

What makes this so distressing is that it’s hard to imagine just how many Americans would have to die from gun violence in order to change the outlook of even a small minority of congressional Republicans. Even if next year the number of Americans who were killed by guns were to quadruple, congressional Republicans would simply double down on the pervasive logic that even more guns are needed in the hands of good people to combat the bad people with guns. When people start with a conclusion and work their way backwards to policy responses, the system breaks down. You cannot cure a disease that you fail to properly diagnose. As Adam Gopnik would say, it is as simple as that.

As a kid, I took riflery classes in West Virginia and could understand why people thought of the single-bullet .22 rifles we fired as tools. As an adult, I once went to a shooting range to fire a semi-automatic handgun, and I was struck by how—ergonomically—a handgun was simply a vessel for shooting oneself or someone physically close to you. Today, I look at an assault rifle, and it is obvious that its chief, intended purpose is the killing of a cluster of human beings otherwise unable to protect themselves in time.

Only when a small but determined band of congressional Republicans is able to articulate something similar to the above will we begin the long and arduous process of treating an American’s God-given right to life and the pursuit of happiness with the same respect that we treat the oft-misconstrued right to own a semi-automatic firearm.


America is complicit in a new Middle Eastern tragedy: Saudi Arabia is obliterating Yemen — with our help

GlobalPost

SANABAN, Yemen — Ayman al-Sanabani beamed as he entered his family’s home on his wedding day. He was greeting his new bride, Gamila, who was in a bedroom surrounded by friends. Ayman sat beside her for several minutes, receiving warm words of congratulations.

It would be the young couple’s first and only encounter as husband and wife.

The terrifying power of a bomb is how it can alter life so dramatically, so completely, so instantaneously. How it can crush concrete, rip apart flesh, and snuff out life. The moments before the pilot pulls the trigger and sends the missile screeching down choreograph the final dance with fate: another step forward into a room, a turn around a corner, a walk outside to get some air — trivial actions that determine everything afterward.

This power is a fact of life in Yemen now. It is brought forth by a coalition of Arab countries led by Saudi Arabia and supported by the United States. The airstrikes have been relentless since March, a period now of eight months. They are supposed to be targeting a local rebel group, but appear largely indiscriminate, regularly hitting civilian targets. Thousands of people have been killed. Human rights groups say some of these strikes amount to war crimes.

Twenty-six new graves outside the now-ruined home of the al-Sanabani family. Altogether, 43 people died when an airstrike tore through a wedding party on Oct. 7. Sharif Abdel Kouddous/GlobalPost

The al-Sanabani home sits on the crest of a small hill overlooking this village some 90 miles south of the capital, where low-slung houses are clustered near plots of yellowed farmland that are dotted by small trees. In the near horizon, reddish-brown mountains loom over the landscape. On any given day, it’s a beautiful place.

It was Oct. 7. Ayman and two of his brothers were all getting married in a joint ceremony. Hundreds of relatives and neighbors had come to take part. Their three-story house was brightly decorated. Colored lights draped down from the roof toward two large tents, which were erected to accommodate the vast numbers of guests. Children scampered outside, shooting fireworks into the night sky.

Fighter jets roared overhead but the guests paid little attention to the menacing sounds. Sanaban had never been targeted before. It was considered a safe place.

Shortly before 9:30 p.m., the three grooms — 22-year-old Abdel Rahman, 24-year-old Ayman, and 25-year-old Moayad — greeted their brides, who had just arrived in a large convoy from a nearby village.

Ayman left the bedroom where his new wife was sitting with her friends. He was climbing up to the second floor landing with his older brother when the missile struck. It was a direct hit, demolishing half the house in an instant. Gas tanks ignited, sending fire blazing through the rest of it. The air quickly filled with black smoke, dust, and screams. The women trapped indoors jumped out of windows to escape.

Ayman was blown across the hallway and hurled down the stairwell. He pulled himself up amid the chaos and tried to help others evacuate.

He would eventually find out, one by one, that his 18-year-old wife, Gamila; his younger brother, Abdel Rahman; his younger sister, Iman; his father, Mohamed; and his mother, Faiza, were all killed. They were among his 16 family members who died that night, including aunts, uncles, nephews and nieces.

The toll was not confined to them. In all, at least 43 people were killed in the attack, including 16 children. Dozens more were wounded, many of them sliced open by flying shrapnel and debris, others severely burned.

Among those injured was 15-year-old Abdullah al-Sanabani, a child prodigy who in 2012 won an international competition and a free visit to NASA headquarters for inventing a solar-powered remote controlled car that could flip over and become a boat. He was evacuated for treatment and now lies in a hospital in Boston in critical condition. In addition to undergoing numerous skin grafts, his right arm was amputated above the elbow and two toes on his left foot were removed.

Immediately after the strike, survivors fled the scene out of fear that a second missile would follow, a tactic known coldly as a "double tap." After 30 minutes of quiet, they went back to start digging out the bodies. With no electricity, they used flashlights and headlamps to work in the darkness. It took until 6 a.m. the next day to pull all the corpses from the rubble. It took even longer to collect the shredded body parts, which they put into plastic bags. A piece of someone's hand was only discovered three days later.

Many of the bodies were either too charred or disfigured for family members to identify, known only by what they were wearing, a distinctive ring or watch. Others were identified through a grim process of elimination, by calculating who was missing.

"What can I say? My life has been made into nothing," Ayman says three weeks later, standing in the wreckage of his family home. His large green eyes appear permanently bloodshot. He speaks softly, with a mid-distance stare that never seems to focus on anything. Relatives and neighbors — some of them on crutches, others bandaged — whisper that he is not all there anymore, his mind still trying to fathom an unfathomable loss. "If I had burned like them it would have been better," he says.

Down the hill from the wreckage is an open plot of land with 26 fresh graves lined in neat rows. The white and gold headstones label the dead as martyrs.

"Why did a wedding become a target?” asks Alaa Ali al-Sanabani, a relative of the victims who was at the house the night of the attack. "We are asking for an independent investigation from an international body."

Saudi Arabia has denied responsibility. "We did not have any operations there at that time," Brig. Gen. Ahmed al-Asiri, the spokesperson for the coalition, told GlobalPost, adding somewhat impossibly that the strike instead came from the local rebel group the Saudis are fighting.

There is little rebel presence in Sanaban — no military posts visible in the village, no traces of any ground clashes. Meanwhile, multiple survivors interviewed separately said they heard fighter jets overhead minutes before the attack. Aside from US drones operating sporadically in some parts of the country, the Saudi coalition is the only air power flying above Yemen.

"The Saudis act with impunity, so it doesn't matter," said Hisham al-Omeisy, a political analyst based in Sanaa. "It's not a big deal that they hit a wedding. Since the beginning of the war they have denied pretty much everything."

THE BACKGROUND

Saudi Arabia launched its war in Yemen on March 26 to drive back a rebel group known as the Houthis. The Houthis arose in the late 1980s as a religious and cultural revivalist movement of Zaidism, a heterodox Shiite sect found almost exclusively in northern Yemen. The Houthis became more politically active in 2003, vocally opposing President Ali Abdullah Saleh for his backing of the US invasion of Iraq.

Saleh was an ally of the United States and Saudi Arabia. He was also an authoritarian ruler known for extravagant corruption. A UN study estimated the leader amassed up to $60 billion during his 33 years in power. Saleh managed to navigate his way through Yemen’s complex web of tribal, regional and geopolitical divides. It was a feat so delicate and dangerous he famously described it as "dancing on the heads of snakes."

The Yemeni leader successfully positioned himself as an ally of the United States in the ongoing “war on terror” by allowing US forces to operate inside Yemen, and their Predator drones to target Al Qaeda militants based in the country.

Saleh used his Special Operations Forces, trained and equipped by the United States, in his own battles with the northern Houthis, against whom he fought six brutal wars between 2004 and 2010.

While Saleh’s guile allowed him to remain in power, it did little to benefit the Yemeni people. They became some of the hungriest and most severely malnourished on the planet. So when Yemenis watched Egyptians and Tunisians take down their own corrupt leaders in the Arab Spring uprisings, they were quick to follow. The Houthis joined them. After securing promises of immunity for his crimes, Saleh finally agreed to step down in November 2011.

His vice president, Abed Rabbo Mansour Hadi, assumed office as interim president in a transition brokered by members of the Gulf Cooperation Council, which includes Saudi Arabia. It was backed by the United States.

Sidelined in the agreement, the Houthis positioned themselves as an opposition group, gaining support beyond their northern base for their criticisms of the transition, which was flawed and riddled with corruption. Saleh loyalists, incredibly, began forming alliances of convenience with the Houthis.

Last year the well-armed Houthis swept down from the north and took over large parts of the country, including Sanaa. In January 2015, they effectively ousted Hadi and his cabinet members, who fled to Saudi Arabia on March 25.

The next day, Saudi Arabia put together a coalition and began its military campaign with support from the United States. The Saudis and the Americans hoped to restore the friendly Yemeni government they knew. Saudi Arabia also hoped to counter what it perceives as a growing regional threat posed by Iran. Saudi Arabia believes Iran is backing the Houthis, although the level of that support is disputed.

More than 5,700 people, including at least 2,577 civilians — 637 of them children — have been killed in the eight months Saudi Arabia has been bombing Yemen, according to the United Nations. The UN expects the actual toll to be even higher because many of the dead or injured never reach medical facilities and so go unrecorded.

The Houthis and their allies have been implicated in the deaths of hundreds of civilians, often by indiscriminate shelling and the planting of landmines. But the UN says the majority of civilian deaths have come from the coalition's aerial bombing campaign, which has been relentless.

A joint report by the UN's Office for the Coordination of Humanitarian Affairs and the UK-based charity Action on Armed Violence in September concluded that 60 percent of civilian deaths and injuries were caused by airstrikes. Meanwhile, a report by the Office of the UN High Commissioner for Human Rights found that "almost two-thirds of reported civilian deaths had allegedly been caused by coalition airstrikes, which were also responsible for almost two-thirds of damaged or destroyed civilian public buildings," the OHCHR spokesman, Rupert Colville, said in a news briefing in September.

Al-Asiri, the coalition spokesman, dismissed the UN's claims. "I think this is not a very accurate report that they are publishing," he said. "If we want to discuss this statement let's first make sure that there are civilian casualties caused by airstrikes. Where is the evidence for that?"

In Yemen the evidence is everywhere.

THE CAPITAL

An airstrike destroyed the home of Hafzallah al-Ayani, a vegetable merchant, in the UNESCO world heritage site of the Old City of Sanaa. An estimated 130 houses surrounding the area were damaged. The entire al-Ayani family was killed as they sat down to dinner. Rawan Shaif/GlobalPost

In Sanaa's historic old city, which has been inhabited for more than 2,500 years, an airstrike smashed into the house of Hafzallah al-Ayani at about 11:30 p.m. on Sept. 19, burying him, his wife and their eight children beneath the rubble of their home. They were sitting together having dinner, the children aged between 4 and 17. Three neighbors sitting outside the house next door were also killed. The father of one of those victims, Mohamed Assaba, has kept three foot-long missile fragments as evidence. They are incredibly dense and heavy, with the terrifying jagged edges of bomb shrapnel.

All of the surrounding ancient buildings in the al-Felahi neighborhood have been badly damaged, and many of the residents have been forced to leave.

The ancient quarter of the Yemeni capital, Sanaa, which is listed as a UNESCO world heritage site. Rawan Shaif/GlobalPost

Saoud al-Alafi, 42, lived next door to the al-Ayani family, not more than 15 yards away. "If we had any Houthi leaders here at all I could maybe understand why they targeted it, but there are only civilians here," al-Alafi said. "Their aim is to terrorize us."

He describes what happened Sept. 19: An initial airstrike in the distance prompted him to step outside his front door to search the sky for clues. It was then that he heard the warplane overhead, followed by the deafening screech of a missile. He says he didn't hear the blast, only that he was thrown to the ground. When he stood up the air was filled with dust and smoke. It took him and other residents until 8:30 a.m. the following morning to dig out the bodies. Two of the children still had food in their mouths.

"One of the first steps to solve a problem is to recognize there is a problem," said Farea al-Muslimi, a Yemen analyst at the Carnegie Middle East Center. "If the Saudis don't see that there have been hundreds and thousands of Yemenis killed in the last few months by airstrikes I think the problem is much, much worse than everyone thinks."

Traveling through Yemen's northern Houthi-controlled cities and towns offers a panorama of the vicious aerial assault. Homes, schools, mosques, retail stores, restaurants, marketplaces, government offices, gas stations, power plants, telecommunications facilities, factories, bridges, roads, and UNESCO World Heritage sites have all been hit.

Some of the airstrikes display a high degree of precision. On the road north toward Saada, all four bridges — none of them spanning more than 20 yards in length — were struck directly in the center, causing them to buckle and rendering them impassable. The lack of any visible missile craters nearby indicates they were hit with pinpoint accuracy in a single strike.

Asiri, the coalition spokesman, brushed off criticism that the coalition has targeted civilian infrastructure. "Please don't be too naive, we are in a war," he said. "We are talking about military operations, we are not talking about a soccer game."

HAYDAN

A hospital guard sweeps away debris from an airstrike. The roof of the hospital had been painted with the logo of Doctors Without Borders so that jets conducting airstrikes would know this facility was being run by the international aid organization. Rawan Shaif/GlobalPost

But even in war there are rules. Medical facilities, for example, are afforded a special protected status under international humanitarian law, and are supposed to be off-limits from attacks of any kind.

On the night of Oct. 26, doctors at the only hospital in Haydan had just finished another arduous day of work. Haydan is a village in the north, less than 20 miles east of the Saudi Arabian border. A man had been brought in that day with severe wounds to his head, shoulder and abdomen. Akram Ghoutheya, an assistant doctor who had been at the hospital for three years, worked frantically for one and a half hours with other members of the medical team to stabilize the patient before sending him to a better-equipped hospital in Saada City, the provincial capital, about 40 miles away.

Huddled in Yemen's northern highlands, Haydan is stunningly picturesque. The mountainous terrain is decorated with terraced farming and is lush with verdant qat trees. Stone houses on hilltops overlook the valley below. The natural beauty, however, is now marred by the bombardments.

On the main market street, no building has been left untouched. The airstrikes have damaged every single store. Rooftops have collapsed, facades have been ripped off. Rubble lines the unpaved roads. The town's school, electrical plant and water infrastructure were all bombed. Most of the residents have fled, leaving the streets desolate. Those who remain stay indoors after sunset. They say not a day goes by without one or two air raids. On some days, dozens of missiles rain down.

Located on the edge of town, residents considered the hospital Haydan's only safe zone. The international medical humanitarian organization Doctors Without Borders (MSF) supported the hospital and regularly shared its GPS coordinates with the Saudi-led coalition. The roof of the facility was clearly identified with the MSF logo.

The entryway to the Doctors Without Borders hospital in the northern district of Haydan, Saada. The town of Haydan had been repeatedly bombed in the week leading up to the attack on the hospital. Nothing remains of the clinic but fragments of rubble and glass. Rawan Shaif/GlobalPost

It was around 10:30 p.m. when Akram finally sat down for dinner with about a dozen other staff members in the hospital's living quarters, located in the back of the building. Minutes later, a missile smashed into the emergency room, not more than 20 yards away.

Akram was hit on the head by flying glass and debris but was only lightly injured. Terrified, he helped evacuate the only two patients at the hospital — a father and his infant son — and fled out the back, taking cover in a plot of qat trees as the rest of the staff scattered in all directions.

Five minutes later a second strike hit. Several more followed. In total, between three and six missiles targeted the hospital, completely demolishing the emergency room, outpatient and inpatient departments, the lab and the maternity ward.

"It felt like Armegeddon. People were screaming like never before because they felt that now nowhere was safe," Akram says. "This was the one place of sanctuary."

Most of the facility now lies in ruins, reduced to chunks of rubble and twisted rebar. A pack of stray dogs has made it their home. They pick their way through the shredded concrete, avoiding the flocks of crows that perch on bent metal gurneys.

"It was a shock to see Haydan targeted," says Mike Seawright, the MSF emergency coordinator in Saada. "It's not just MSF, when the health care structure is treated as part of the conflict, this is against international law. The premise of a hospital is sacrosanct."

A family living next door to the MSF hospital that was destroyed in Haydan. Their home was also bombed and is now in ruins. "We have nowhere else to go," the mother said. Rawan Shaif/GlobalPost

The Saudis have offered contradictory accounts of what happened. Immediately after the attack, coalition spokesman Asiri denied that the coalition was conducting airstrikes in the vicinity of the hospital at the time. Hours later, Saudi Arabia's ambassador to the UN told VICE News that the hospital was hit by mistake because MSF had provided incorrect coordinates. The next day the ambassador reversed his account and denied the coalition was operating near the hospital. When asked by GlobalPost, Asiri would not comment, saying only that the incident was still being investigated.

International rights organizations, including Amnesty International and Human Rights Watch, have said the attack may amount to a war crime.

While no one was killed in the bombing, the destruction of the hospital will no doubt have fatal consequences. As the only functioning medical facility for miles, it was a lifeline for the surrounding towns and villages and provided medical care for about 200,000 people. At times, the hospital would receive as many as 50 cases or more a day, according to doctors who worked there. The closest medical facility is now the Gomhouri hospital in Saada City, which is about 40 miles eastward on a road that winds slowly through the mountains.

"The effect of the hospital bombing will be huge on everyone in the area," said Walid Abkar, a doctor who works at the hospital and was inside at the time of the attack. "People will die in large numbers, from wounds and from illness, especially children."

Aside from war injuries, the hospital received patients suffering from a variety of ailments, including malnutrition, dehydration, malaria and pneumonia. "We don't know what to do now," Walid said. "We have nothing here, if we build another [hospital] they will just bomb it again. You need a safe space for treatment."

The MSF hospital in Haydan is just one of dozens of similar facilities that have been hit. Nearly 70 health institutions have been damaged or destroyed during the conflict, according to the UN.

"The world has no safe places anymore," Akram said. "No world body can stop this. Not the [UN] Security Council, nothing. Saudi Arabia has bought them all."

US COMPLICITY

It’s easy to see why Akram would think that.

In September, UN human rights chief Zeid Ra'ad Al Hussein released a report that detailed the heavy civilian toll in Yemen. He recommended establishing an independent international inquiry into human rights abuses and violations of international law in the conflict.

The Netherlands responded with a draft resolution that would have mandated a UN mission to document violations by all sides over the previous year. But in the face of stiff resistance from Saudi Arabia and its Gulf partners, and little support from Western governments — including the United States — the Dutch withdrew the proposal.

Instead, the UN Human Rights Council passed by consensus a new resolution drafted by Saudi Arabia that made no reference to any independent international inquiry. The text only calls for the UN to provide "technical assistance" for a national commission of inquiry set up by the Yemeni government of President Hadi, which is backed by Saudi Arabia and a party to the war.

“By failing to set up a serious UN inquiry on war-torn Yemen, the Human Rights Council squandered an important chance to deter further abuses,” Philippe Dam, the deputy director at Human Rights Watch in Geneva, said in a statement. “The US, UK, and France appear to have capitulated to Saudi Arabia with little or no fight, astoundingly allowing the very country responsible for serious violations in Yemen to write the resolution and protect itself from scrutiny."

The United States has backed the Saudi-led coalition with arms sales as well as direct military support and coordination, raising questions about the level of American complicity in the airstrikes.

Since the escalation of the conflict in March, the United States has provided the coalition with vital intelligence, surveillance, reconnaissance, and logistics information, according to US Central Command (CENTCOM), which oversees all military operations in the Middle East.

Eight days after the bombing campaign began, the US began providing crucial aerial refueling to Saudi Arabia and its partners. As of Nov. 20, US tankers had flown 489 refueling sorties to top off the tanks of coalition warplanes 2,554 times, according to numbers provided to GlobalPost by the Defense Department.

The US military is also advising the coalition through what is known as the "Joint Combined Planning Cell," which was authorized by US President Barack Obama, according to Capt. P. Bryant Davis, a CENTCOM media operations officer. The joint cell is based in Riyadh, where US military personnel regularly meet with senior Saudi military leadership.

In addition to logistical support and intelligence sharing, the joint cell provides "targeting assistance" to the Saudi coalition, though CENTCOM stressed that the "selection and final vetting of targets" is done by coalition members, not the United States.

"There's actually a small number of US military personnel sitting in Riyadh in a military capacity helping to coordinate airstrikes. That's a game changer," says Belkis Wille, the Yemen researcher for Human Rights Watch. "It goes beyond the US just being a supporter of the coalition … they are actually a part of this armed conflict."

When asked what steps the US military takes to prevent civilian casualties in Yemen, CENTCOM said the joint cell recommends that the Saudi military "investigate all incidents of civilian casualties allegedly caused by airstrikes and has asked that the coalition reveal the results of these investigations publicly."

Since the beginning of the war, Human Rights Watch has documented more than two dozen airstrikes that the group said “appeared to be in violation of the laws of war." The rights group said it has not been able to ascertain that Saudi Arabia or other coalition members are investigating a single airstrike.

Officials at CENTCOM declined to answer whether the US military in any way reviews the civilian toll inflicted by coalition airstrikes.

Meanwhile, the US continues to send billions of dollars' worth of weapons to Saudi Arabia and its Gulf allies.

In November, the State Department approved a $1.29 billion deal to replenish Saudi Arabia's air force arsenal, depleted by its bombing campaign in Yemen. The sale includes thousands of air-to-ground munitions such as laser-guided bombs, bunker buster bombs and "general purpose" bombs with guidance systems.

Saudi Arabia has been one of the US arms industry's most avid customers. Between October 2010 and October 2014, the US signed off on more than $90 billion in weapons deals with the Saudi government, according to the Congressional Research Service. US arms manufacturers have also sold billions of dollars’ worth of material to the other Gulf states that are participating in the bombing of Yemen, including the United Arab Emirates and Qatar.

The Pentagon’s Defense Security Cooperation Agency said the latest acquisition will "enable Saudi Arabia to meet regional threats and safeguard the world's largest oil reserves."

Congress has 30 days to block the sale. In October, Democratic members of the US Senate Foreign Relations Committee managed to delay a separate planned transfer of weapons, including thousands of precision-guided munitions, to Saudi Arabia. Meanwhile, thirteen members of Congress sent a letter to Obama urging greater efforts to avoid civilian casualties in Yemen, "in order to protect innocent lives and reduce the potential for backlash against US interests."

State Department officials told GlobalPost that when deciding whether or not to approve weapons sales to Saudi Arabia, the department weighs political and economic interests as well as human rights considerations. “We have to take all these factors into account and clearly human rights is definitely a concern … we have asked the Saudi government to investigate all credible reports [of civilian casualties]."

Human rights advocates, however, say the United States should be conducting its own reviews.

"If an airstrike takes place, and there's reason to believe that it was a US bomb that killed dozens of civilians, the US actually has an obligation to investigate that specific strike and we have so far not seen any announcement that the US is carrying out that type of investigative function in any airstrike," said HRW’s Wille.

CLUSTER BOMBS

Hasna al-Hanash, 3, and her father. Hasna was injured alongside her grandmother when unexploded cluster munitions fell all around them. GlobalPost/Rawan Shaif

The US and other countries have also sold internationally banned cluster munitions to Saudi Arabia and its coalition partners. And those cluster bombs are being used in Yemen.

Neither the United States nor Saudi Arabia — nor any other member of the coalition bombing Yemen — is party to the 2008 international treaty banning cluster munitions. The treaty has been signed by more than 100 governments because of the devastating effects cluster bombs can have on civilian communities.

The village of al-Mifaa is essentially a group of mud brick houses nestled in farmland some 10 miles northwest of Saada City. It was there that Hasna Gomaa sat by her 3-year-old granddaughter, watching her play on a swing made of rubber piping and cloth. It was 11:30 a.m. on Oct. 27.

She heard a soft boom overhead, though it wasn't nearly loud enough to be a missile strike. She paid it little mind. What she didn't know was that dozens of cluster bombs were raining down toward her and her grandchild.

Cluster bombs contain dozens of submunitions that are released in mid-air and scatter indiscriminately over a wide area.

The bomblets fell all around Gomaa and her granddaughter. One hit the tree branches above them while several others exploded next to them. Three-year-old Hasna, named after her grandmother, was thrown off the swing as shrapnel flew into her leg. The elder Hasna was also hit, with shrapnel slicing through her right thigh and left ankle.

One of the tubes used to carry sub-munitions in cluster bombs, found in Saada. Rawan Shaif/GlobalPost

"So many fell on us," the grandmother, who is in her 50s, later said. "If you saw it you would have wondered how we are still alive."

They were both bleeding profusely. The girl’s father, Mohamed Ahsan, rushed outside and carried his daughter and mother into a nearby hole the family had dug to escape airstrikes. They wrapped little Hasna's leg in a scarf to try to stem the bleeding. The family stayed crouched in the makeshift bomb shelter for several harrowing minutes, unsure if another attack would come. When Hasna fell unconscious they climbed out to rush her to a nearby hospital.

Three days later, Mohamed is holding little Hasna outside their house. Her left leg is wrapped in thick bandages and she cries out in pain when he shifts her in his arms. Her grandmother and namesake limps beside them. "If I had died it would have been OK, but not her," she says.

Abdel Aziz al-Nahari was not as fortunate. The same cluster bombs sent shrapnel into his chest and abdomen and he began to bleed internally. He now lies on a cot in Saada's Gomhouri hospital. The right side of his body is bandaged from armpit to thigh. He is too frail to talk. A tube protrudes from his chest, draining blood. He has undergone three operations and needs additional surgery to remove the shrapnel still stuck inside him.

Faisal al-Hanash saw the bombs exploding in the sky. He says metal pipes filled with bomblets that came out of two separate rockets were spinning as they fell through the air, spreading their deadly cargo over at least a square mile. He holds up one of the meter-long pipes as proof. "This is an illegal weapon, why are they using it on us?" he asks.

The cluster bombs landed all over the farmland where the family grows cucumbers, tomatoes and pomegranates. Many ripped holes through the thin plastic sheets that cover crops before exploding on the ground, destroying some of the plants. Damage from shrapnel like this is evident in several parts of the village.

A cluster munition believed to be made in Brazil lies half buried in a cucumber field in Saada, Yemen. Rawan Shaif/GlobalPost

Cluster bomblets have a high "dud" rate — meaning a high percentage of them fail to explode on impact and become de facto mines.

Residents of the village of al-Haneya, which is close to al-Mifaa, say dozens of cluster bombs landed on their farmland days earlier, on Oct. 21. They had no choice but to try to remove them if they wanted to farm their crops. Nineteen-year-old Ahmed Gomaa was trying to push an unexploded bomblet away using a long stick when it exploded. He was hit with shrapnel in the forearm and leg and now walks with a crutch.

"I was afraid but I had to do it to be able to work," he says, lying down in his family house. "People continued trying to remove them even after I was injured." His father, Abdullah Gomaa, sits beside him.

"I am afraid to walk in the fields now," his father says. "This is a crime, we can't farm our land because of this."

While human rights groups and the UN have repeatedly warned of atrocities in Yemen, the conflict shows no signs of relenting. The exiled president has lost credibility across the political spectrum and Saudi Arabia’s stated goal of returning his government to rule is unrealistic at best.

The coalition has forced the Houthis to retreat from some southern areas, including the port city of Aden. But fierce ground fighting is ongoing in cities like Taiz and elsewhere. Neither the Houthis nor the Saudis appear capable of securing a clear military victory over the whole country. In the meantime, groups like Al Qaeda and the nascent Islamic State are taking advantage of the power vacuum. Al Qaeda now controls Yemen’s fifth-largest city.

With no obvious exit strategy, the coalition continues its heavy bombing. Yemenis feel the international community has forsaken them. They say the world’s media has largely ignored them.

AQBAN

Hudeidah, the country's fourth-largest city and home to 400,000 people, is world-renowned for its fishing industry. But its fishermen are now the targets of airstrikes. Rawan Shaif/GlobalPost

One of the deadliest attacks by the Saudis in recent weeks received hardly any coverage in the foreign press. It took place not on Yemen's mainland but at sea.

The small Red Sea island of Aqban, some 25 miles west of mainland Yemen’s coast, is shaped like a diving whale. Protected from the open sea by a coral reef, its crystal blue waters provide the ideal sanctuary for Yemeni fishermen to anchor and rest when heavy winds come in.

This is where Abdo al-Baghawi's boat was headed on the morning of Oct. 22. Al-Baghawi is 52. He has a wiry frame and a bushy beard. He’s been fishing these waters for 30 years. It had been an unremarkable night's work. The crew of a dozen or so men had set out on a zawraq — a traditional, wooden Yemeni boat — just before sunset the day before and fished all night, as is their routine.

On the boat with Abdo was his cousin's son, Ali, whose hazel eyes and boyish looks made him appear far younger than his 39 years. Mohamed Suleiman, a compact 26-year old who lived in a neighboring village, was also with them. Other zawraqs were working not far away. Most of the men aboard were from a cluster of villages near Beit Faqih, about 40 miles southwest of the port city of Hudeidah. They had all fished alongside each other for many years.

"Only God knows why they attacked us. Can't they see us with all this surveillance technology?"ALI, A YEMENI FISHERMAN

After daybreak they hauled in their nets and set course for the island, where they would sleep through the morning and afternoon before fishing again the following night.

They reached Aqban at about 10 a.m. There were at least seven other zawraqs and a couple dozen smaller wooden skiffs accompanying them. The small flotilla dropped anchor in the calm waters a few hundred meters from shore. Abdo lay down to rest with the others. The fishermen were fast asleep when the first missile struck, violently yanking them out of their dreams and into a living nightmare.

The first airstrike hit the boat adjacent to Abdo's at about 11:30 a.m., shattering the hull into small fragments of broken wood. "Like a deck of cards being thrown in the air," is how Mohamed later described it.

Jolted awake, Abdo looked around in horror and confusion. There was nothing left of the boat next to him but the fishing net. Seventeen of the 20 men on board had been killed. He heard two men screaming but he couldn't see any bodies in the water.

Mohamed Suleiman lies convalescing in a hut in his home village near Beit al-Faqih in Hudeidah province. His spine was partially broken when coalition warplanes targeted the fishing boat he was sleeping in on Oct. 22. At least 42 were killed in the attack. Rawan Shaif/GlobalPost

He didn't know whether to jump in the sea or try to sail away. Amid the panic, a crewman shouted, "The next strike will be for us." They all said the shahada — the Muslim affirmation of faith that is recited when one expects to die. Moments later the second missile slammed into them.

Abdo found himself under water. He didn't know what was happening. His foot had been fractured but the pain didn't register. He said another prayer and surfaced. The bow was all that remained of the zawraq. He swam toward the wreckage trying to find other survivors, screaming names but no one answered. Eight of the thirteen men on board were dead.

He decided to swim for Aqban. Then he saw Ali and a few others not far away also struggling to make it to the beach. With two boats destroyed, the fishermen on the remaining vessels were scrambling off of theirs, diving into the water in a panic before the next strike.

Mohamed was also blown into the water by the force of the blast. Something was wrong with his back and he couldn't move properly. Struggling to stay afloat he grabbed onto a piece of wood and looked around. There were corpses floating next to him. One man was decapitated. Another man had his arm torn off. His spine partially broken, Mohamed clung helplessly to the floating debris until a skiff finally picked him up and took him to shore.

The men all collapsed on the beach. The pain from their injuries now made itself known. Some were burned and screaming in agony. Many of them couldn't walk and were crawling on the sand. Ali, whose right knee was broken and left thigh split open, passed out.

The air assault did not stop. For the next hour and a half, missiles rained down every 10 minutes, destroying the remaining boats and pounding the island itself. After about five strikes, Abdo said he saw an Apache helicopter swoop in and strafe the shallow waters 30 meters from shore, killing at least one of his colleagues, Mohamed Abdullah Hadi.

At about 1 p.m. the assault finally ended. Other fishing boats eventually arrived to evacuate those left alive.

At least 42 fishermen died in the attack. The Ministry of Justice in Houthi-controlled Hudeidah listed their names in a report. The report, obtained by GlobalPost, documented the casualties from four of the boats. The International Committee of the Red Cross confirmed the toll. Many of the bodies were only found days later, floating off the islands. Photographs of their corpses show them grotesquely bloated and disfigured.

Survivors say many bodies are still missing. They believe the toll is well over 100. Tamim al-Shami, the Houthi spokesman for the Ministry of Health in Sanaa, said 140 fishermen were killed, but those figures could not be independently confirmed. The International Committee of the Red Cross says the likelihood of those presumed missing being found is "very slim."

Ali al-Baghawi worked as a fisherman for 21 years. He was injured in an aerial attack at sea that killed dozens of his friends and colleagues. Though fishing was his livelihood he now says he will never return to the water. Rawan Shaif/GlobalPost

"Only God knows why they attacked us," Ali says. "Can't they see us with all this surveillance technology?" His arm is scarred by shrapnel and his right leg is wrapped from thigh to ankle, the bone held together by clamps attached to a protruding metal rod. A fisherman for 21 years, he now says he feels nauseous when he thinks of the sea and will never go back.

Saudi Arabia claimed the seven boats were smuggling weapons and military equipment. It released aerial footage showing the boats in the water and one of them being destroyed in a massive airstrike. "We are sure 100 percent that they were smuggling weapons from the big ships to small boats," Asiri, the coalition’s spokesman, told GlobalPost.

Survivors interviewed separately say they never saw any weapons on Aqban and that there were no boats among them other than fishing vessels. They say the small skiffs routinely accompany zawraqs when going out to fish. In Hudeidah's harbor, scores of skiffs can be seen anchored near the larger boats.

"I never felt scared in Aqban, it was always safe," Mohamed says. He lies convalescing on a mattress in a small hut in his home village. His back is wrapped in a brace and he is unable to move. "I never saw any weapons, it was just us fishermen."

Two days before the attack, an Apache helicopter had passed overhead as the fishermen were out at sea, but Abdo thought nothing of it. Coalition warships had been patrolling the waters for months and they had never had any trouble before. "I wasn't scared," Abdo says. "I didn't think they would hunt us the way we hunt the fish."

Sharif Abdel Kouddous is a fellow at The Nation Institute. Additional reporting for this piece was provided by Amal al-Yarisi.


December 26, 2015

Why I lie about my age: Surely it’s OK to be sexual at any age, without being labeled a “puma” or a “cougar”

I began to lie about my age a few years ago. It was subtle at first. I took off my milestone years from my LinkedIn and Facebook – when I graduated from college or when I started my first job. I evaded the question at my last two birthday parties. Was I turning 26? 27? 28? I let the question linger and blow out one symbolic candle on the cake.

When I lie about my age, I don’t feel like a 28-year-old or a 31-year-old – I just feel like an adult. It's liberating. I stop measuring myself against the milestones of life stages: childhood, adolescence, college, marriage, children… And my failures don't make me a boomerang or a cougar or a prostitot or any other of those nasty terms we use to describe people who don't act their age.

Let's revive that old, patronizing rule to “never ask a woman her age” and include men while we’re at it. True, it’s condescending and outdated. But the rule makes sense now more than ever.

Age means more today than it ever has historically, most likely because death and pregnancy are so much more predictable. With condoms, IUDs and the pill, women are having babies in a much narrower age range than ever before. Today, few women who first became pregnant at 15 continue to have babies until they turn 45. Fertility rates are lower, teenage pregnancies are down, and getting pregnant today is less often chance than it is choice.

Thanks to modern medicine, we die later and more predictably. Vaccines, hygiene, better nutrition and fewer violent conflicts all decrease our odds of dying young. For Americans, the probability of dying during the next year doubles every eight years. In other words, a 33-year-old faces a 1 in 1,500 chance of dying before he turns 34, while a 42-year-old has odds of about 1 in 750. At 100, that climbs to about a 1 in 2 chance.

All this to say that our lifespans today are more certain – and this has had social consequences. Childhood was invented as a distinct life stage in the 17th century. Before that, children were seen as incomplete or deficient adults. In 1904, a social Darwinist created adolescence in his appropriately titled book "Adolescence." Child labor laws, compulsory education and minimum age laws for drinking, driving and marriage further segmented the life course.

We live longer and more predictable lives. And, today, our lifespans are charted in lockstep precision from cradle to grave. And what do we call someone who dares to march to a different drum? Boomerangs, adultescents, kidults, cougars, sugar daddies. Men are particularly prone to Peter Pan syndrome or caricatured as momma's boys (Matthew McConaughey in "Failure to Launch," anyone?). Forty-year-old virgins also make for some easy comedy. "Parasite single" is a particularly unkind Japanese term to describe young unmarried women who live with their parents into their late 20s and early 30s.

We don't need to live like this. Women and men alike are trashed for jumping around the lifespan. Sexuality can't only be appropriate between the ages of 20 and 45. Any younger? Tween girls are little Lolitas. Young men are cubs or catnip. Any older? Cougars or silver foxes.

A friend of mine recently celebrated her "dirty 30" birthday. Friends gave her nostalgia gifts and a tote bag that said "29ish" in pink cursive. Her sister gave her a tank top advertising her new Puma status as a "cougar in training." If cougars are in their 40s and 50s, pumas are in their 30s. All predatory mountain cats feed on younger prey, which, apparently, is an apt description for women who date younger men.
Young women also have it bad. Teen mothers are particularly easy targets (babies having babies!). "Too sexy, too soon!" scream the headlines at ABC and the Washington Post. "Barely Legal" and its ilk make youth and innocence seductive. Perversion just isn’t as perverted if the age of consent isn’t the age of sexual awakening. But what if an experienced 17-year-old expresses her sexuality? We get baby prostitutes or jailbait or, worse yet, a “prostitot.” Juliet was 13 when she fell in love with Romeo but no one called her any names.

Think back to "Sunset Boulevard" or "The Graduate." These are cautionary tales about sex and women over 45. Perhaps one of the creepiest scenes in film history was when Norma Desmond shimmied forward to the camera and uttered the famous line -- “I’m ready for my close-up, Mr. DeMille.” What a hideous woman. She should accept that she is old. Or, when the sly Mrs. Robinson seduces a man even though she is “twice his age.” (In fact, when "The Graduate" was filmed in 1967 Dustin Hoffman was 29 and Anne Bancroft was 35.)

Let’s stop looking at our lives as tick marks on a timeline so we can stop knocking down the people who don’t measure up. How about we never again ask a woman – or a man, for that matter – their age. Then perhaps we can stop acting our age and start living our lives. And if all this doesn’t work, smoke and mirrors (and a little white lie) never hurt anyone – especially a 29-year-old like me.

Published on December 26, 2015 15:00

My daughter’s heroes could all be my mother: What my teenager is teaching me about growing old

I don't recall when people started calling my firstborn an "old soul," but I'm pretty sure she hadn't started walking or talking yet. I think at first it was just about her eyes, the way they convey a certain combination of wisdom and boredom that always seemed out of sync with her age. Now, as she approaches 16, she somehow seems even further beyond her years. And it turns out my kid is the one who's teaching me how to grow old gracefully.

I should mention that in many respects my daughter is a typical high schooler. Unlike her mother, she recognizes the music playing at H&M without having to ask Siri to identify it. She has conversations about people who are famous on Vine. She can read a text without holding her phone — a phone that does not have its brightness set to the highest possible level — at arm's length and squinting. To the outside world, she resembles "Unbreakable Kimmy Schmidt's" Xanthippe. In her heart, though, she's more Betty White. Last night she watched "The Proposal." Again. In the past year, my daughter has dragged me to see "Iris" and "The Second Best Exotic Marigold Hotel," read the Grace Coddington memoir, and, after getting hooked on Netflix's "Grace and Frankie," asked me in awe, "Do you know about Lily Tomlin? She's incredible." After watching the Elaine Stritch documentary "Shoot Me," she changed her iPhone's lock screen wallpaper to an image of the late Broadway legend, scowling on a New York City street in an oversize fur coat and hat. She's lately been on an Angela Lansbury and Tina Turner kick. Her favorite Peggy Carter? Elderly Peggy Carter. When I asked her recently why she's so fascinated with older ladies, my daughter simply shrugged and replied, "Uh, because they're queens?"

My child, admittedly, has the privilege of being a fangirl of the geriatric set while inhabiting an often more advantageous age group herself. Sure, she can't yet drive or vote, and rando dudes on the street love to order her to "Smile, baby." But a large portion of popular culture revolves around the tastes of her and her peers, and none of the products in her makeup case carry the words "mature skin." It's possible that in 50 or 60 years — when, if she's lucky, she'll be a feisty old bird herself — she'll be less enchanted with the idea of being aged. Yet as one who is considerably closer to senior citizenship than she is, I find great inspiration in my daughter's frequent and sincere pronouncements of "You know who's a BAMF? Maggie Smith, that's who."

Like most of the women I know — with the possible exception of nursery school teachers and social workers — I routinely face the expectation that being over 40 renders a female irrelevant. Her lines or gray hairs are taken as demerits against the value of her ideas and opinions, calibrated on a scale in which the number of candles on her birthday cake represents her declining worth as a productive, interesting person. And when 32-year-old Anne Hathaway is getting iced out of potential roles for being thought too long in the tooth, you can understand why actual older women often feel discouraged, frustrated and prone to retinol bingeing. The fear of becoming invisible is real. The prospect of growing slower, jowlier, more-often-than-even-now-tuned-out is not pleasant. Last week I had to get my knee looked at. I tried to tell myself it was because my knee does a lot of running and not because my knee is just as middle-aged as the rest of me, but come on. My entourage now includes a knee guy. I feel like I'm not coming back from that one.

But my daughter — my daughter who has dewy skin and an ability to eat like a garbage-can-foraging raccoon without a heartburn-related care in the world — makes me feel better about everything. My daughter, who sincerely thinks hanging out with her 83-year-old grandmother is a hot time, gives me hope that judging women based on the year they were born is actually a very old-fashioned idea. And she reminds me that being productive and useful in the world, and finding new adventures yet to be had within it — these are choices that no one can take away. And if 88-year-old Rosalynn Carter is still putting herself out there, what's there for the rest of us to be afraid of? Time isn't the enemy. Time is a gift. How fortunate I would be to someday become the kind of old lady my daughter admires. And my wise child is so right — Maggie Smith is a BAMF.

Published on December 26, 2015 13:30

“If you have a moral attachment to a way of eating, that’s a disordered behavior”: How dieting culture, fat shaming and food porn shape our lives

Masses of Americans will resolve to lose weight for New Year's, but Kelsey Miller, a senior features writer at Refinery29 and author of the upcoming memoir "Big Girl: How I Gave Up Dieting and Got a Life" (Jan. 5), would propose a different solution to body image woes. Miller had struggled with her weight since childhood, hopping on and off fad diets throughout her teens and twenties. It took collapsing in the woods after a failed "Spartan Warrior Workout" for her to realize this method wasn't working. She had tried the same thing — dieting — for years and expected different results, the oft-cited definition of insanity. The outcome, of course, was instead the same: She'd grow miserable, relapse into "unhealthy" eating, and end up back where she started, eager to check another weight-loss regime off her list. So at 29, Miller embarked on a fast from fasting itself, chronicling her experiment in Refinery29's Anti-Diet Project column. Secretly, she hoped this new approach would somehow help her shed pounds. But when her nutrition therapist asked if she'd be okay never losing an ounce through intuitive eating — consuming exactly what she was hungry for — she was forced to reconsider what health and fitness meant. At a café in her home borough of Brooklyn, Miller told Salon about what she learned along her diet-free journey, the toxic messages the media sends about food and weight, and the advice she'd give others seeking a saner relationship with their bodies. The interview has been condensed and lightly edited for clarity.

At the beginning of your book, you ditch dieting in favor of a new philosophy called intuitive eating. Could you explain what intuitive eating is?

It sounds like a fake thing, and that's what I thought it was when I first heard about it, before I quit dieting. I'd stumbled across it a couple times and thought it sounded sort of adorable and crunchy and like something I should probably definitely do, but not yet — when I was thinner. When I had the big turning point, that phrase was hanging around in my head. Really, it's just diet deprogramming and learning to eat like a normal person again. For better or worse, most of us don't have a sense of the way we ate when we were 3 years old, when we weren't afraid of basic carbohydrates. It's learning to eat again as if you were a child with all those instincts firmly in place, before they got polluted by diet culture and outside influence. It's having full permission to eat whatever. Nothing is going to make you a bad person if you eat it. And then it's monitoring your fullness, which is just a very fancy way of saying, "You eat when you're hungry. No matter what, you do it. And when you're full, you acknowledge that." Some people think it means you can only eat according to hunger and fullness and there's no leeway for when you're just like, "Oh my God, that cookie looks really good." But of course you're allowed to do that. I have a friend who's an intuitive eating coach who references birthday cake as a case of "emotional eating" that's important. We eat birthday cake for a reason. If you're really not in the mood and you don't feel like it, you don't have it. But you don't eat birthday cake because you're like, "My body really needs some birthday cake right now, and I'm going to honor that," and that doesn't make it illegitimate.

Can intuitive eating include consideration for what is healthiest to put in your body? Or is that its own form of disordered eating?

That was my fear when I started. I really thought I was just going to eat pizza until I died of pizza. But the truth is, when nothing is off limits and you're eating mindfully, you realize pizza is good, and then too much pizza doesn't feel good. And if you have a sense that there will always be pizza, you can always get more pizza, and it isn't going anywhere, you're not going to have that need to eat all the pizza when it's in front of you. That sense of deprivation is really what leads to overeating. Once you have a sense of security around food and you're not approaching food with a sense of deprivation and fear, absolutely, you can think of nutrition. I'll have the thought sometimes, "There aren't really a lot of vegetables in my breakfast this morning. I've got to get some greens in there." I know my body feels better when I have roughage in it. Most people who have been dieting their whole lives have a sense of what's healthy and what's not, and the issue is that a lot of the ideas we form are a perversion of health. Instead of being like, "I don't want to eat too much sugar because obviously I know that eating cups and cups of sugar won't make my body feel good and function well," we end up thinking, "Sugar is Satan, sugar is the devil, and if you eat sugar, I'll call the police." That sense of moralization around food and healthiness is different from wanting a vegetable on my plate.

The association between food and morality was a major theme in your book. You wrote that while you were dieting, you viewed eating after 7 p.m. as "a crime on par with infanticide." How do you think we learn to assign moral value to food?

Man, we should write a thesis on that. I'm sure there is one somewhere. But, for one thing, we associate thinness with goodness and restriction with being good. "I'm being good this week" — that kind of thing. So, our language and the way we perceive and talk about food totally foster that connection. On the flip side, we hear about junk food and garbage food and poison — "gluten is poison," people say — and so we take it to extremes. That's the other side of the perversion of healthiness: that sense that I'm a bad person if I eat refined bread. It's really hard to let go of that. Also, we have what we're told growing up: what you should eat, when you should eat, what you shouldn't have too much of. If I ate too much of something that my mom or dad didn't want me to eat, I felt bad. I felt like I did something wrong. It starts very early, and the way we talk about food in the media totally feeds into that moral discussion.

Do you think that's a problem for women especially?

I think it is for everybody, but certainly women are far more targeted by the diet industry. Obviously, men pick up on these things a lot, and we don't talk about it with men. I don't think a lot of men feel as comfortable talking about these things. There's a lot more crossover than there used to be, but I think there's an emphasis on women. If you look back, we associate women with morality, so why wouldn't we associate women with purity and health and thinness?

What do you mean when you say we associate women with morality?

I'm thinking of the righteous wife, "pure as the driven snow," all those archetypes of women in literature. Women are often described as virgins or whores, and those are just the opposite sides of morality.

It seems like there are two opposing movements in this country right now. On the one hand, there's the push away from moralizing food and bodies, toward body positivity and fat acceptance. On the other, there's this fear of the so-called obesity epidemic, which some medical professionals and public figures like Michelle Obama are trying to combat. Do you think there's room for both, or is concern for health just fat-shaming in disguise?

The health-at-every-size movement has shed light on some interesting things, and this is an ongoing exploration, but the fact that people react the way they do to fat positivity and health-at-every-size should tell us something. The way we associate size and health is clearly wrongheaded. That reaction is vitriolic and hateful and just speaks to this enormous bias that's absolutely endemic across our culture. That is something we need to take a good, hard look at. We should be addressing the foods we're feeding ourselves and our kids, but we also need to be acknowledging and thinking about the way we feed them and the way we feed ourselves. We all know there are some people who are just going to be bigger naturally and some people who will be smaller naturally, and there are some people who are going to be bigger because they are eating poorly or they're eating in a disordered way or they're just overeating. There are a lot of reasons behind that, but it's not as simple as "fat person = bad, unhealthy person." That moralization of health is in there because there's nothing but shame and disgust in the way we talk about fatness. We have to address a lot of the foods we eat but also the way we talk about our bodies. If anything's an emergency, that's an emergency as well. But the quote-unquote obesity epidemic is a complicated matter, and I have no question that the people behind that are trying to do something good. Nobody wants American children to eat piles of mac and cheese nonstop and no vegetables. But why aren't we talking about that instead of saying, "We have to fix our fat kids"? We now know that shaming and restriction haven't gotten us very far. They have never made anybody any thinner or healthier or better. Why isn't that part of the conversation more? That needs to be a global discussion as well.

It seems like your experience with disordered eating without a diagnosable eating disorder is really common. What are some signs that your eating habits could be disordered?

That's a really important distinction a lot of people don't recognize yet. A lot more people have disordered eating than eating disorders. If any part of your day or your life is anchored around food, that's a disordered behavior. If you have a moral attachment to a way of eating, that's a disordered behavior. It's a little complicated because there are people like ethical vegans who aren't necessarily eating in a disordered way even though there is a morality there. But I think you have to take a hard look at yourself, which is very difficult, and say, "Am I engaging with this food in a way that is neutral by treating it like food, or am I treating it like something else?" I think food neutrality is the barometer there.

Could you explain what you mean by "food neutrality"?

The example I use in the book is that French fries were the ultimate food on my bad list, versus kale, which was like the saintliest of greens and just feels like you're eating Gwyneth Paltrow's blonde hair. It's like you're consuming something so pure and good and it's making you better, whereas you have to eat French fries in a dark room alone so nobody can see this terrible crime you have committed, and you have to repent with kale and eat it in the presence of a thin person. That's the food neutrality issue: when you're treating food as something other than food. Even people who use words like "I'm so bad! I just ate a ton of French fries" — in the end, if it's not haunting you but you're still like, "Well, I ate the French fries. That's too bad I ate a huge pile of French fries," you're still engaging with it like it's not just food but the thing defining your moral character. That's really difficult to let go of. I can't look at a potato and not see Weight Watchers points. My brain just has that data, so I have to consciously look at a potato, and I hear the points in my head, and I tell myself the rest of the story about the potato: "This is the potato. It's a potato. It is not something I'm going to have to make up for. It's a vegetable. It's a starch. Do I want it? Am I hungry? Do I crave it?" I ask myself a bunch of questions to bring it back to what it is: just a potato, nothing more.

What did you gain from dieting that made it so hard to give up? How do you get those things now?

The thrill. The thrill of the new diet and the fantasy that comes with it. Every time you start a new diet, you really believe this will be the one to change your body and that everything in your life will be a million times better and more magical when you're done. They don't say, "When you're finished with this, you'll have a pony," but it's basically implied. They imply that everything will be wonderful. And then, of course, there's the comfort of failure. It's all very familiar to start something new and then have some success and then stumble or plateau and then fail and then just reverse to that binge-y period in between diets. I won't say I don't miss that structure. I don't want it, but it was a loss in my life. And I'm glad. I willingly sacrificed it because now I live a real life without the structure of the fantasy and the failure and everything so linear and all tied to how much I ate in a day. Now I live a much more complicated life. It's a more full life, but I miss the simplicity sometimes. I miss the goals. I miss not having to think about it so much. So I have sympathy for myself and everybody else who still does this because it's not just about the desire to lose weight. It's about that cycle as well. I don't have something that promises me a perfect future 30 pounds from now anymore, but that's okay because that's not a sustainable pattern, and that wasn't getting me anywhere. That was keeping me stuck in limbo for a really long time, and now I live in the reality of day to day, which is sometimes great and sometimes shitty and sometimes boring, but it's real. It's the real deal.

Do you think the Internet makes it harder to remain sane about food? For example, your book describes feeling overwhelmed with options when you order from Seamless, and you recount Internet trolls making you self-conscious.

What has the Internet not affected in our lives? It makes everything better and worse, doesn't it? I think the Internet has really fetishized food even more. We have all these food channels, and we have this crazy bacon obsession. It's very extreme: "Oh my God, the bacon bonanza and the pork belly! What's the most fatty, decadent thing I can eat?" versus "CLEANSING!" Social media has fed into that. I always want to take a picture of my sandwich. Every time you do that, you're making food more than food. It took me a really long time to recognize that as maybe not the healthiest impulse.

Is loving food just as destructive as fearing it, then?

There's a difference between loving food and having your life centered on it — because, man, there are those meals that really are special, and sometimes you have that life-changing burger or that thing your mom used to make that you'll always remember from childhood. That is a normal and very okay thing to have in your life. But there's a difference when every food has that intense vibration, when it stands out and you can't just pick up an apple and eat it because that's what you'd like to eat right now. So, yes, I think you can love food and not be obsessed with it. I still have foods that I absolutely love. I've even written about the "food porn" Instagram. You have to define the line for yourself, but there are times when it's perfectly acceptable to be like, "Look at this goddamn beautiful cake I made" or "This pasta is wowing me." Food is part of our lives, and we should celebrate it, but we have to be conscious about what we're doing.

What other advice would you give people who want to develop a healthier relationship with food and their bodies?

For starters, it's always good to get some help if you can, at least in the beginning. If I had just done this myself, I wouldn't have been held accountable. I needed somebody to remind me what I was doing, to remind me to trust myself. Whenever you feel alone, things are a lot more painful and harder to commit to. Help is available, even if it's just picking up the Intuitive Eating book and going online to an intuitive eating community. While there's a "pro-ana" community that people rely on to maintain their commitment to anorexia, there's a much bigger and healthier community around body positivity. But you have to seek it out because you're not just going to turn on the television and see body-positive representations. I'm certainly not the only one talking about these things. There's plenty of people of all shapes and sizes and genders and backgrounds. So make an effort to immerse yourself in literature and blogs and people and a culture that support what you're doing.

If you could go back in time and talk to the version of yourself that was still dieting, is there anything you would say to spare yourself the pain of learning all these things the hard way?

I think about that sometimes, but at that time I probably wouldn't have been ready to hear it and would have just rejected it. It would be amazing, though, if I could have just shaken myself by the shoulders and said, "You think your life's going to change with this diet, but really, nothing is going to change until you stop this. Until you really, really stop this." I wasted so much of my life treading water in this cycle and feeling like I couldn't do anything until I fixed this problem that was my body. If I could, I'd tell myself, "Just go out and start your career and date people and live your life. Don't wait until anything. Don't wait."

Published on December 26, 2015 12:30

A new treatment for multiple sclerosis: “The drug has dramatic effects on relapsing MS”

Scientific American

Symptoms come and go in most cases of multiple sclerosis (MS), a chronic disease in which the immune system attacks myelin, the nonconductive sheath that surrounds neurons' axons. Yet 10 to 15 percent of cases are progressive rather than relapsing. This more severe version appears later in life and is marked by steadily worsening symptoms. No treatments are currently available for this form, but that might be about to change. In September pharmaceutical company Hoffmann–La Roche announced positive results from three large clinical trials of ocrelizumab, an injectable antibody medication that targets B cells, for both relapsing and progressive MS. The trials found that the drug was more effective at treating relapsing MS than interferon beta-1a (Rebif), a top-performing drug now used to treat the disease. Even more exciting, it slowed the advance of symptoms in patients with progressive MS, as measured by disability progression sustained for at least 12 weeks. "The drug has dramatic effects on relapsing MS, and we finally have our foot in the door with the progressive form," says Stephen Hauser, a neurologist at the University of California, San Francisco, who was involved in the trials. The fact that ocrelizumab works on both types of MS is a tantalizing clue for scientists trying to understand the root causes of the disease and figure out why the inflammation of the relapsing form eventually turns into progressive degeneration in some patients. "These results give evidence that the inflammatory and the degenerative components of MS are related," Hauser says. "The big question now is, If we begin treatment really early, can we protect relapsing patients from developing the progressive problems later on?" With these trials, Roche has cleared the last major hurdle in the FDA's drug-testing protocol. The company plans to file for approval to treat both forms of MS in early 2016, which means the drug could be on shelves as soon as 2017.

Published on December 26, 2015 12:00

We have Lincoln wrong: Our greatest Lincoln historian explains his real Civil War motivations

The United States has just concluded a five-year observance of the sesquicentennial of the Civil War. As in the past, most new books about the period have focused principally on matters military, reexamining the familiar major battles or offering new biographies of generals of the war. A few have explored new aspects of Lincoln’s life and presidency and the political conflicts immediately preceding and during the war. For all the merits of these recent volumes, too few have provided satisfying answers to an essential question: why was the Civil War really fought? This subject still cries out for serious and informed exploration and analysis. The prevailing arguments—that the war occurred to preserve the American Union for its own sake, to defend or destroy slavery, or to expand or restrict federal authority—fall short because they do not embrace the full vision for the future held by those engaged in the conflict. The most illuminating way to begin this essential conversation is to focus on the commander in chief who chose war rather than cede the democracy to those who would divide it rather than recognize its legitimacy. That ever-compelling figure, of course, is Abraham Lincoln. True, Lincoln has already inspired thousands of books. Yet while scores of new Lincoln volumes rolled off the presses during the period leading up to the bicentennial of his birth in 2009, and dozens more have appeared to coincide with the sesquicentennial of the years 1860–1865, only a few have actually dealt with the causes of the conflict—the conflagration that consumed nearly every day of his presidency and cost 750,000 American lives. Few have explored Lincoln’s motivations for fighting the war and maintaining the Union when the conflict expanded exponentially from a small struggle to an enormous war unprecedented in world history. The unanswered question remains more crucial to our own present and future than ever. Why would a basically peaceful man who might as easily have allowed the United States to divide in two, with no resulting loss of life or treasure, choose instead to lead a devastating American-versus-American war to maintain a fragile, still-experimental Union? This book offers a direct answer to that unresolved question with a new focus and a new emphasis. For too long, historians have accepted without challenge the notion that Lincoln determined to preserve the Union primarily because nationhood held a powerfully symbolic, almost “mystical” importance to him from childhood on. Fueled by Weems’s Life of Washington and similarly hagiographic stories of the American Revolution, the young Lincoln is said to have developed early a stubborn passion to cement the foundations of the Republic for all time. Another theory holds that Lincoln entered the presidency—and allowed the country to go to war with itself—to remove the stain of slavery that for more than fourscore years had blighted the original American commitment to “life, liberty, and the pursuit of happiness.” Elements of truth support both arguments, to be sure, but ignore the overwhelming evidence that Lincoln focused his entire political career, in peace and war alike, in pursuit of economic opportunity for the widest possible circle of hardworking Americans. To achieve this ambition he was willing to fight a war to maintain the perpetual existence of the one nation in the world that held the highest promise for people dedicated to this cause. 
The United States has just concluded a five-year observance of the sesquicentennial of the Civil War. As in the past, most new books about the period have focused principally on matters military, reexamining the familiar major battles or offering new biographies of generals of the war. A few have explored new aspects of Lincoln’s life and presidency and the political conflicts immediately preceding and during the war. For all the merits of these recent volumes, too few have provided satisfying answers to an essential question: why was the Civil War really fought? This subject still cries out for serious and informed exploration and analysis. The prevailing arguments—that the war occurred to preserve the American Union for its own sake, to defend or destroy slavery, or to expand or restrict federal authority—fall short because they do not embrace the full vision for the future held by those engaged in the conflict. The most illuminating way to begin this essential conversation is to focus on the commander in chief who chose war rather than cede the democracy to those who would divide it rather than recognize its legitimacy. That ever-compelling figure, of course, is Abraham Lincoln. True, Lincoln has already inspired thousands of books. Yet while scores of new Lincoln volumes rolled off the presses during the period leading up to the bicentennial of his birth in 2009, and dozens more have appeared to coincide with the sesquicentennial of the years 1860–1865, only a few have actually dealt with the causes of the conflict—the conflagration that consumed nearly every day of his presidency and cost 750,000 American lives. Few have explored Lincoln’s motivations for fighting the war and maintaining the Union when the conflict expanded exponentially from a small struggle to an enormous war unprecedented in world history. The unanswered question remains more crucial to our own present and future than ever. 
Why would a basically peaceful man who might as easily have allowed the United States to divide in two, with no resulting loss of life or treasure, choose instead to lead a devastating American-versus-American war to maintain a fragile, still-experimental Union? This book offers a direct answer to that unresolved question with a new focus and a new emphasis. For too long, historians have accepted without challenge the notion that Lincoln determined to preserve the Union primarily because nationhood held a powerfully symbolic, almost “mystical” importance to him from childhood on. Fueled by Weems’s Life of Washington and similarly hagiographic stories of the American Revolution, the young Lincoln is said to have developed early a stubborn passion to cement the foundations of the Republic for all time. Another theory holds that Lincoln entered the presidency—and allowed the country to go to war with itself—to remove the stain of slavery that for more than fourscore years had blighted the original American commitment to “life, liberty, and the pursuit of happiness.” Elements of truth support both arguments, to be sure, but ignore the overwhelming evidence that Lincoln focused his entire political career, in peace and war alike, in pursuit of economic opportunity for the widest possible circle of hardworking Americans. To achieve this ambition he was willing to fight a war to maintain the perpetual existence of the one nation in the world that held the highest promise for people dedicated to this cause. Lincoln’s decision to resist Southern secession and fight a war to maintain the American Union was motivated primarily by his belief that the nation was founded on the idea that this country “proposed to give all a chance” and allow “the weak to grow stronger.” The toxic combination of secession together with an unending commitment to unpaid human bondage by a new and separate Confederate nation, he calculated, would be fatal to the American Dream. It posed a direct threat to a self-sustaining middle-class society and to the promise of America leading the way to spreading the idea of opportunity and upward mobility throughout the world. “I hold the value of life is to improve one’s condition,” Lincoln declared just three weeks before assuming the presidency, reiterating a lifetime of similarly expressed commitment to what historian Gabor Boritt brilliantly calls the uniquely American “right to rise.” Seven slaveholding Southern states had already declared by their independence the converse: the right to establish a nation of their own based on the denial of opportunity. Lincoln believed that the American nation based on the credo of opportunity for all was worth fighting for. “Whatever is calculated to advance the condition of the honest, struggling laboring man, so far as my judgment will enable me to judge of a correct thing, I am for that thing,” he said in 1861. In the face of unimaginable casualties and devastation, he remained for “that thing” for the rest of his life. The origin, depth, and durability of Lincoln’s commitment cry out for new exploration and interpretation, particularly now, as the ability to rise is being challenged in the United States by economic, social, and political conditions producing ever-increasing inequality. We Americans believe we so fully understand Abraham Lincoln’s contribution to our nation’s beliefs about slavery and freedom that his role in shaping our uniquely American vision of a just and generous economic society has been largely neglected. 
In fact, Lincoln was unwavering in his commitment to preserve the American Dream of economic opportunity for future generations, a dream he lived by escaping the poverty of his childhood and one he advocated throughout his political life. It was this commitment that lay behind his determination to ensure that a government dedicated to providing economic opportunity for its citizens “shall not perish from the earth.” Lincoln largely fought the Civil War over this principle, establishing a role for government in securing and guaranteeing economic opportunity for its citizens, a guarantee that has remained at the center of political debate and discord ever since, seldom so acrimoniously as today. Lincoln was the first president to use the federal government as an agent to support Americans in their effort to achieve and sustain a middle-class life. Even as the Civil War commenced, Lincoln supported a program of direct government action to support his vision of America’s middle-class society. More than is often realized, the Civil War was fought not over the morality of slavery or the abstract sanctity of the American Union, but over what kind of economy the nation should have. It is difficult to grasp the degree to which the United States, on the eve of the Civil War, had truly evolved into what Lincoln called, quoting scripture, a “house divided”: virtually two separate nations based on very different economic structures. More than anything else, the secession crisis and the Civil War became a clash over expanding the economic and social system of either section. The question became: which economy and society would define the future of America as it migrated westward, that of the North or that of the South? The American economy in the North before the Civil War supported a largely middle-class society. With almost unlimited natural resources, most Americans in the Northern states and northwestern territories had the opportunity to secure a middle-class life. Unlike most European countries and the American economy of the South, there was no aristocratic economic tradition in the North. Farmers owned their own land, craftsmen operated independent businesses, and doctors, lawyers, and other professionals maintained their own practices. Wealth was not concentrated in a few hands, and economic opportunity for adult white men was widespread. What Lincoln feared most was the spread of the Southern economic system. The fear was that the Southern slave-labor system would drive out free labor, first in the West, then later in the country as a whole. The fear was that the American Dream of unlimited economic opportunity—“a fair chance, in the race of life”—would no longer be available to future generations of Americans. Lincoln believed the unique purpose of the United States was to clear the path for the individual to labor for himself and get ahead economically. He called it the “laudable pursuit” of economic advancement. Lincoln understood that this purpose was challenged by the slave-based, aristocratic economic and social system of the Southern states. It was this dichotomy that created a house divided: two separate societies based on very different economic and social structures. Lincoln saw saving the Union not simply as a political objective but as a moral imperative to secure for the America of the future the democratic society of the Northern states, what we have come to call the American “middle-class” society. This was the moral imperative that made him willing to fight the Civil War. 
Lincoln was one of the first American leaders to fully grasp that economic opportunity to rise to the middle class was, in truth, the defining feature of America. More than any other president, Lincoln is the father of the American Dream that all Americans should have the opportunity through hard work to build a comfortable middle-class life. For Lincoln, liberty meant, above all, the right of individuals to enjoy the fruits of their own labor, which he saw as the best path to prosperity. Lincoln believed that the greatest evil of the Southern slave system—aside from the denial of liberty itself—was that it effectively blocked this economic pathway forever for white workers, who could not compete with slave labor, and for the slaves themselves, who could never hope to escape their bondage and eventually work for wages. Slavery itself, Lincoln believed, was morally repugnant and a stain on the founders’ vision that all men were created equal. But his commitment to economic opportunity was what spurred him on the path toward emancipation. It is crucial to remember that long before he was willing to entertain political or social rights for African Americans, including citizenship, voting rights, or racial equality, Lincoln insisted that African Americans were entitled to the same economic rights as all other Americans. This book explores Abraham Lincoln’s struggle to preserve, and ultimately redefine, the exceptionalism of the American experiment. Lincoln’s vision evolved from his personal experiences. His perspective was that of a man born into abject poverty who worked his way up the social and economic ladders through sheer discipline, persistence, and force of will. It was a perspective he never lost. It shaped his core values. As he put it, “The prudent, penniless beginner in the world, labors for wages awhile, saves a surplus with which to buy tools or land for himself; then labors on his own account another while, and at length hires another new beginner to help him. This is the just, and generous, and prosperous system, which opens the way to all—gives hope to all, and consequent energy, and progress, and improvement of condition to all.” This was, for Lincoln, the raison d’être of America and what made America a model to nations throughout the world. This subject could not be presented at a more opportune time as we confront anew—often at a decibel level that seems only a single notch lower than another civil war—the basic question of what defines our American nation. Americans today continue to pay lip service to the idea of a middle-class society. But there is no way to avoid the data that confirm the rising tide of income inequality in the United States as well as the rest of the world. Are we condemned to a new society with an ever-declining middle class? Or can we find our way back to public policies that nurture a reinvigorated middle-class society, a society that restores Lincoln’s commitment to a nation that is not only “of the people” and “by the people” but also “for the people”? 
Understanding Lincoln’s lifework challenges us to confront the ever-growing fragility of our present condition and the challenge of a new century to complete what Lincoln called, at Gettysburg, America’s “unfinished work.” This book recalculates the foundations of Lincoln’s political faith, examines the philosophical commitments that undergirded his actions in the secession crisis of 1860 and 1861, underscores the development of Lincoln’s rhetoric into American nationalistic gospel, and describes the wildly varying efforts by his White House successors to align with, interpret, ignore, or co-opt his message. We seek to unravel the complete legacy of politics from Lincoln’s time to ours to define its influence and assess the importance of Lincoln’s enduring and ever-challenging call to action. We have self-consciously let Lincoln speak for himself on the issues he was most concerned about. Rather than offering short phrases from Lincoln’s speeches and writings, we present long quotations to provide our readers with the contextual framework that was critical to Lincoln’s thoughts and arguments. In writing this book, we have learned much from the many excellent scholarly studies of the role played by Lincoln in his lifetime. We were both fortunate to work over the years with David Herbert Donald, whose outstanding biography provides the frame of reference for any serious scholarly work on Lincoln. We have also brought to bear the work we have individually done in our previous studies of Lincoln’s life and legacy. In this book, we have utilized some of the thoughts and some of the language of our previous individual works about specific aspects of Lincoln’s life and legacy. We hope our readers will benefit from the way in which we have put Lincoln’s actual words and our own earlier writings in the context of our new understanding of Lincoln’s continuing role in the future life of our nation. The question on the eve of the Civil War was whether the democratic system envisioned by the nation’s founders would survive. Lincoln had long endured in a “house divided” between two ways of life. On the one side was a Northern middle-class society honoring labor and offering multiple opportunities for economic advancement by ordinary people, where government was assuming an increasingly constructive role in “clearing the path” for economic success. On the other side was a Southern aristocratic society rigidly divided between rich and poor, ensuring through law and oppression that labor—white and black—remained fixed in place, devalued and cheap, dedicated to an unfettered market, neglectful of the public sector, and offering few opportunities for ordinary people and none at all for a whole race of human beings. For Lincoln, the choice, painful as it became, was never a hard one. Until his dying day, fulfillment of the American Dream remained what Lincoln called at Gettysburg his “unfinished work”—and America’s. Excerpted from "A Just and Generous Nation: Abraham Lincoln and the Fight for American Opportunity" by Harold Holzer and Norton Garfinkle. Published by Basic Books. Copyright 2015 by Harold Holzer and Norton Garfinkle. Reprinted with permission of the publisher. All rights reserved.

Published on December 26, 2015 11:00

2015: The year in political theater, from Trump to ISIS — and then there’s the scary stuff

You might think that any discussion of 2015 as political theater begins and ends with Donald Trump, with a whole lot of cheesy Trump-burger stuffed into the middle and rich, chocolatey Trump sauce poured over the top. You would not be entirely wrong. We might as well understand Trump’s presidential campaign as a performance—and, honestly, how can we do otherwise? Every single one of the thousands of articles published this year about the irresistible rise of this impossible candidate begins from the presumption that it’s a performance first, and other things later. So if we start there, Trump is all-purpose tragicomedy, a guy playing King Lear and the Fool and Henry V and, I don’t know, the moronic cop from “Much Ado About Nothing,” all at the same time. He’s the clown in the circus and also the lion, the diabolical ringmaster and also the guy in a silk hat who’s buying the circus and tearing it down to build another ghastly condo tower. Addictive as the Trump spectacle may be, if we stop there we have essentially surrendered to the Trumpian worldview, which holds that only the spectacle of politics matters, and the substance is illusory or nonexistent. That’s a salutary perspective in many ways, but politics is not just about the gang of idiots who run for elective office. They’re the stars, but not the writers and directors. And political theater can mean much more than empty pandering. It can be an instrument for change, beneficial or otherwise, and we saw that in 2015 as well. As the year began, the nation was captivated by political theater of a very different kind, diametrically opposed to Trump-mania: the wave of Black Lives Matter protests that rippled outward from Ferguson, Missouri, after the death of Michael Brown the previous August. Whether the activist outrage of BLM has accomplished meaningful change, and how much, remains to be seen. But there’s no doubt that the movement and its moment at least temporarily focused national attention on an issue many white people had actively avoided confronting. The role played by BLM protesters as cop-killing, America-hating villains in the fantasy universe of Bill O’Reilly and Sean Hannity makes that clear enough. BLM and its causes certainly did not disappear as the year progressed, and the street theater of Ferguson and Baltimore helped to galvanize racial protests that upended the normal social order at the University of Missouri, Yale and numerous other colleges and universities later in the year. But we’re getting ahead of ourselves. BLM was abruptly shunted off the stage early in January by another form of radical political theater whose goals are nebulous, but do not include democratic social reforms. I’m talking about the Charlie Hebdo attacks in Paris, which fundamentally defined the character of 2015, and whose consequences go far beyond anything Donald Trump has ever done or will ever do. Is it obscene to consider the crimes of ISIS, and those of its acolytes and copycats, as modes of political theater? I don’t think so. The fact that real people died does not mean that those events had no symbolic power; the true object of terrorism, after all, is not the people who are killed but those who are terrorized. Even in bringing this up, I’m compelled to fend off the notion that ISIS is the most terrifying thing in human history, or at least the very worst thing in the world at this moment. That testifies to the would-be caliphate’s enormous success as theater. 
With a handful of gruesome YouTube videos and two murderous attacks on the cultural capital of Europe, ISIS has provoked a worldwide reaction grossly out of scale to the actual danger. Unlike al-Qaida, whose symbolic messaging was always hokey and tedious—Osama riding his stupid charger, or delivering endless lectures on websites apparently designed in 1991—ISIS is a cultural phenomenon hatched within the Western media spectacle itself, like the monster in “Alien.” It looks menacing in large part because its political agenda is nonsensical, and isn’t even really an agenda. (By contrast, al-Qaida’s goals were coherent if unachievable: Western withdrawal from the Muslim world, and the defeat or destruction of Israel.) The Islamic State is a non-state, with none of the desires or behavior patterns of normal states. It doesn’t want to sign treaties or forge alliances or exchange ambassadors, not with Western nations, not with Russia or China, and least of all with its Muslim neighbors. Its only vision of the future is the Islamic zombie apocalypse, which is above all a theatrical vision drawn from the deepest nightmares of Western philosophy and psychology. In political terms, Trump benefited immensely from both Paris attacks and the thematically related mass shooting in San Bernardino. Whether or not he actually appears in an ISIS recruitment video, his theatrics and those of the would-be caliphate reinforce each other in a toxic feedback loop. With the resurgence of youth activism exemplified by BLM, the campus protests of early fall and the Bernie Sanders campaign, we can perceive glimmers of a more promising Third Way in political theater, a possible pathway through the swampy morass between Trump and ISIS. (It’s something like the Dead Marshes in “Lord of the Rings,” with ghostly faces—Joe McCarthy, Adolf Eichmann, Vlad the Impaler—leering in the stagnant water.) There are always minor players to consider, the ones who strut and fret on the national stage for 15 minutes (or less) and then are heard no more: Ben Carson had his moment of mild-mannered cluelessness, and Carly Fiorina looked like the thinking woman’s conservative candidate for a split second, until she was revealed as a pathological liar even by Republican standards. Lindsey Graham barely bothered to treat his own campaign seriously, and I miss him already. But the actors to watch out for are always the ones who appear not to be acting, and who announce that their performance is not a performance at all. (According to Jesse Eisenberg, Al Pacino once gave him the following advice about acting in the movies: “Just because they say ‘action’ doesn’t mean you have to do anything.”) Bernie Sanders’ garrulous anti-charisma is a key aspect of his appeal, while his principal Democratic opponent is a woman famous for being devoid of warmth and inept at retail politics. If Hillary Clinton’s performance persona is that of the banker who’s about to turn you down for a loan, while Sanders is the grandpa you just awakened from his nap, they are struggling to outdo each other in earnestness and sincerity, as if those qualities could not possibly be packaged or rehearsed. Across the aisle we have a dark horse Best Bad Actor contender coming up on the outside: Ted Cruz, with his Peter Parker haircut, his weaponized daughters and his pristine Western wear ensembles, which appear to have been purchased by his mother for the first day of school. 
Cruz delivers Trump’s paranoid-fantasy dialogue in the manner of an irritable suburban drama teacher telling his charges for the love of Mike not to leave their costumes lying on the floor. ISIS won’t want him for the recruitment video; he’s just too creepy. We can only hope Americans agree.

Published on December 26, 2015 09:00