Helen H. Moore's Blog, page 952

November 15, 2015

Let’s not get it wrong this time: The terrorists won after 9/11 because we chose to invade Iraq, shred our Constitution

What is terrorism? Many are convinced that the word is inherently so vague as to be meaningless. I have never understood this. To me the definition seems singular, and obvious, and it would appear that simply understanding it is the key to avoiding terrible missteps in the aftermath of an attack like the one in Paris. Terrorism is a tactic in which the primary objective is to produce fear, rather than direct harm. Terrorist attacks are, first and foremost, psychological operations designed to alter behavior amongst the terrorized in a way that the actors believe will serve them.

The 9/11 perpetrators killed about 3,000 people, and did about $13 billion in physical damage to the United States. That’s a lot of harm in absolute terms, but not relative to a nation of 300 million people, with a GDP of almost $15 trillion. It was a massive blow to many families, and to New York City. But to the nation as a whole that level of damage was about as dangerous as a bee sting. You may find that analogy suspect because bee stings are deadly to those with an allergy. But what kills people is not the sting itself. It is their own massive overreaction to an otherwise tiny threat that fatally disrupts the functional systems of the body. And that is exactly what terrorists hope to trigger—a muscular and reflexive response on the part of the victim-state that advances the perpetrators’ interests far beyond their own capacity to advance them.

The 9/11 attack was symbolic. It was not designed to cripple us economically or militarily, at least not directly. It was designed to provoke a reaction. The reaction cost more than 6,000 American lives in the wars in Iraq and Afghanistan, and more than $3 trillion in U.S. treasure. The reaction also caused the United States to cripple its own Constitution and radicalize the Muslim world with a reign of terror that has killed hundreds of thousands of Iraqi and Afghan civilians. The return on the terrorists’ investment was spectacular.
Assuming the official story is right, Al Qaeda got $7 million of effect for every dollar it spent on the attack: $7 million to one. The ratio of the harm the 9/11 attacks inflicted on U.S. targets to the financial harm the U.S. then inflicted on itself reflects the same amplification. For every $1 of damage they did to us, we did $231 to ourselves. For every American who was killed in the attack, we sacrificed more than two on the battlefield. And that is all before we consider the instability we brought to the Middle East, the harm we did to our own freedoms, and the spectacular cost to our reputation abroad.

The lesson, of course, is that above all else a nation should refuse to do what everyone will expect it to do in response to an attack. And if there is a silver lining, it is that one does not need to be sure of the identity or intent of one’s attackers to respond intelligently. Terrorists do not engage in terror attacks because they are strong. They engage in these attacks because they are weak. The gruesome spectacle of terrorism is a cost-saving measure in which the fears of the victims and onlookers amplify the resources that the terrorists themselves are able to deploy. Reacting reflexively is inherently self-defeating. If a nation wishes to make itself an unappealing target, then it should get its primordial fears under control.

We are not made safe from terrorists by helicopters, or missiles, or boots on the ground. Nor is it drones, torture or digital dragnets that protect us. What makes us as individuals safe from a terror attack is the staggering probability that we will be elsewhere when one occurs. Accepting a tiny chance that we will die at the hands of terrorists is a bargain price for freedom. Reconciling oneself to it is very much like accepting a small chance that one will die on the highway, in exchange for the ability to travel at will. There is much we do not know, and much we may never know, about ISIS and its objectives.
We can, however, be sure of this: ISIS would like the citizens of the West to surrender their liberties while lashing out blindly into the dark. This time, let’s not.

Bret Weinstein is a professor of evolutionary biology at The Evergreen State College in Olympia, Washington.

Published on November 15, 2015 07:15

November 14, 2015

Here’s what I’ve got for you, Kid: Lucky for my daughter, she’s half made up of her dad so the bad knees and slow metabolism aren’t a sure thing

A few months ago I paid $99 to spit into a cup. I sent the cup to a lab, and a few weeks later I got an email detailing my ancestral composition (mostly British, Irish and German. Absolutely no surprises there; I look, much to my dismay, like a less-murderous Ilse Koch). It was nice, having a tidy chart of my ethnic provenance, a map of my little store of DNA. I mean, it’s a dead boring chart, ethnicity-wise, but still.

I’ve passed this genetic bundle on to a new human. My daughter is, delightfully, a real 50-50 split between her father (Italian, Irish) and myself (Bitch of Buchenwald). She has my hair and eyes, and his full lips and upturned nose. Making the next step in a genetic through-line leads one to meditate on the nature of that through-line – of heritage, heritability, what it means to be of a people or from a place. In other words, what else is in this bundle I’m giving her?

Where I’m from is a non-starter in terms of my own identity, and certainly of hers. I grew up two towns east of Hartford, Connecticut (or, two units of dullness away from Dulltown, USA, if you prefer). I have no familial ties to the area – my parents both hail from Houston, and nobody in my family lives in Connecticut anymore. I left my town as quickly as I could and never looked back. I feel a pang, a sense of having missed out on something, when I hear people describe a sense of being from a place – like, you have a connection to the place where you were born? The sights and smells and rhythms of your home are a part of you, and not something to shed as quickly and ruthlessly as possible? That sounds nice. It is my hope, raising our child in Brooklyn, that she will feel an enthusiasm for her natal lands that I did not. It is entirely possible that she will disappear into the wilds of Montana to live out her days with only a Husky for companionship and curse us for raising her in the mass of humanity that is New York.

And what about ethnicity, anyway?
My connections to the Old Countries have long been stretched past the breaking point; both sides of my family have been in Texas so long that, my parents having moved North, Texas itself is the “old country.” Like a lot of suburban white people, my folkways consisted, more or less, of going to the mall and watching television. Go back far enough in parts of my family, and you get into some business I am truly not looking forward to addressing with my daughter.

I do not, in all seriousness, know how to tell her what kind of people she’s from. Boring, shading back into monsters? Will that do?

Other items in the grab bag: Bad knees. Slow metabolism. Stubby hands. Skin problems (already old hat over here; try Vanicream!). If she winds up with my frame, she’ll look pregnant her whole life. Oh, and a funhouse’s worth of mental disorders, from schizophrenia to psychotic breaks to plain ol’ depression, BUT EVERYTHING IS GOING TO BE JUST FINE DON’T YOU WORRY. I mean, it skipped me, so far, but the game’s not over quite yet – there’s still time for me to be bundled off to the cuckoo’s nest. I hope if it happens when she’s an adult, she puts me somewhere nice.

What’s not in the bag? I recently bought Martha Stewart’s handbook of how to do all the things around one’s house. I bought this book because I know how to do none of the things. This is not because we had any kind of housekeeper (we did not), but because things were more or less left undone or done wrong – not a hoarders-level kind of wrong, mind you, but with the exception of my father showing me how to sew on a button, the level of Important Life Skills being passed along was minimal at best. As a result I admire – worship, maybe – competence and skill in others; nothing pleases me more than watching somebody who knows what the hell they’re doing do a thing, whether it’s fixing a leak, hanging wallpaper or just playing pool. I crave this competence; I lack it sorely.
But then, I’m only about a quarter of the way through the Martha book.

My daughter is, as I write this, only 3 – certainly nowhere near formed, but aspects of who she might be as a person are shining through. She loves arranging things in tidy rows, a neatnik (or possibly OCD) thing she definitely didn’t get from me – but I see the pride shining in her neatnik (OCD?) father’s eyes. We brought her recently to a book event I did at a bookstore that has a bar in it, and she was gloriously in her element. Hugging strangers at a bar? That’s my kid, all right.

I’m trying to up my game in hopes of having some skills – besides dumb arts-related stuff – to pass on to my daughter. We’re unlikely to become Doomsday Prepper-level DIYers, but I can make sure she has as full a complement of Basic Grown-up Life Skills as possible.

We have a whole big beautiful city and a whole big beautiful world to give her, and my hope for her is that she will navigate it with grace and aplomb – especially the ugly parts. I can’t do much about the past, but I can raise her to be a kind and conscious person, aware of her privilege in the world and her duty to refit society in a way that levels the playing field. I can and will teach her to sew on a button. And lucky for her, she’s half made up of her daddy. I mean, he’s Canadian, after all.

Emily Flake is the author of "Mama Tried: Dispatches From the Seamy Underbelly of Modern Parenting."

Published on November 14, 2015 15:30

“Vagina voter”: Witnessing the sexism hurled at Hillary in ’08 — and the assumptions made about her supporters — changed my life

Hillary Clinton and her myriad personal and political experiences have made me a braver person than I used to be. Looking back on the 2008 primary campaign for the Democratic presidential nomination, I can connect the events that ultimately changed how I see myself as a political person.

For eight months, as a volunteer on Hillary’s campaign, I made numerous phone calls to super-delegates and members of the National Organization for Women. Until then, I’d never followed Hillary’s life or career too closely. I only became a volunteer after I saw the sexism that was thrown her way by fellow Democrats as she campaigned; that sexism prompted me to learn more about her as a person and as a candidate. I didn’t realize at the time how strongly sexism would ultimately shape Hillary’s campaign and its outcome, as well as my view of the political world.

At the time, I was working on a documentary film about the women’s liberation movement and had a full-time job in the film industry as a visual effects editor. My daughter was five years old; the time demands of motherhood were still new for me. Nevertheless, I managed to schedule my campaign volunteer work during lunch breaks and in the few hours I had after I got home. It was the first time I had ever volunteered for a presidential campaign. Was it hard? Yes. But it was transformational.

It was explained to me once, in the early days of my visual effects work on feature films, that although the work seemed difficult and abstract, one day I would understand its technical complexity in a simple way. The metaphor used was a light bulb being turned on in a darkened room. One day the switch would flip to “on” and I would fully understand. My growing awareness of how sexism impacted Hillary’s presidential campaign wasn’t achieved as easily as a switch turning on; it was a slow rotation, as with a dimmer switch. With each incremental turn, a cultural undercurrent that involved women and presidential politics was illuminated.
It changed me profoundly. The first turn of that political dimmer switch happened in my kitchen in 2007. I was watching Hillary’s online announcement to the nation that she was running for president. In the video, sitting on a couch and looking into the camera, she said, “I’m beginning a conversation with you, with America.” I had never heard of anyone announcing a presidential run in such an understated way, and it felt awkward. There was no man on a stage with his wife dutifully standing next to him to project an image of family. There was no mention of God. It was just her, alone, and she wanted to talk about our country. She wanted to “chat.” The image of her looking back at me was unique. That metaphorical dimmer switch turned up one millimeter.

The next event was in January of 2008, the evening of the first contest of the primary season: the much-anticipated Iowa caucus. (Iowa, at this point, was one of only four states that had never elected a woman to any national office.) I was watching Chris Matthews on the MSNBC news show Hardball, as I always did. Hillary had come in third in the caucus, and it was a shocker, as she had been expected to win. Matthews was full of bravado as he questioned whether Hillary should stay in the race. That hit me in my gut. Why would a pundit suggest she should quit at the start of the primaries? After all, both Bill Clinton and George W. Bush lost the coveted Iowa caucus when they ran for the presidency. I didn’t remember people calling for them to quit the race. Another millimeter.

Then on the heels of the Iowa loss came her New Hampshire win. That night, I got my daughter out of bed, and we watched Hillary’s victory speech together. I said to her, “She is going to be our next president.” That was odd. I had never said those words before. I could feel that dimmer switch turning up another millimeter. But something happened the next day on Hardball that would continue to happen on many shows after every state Hillary won.
Chris Matthews looked glum on the news panel of reporters and commentators who were discussing Hillary’s New Hampshire win, when he remarked, “the reason she may be a front-runner is her husband messed around. That’s how she got to be senator from New York.” He said it confidently, as if he were declaring a fact, somehow sure that a woman could never win a presidential primary or a U.S. Senate race on her own merit, without voter sympathy about a cheating husband. This dislike and, oftentimes, hatred for her by liberal progressives was something for which I was naively unprepared. I expected the venom to be delivered by Republicans and conservative pundits, but not by her supposed allies.

The sexualization of women who dare enter the male halls of authority is a common tactic to suppress female ambitions for the White House. And men aren’t the only ones who resort to it. When the liberal Air America radio show host Randi Rhodes called Hillary a “big fucking whore” and another Air America host, Stephanie Miller, continually referred to Hillary as “Mrs. Clinton” instead of Senator Clinton, these subtle and not-so-subtle attacks succeeded in doing two things: They relegated Hillary to a mere sexual being, and they erased her substantive political experience, both as United States senator and as First Lady.

Examples abound. Conservative MSNBC host Tucker Carlson said in July of 2007, “She scares me. I cross my legs every time she talks.” When discussing Hillary’s performance at one of the debates, MSNBC commentator Mike Barnicle said Hillary’s attitude made her “... [look] like everyone’s first wife standing outside a probate court.” This created a bonding moment for the all-male panel, as they laughed at an image comparing a serious presidential candidate to an annoying spouse. Another millimeter. The Hillary hate was also marketed.
When the Hillary Clinton nutcracker came along in 2008, it was advertised with the feature of “serrated stainless steel thighs that, well, crack nuts.” That, coupled with Carlson’s revelation, should have clued me in to the pairing of men and Oval Office politics as a clubhouse with a “No Girls Allowed” sign hanging on the door.

At the time, I was a daily listener to National Public Radio and a reader of progressive blogs, but their common use of sports analogies awakened my senses to a different way of interpreting political coverage. NPR correspondents would open news shows with biased lines about Obama inching closer to Hillary’s delegate count, with phrases such as, “He’s within striking distance.” Daily Kos bloggers wrote articles about the state primary dates with themes that reflected a boxing match: “... the DNC put a stop to these contests, thwarting her ability to land a knockout punch here.”

Why care about sports analogies? Because they are common in our traditionally male presidential campaign history. Reporters use sports symbolism to cover a race more easily; policy issues are complex, and contests are simple. When women enter races, they are expected to be one of the guys and participate in the language of sports, but that is a man’s game. Men have been building campaign traditions since the founding of this country. Boiling the primary race down into a sports contest repudiated Hillary’s solid experience—a major difference between her and Obama—and suggested that to win, she needed to “be one of the boys.” I could see that a powerful male context was flowing through American presidential politics. The dimmer switch illuminating how male-oriented our political system is was turning up toward full but wasn’t yet on all the way. The building blocks of our nation’s politics, however, were in greater view for me.
When my progressive friends offered that their line in the sand with Hillary was her Senate vote for the Iraq War Resolution, I noted that there had been no similar line in the sand for John Kerry in 2004, or for Senator Joe Biden when he was nominated as Obama’s vice presidential running mate in 2008. Both of these men voted for the Iraq War. This observation was usually met with silence. The dimmer switch was a millimeter away from full.

I also came to realize that a presidential nominee has to be likable and, alternately, an aggressor. This is easier for men to portray than for women because of historical archetypes. For the first time we saw a First Lady—the most traditional of political mother images—run for president, and people had to take the Norman Rockwell ideal of a fatherly leader of our nation, who sometimes reluctantly declares war, and replace it with the image of a mother. In Hillary’s case, she was both a mother and a senator who voted for the Iraq War Resolution. This is a combination of two archetypes, caregiver and ruler. For many, this created chaos.

With all the gendered criticism of her, as well as the reluctance to acknowledge her experience and qualifications, I stopped listening to Air America, NPR, and Chris Matthews. I un-bookmarked Daily Kos. Within a few months I even discontinued my cable service. I had to re-think everything I believed about partisan politics. I found the hatred for Hillary interesting because I thought that progressives would have been proud of Clinton’s experience as First Lady, notably her speech to the Fourth World Conference on Women in 1995 in Beijing.
At that now-famous event, Hillary made a high-profile speech about global women’s rights that included the oft-quoted line, “Human rights are women’s rights and women’s rights are human rights.” She spoke passionately about how, even though many people would try to silence the words of women on issues concerning the human rights of women and girls, freedom of speech on these issues was extremely important. Prior to her trip, the White House was nervous about China’s reaction to her speech, especially as she singled out that country’s silencing of women. She ignored that fear and proceeded with her plan. Her speech sent positive shock waves across the globe and was met with tremendous applause by both liberals and conservatives. Looking back now, we can see how forward-thinking her speech was on policy issues for women. It was the perfect blend of two images: leader and mother.

A few people angrily asked me why Clinton didn’t quit the race for president, as they thought she was standing in the way of Obama. I think what they really wanted to know was why I wasn’t quitting Hillary. Now the dimmer switch was turned up to full, because I had to answer this for myself. And this is where my shift in understanding took place. It was June of 2008, and it was the end of the primary race. My old hallmarks of progressivism, which had formed part of my identity, had dissolved, and I saw partisan politics more objectively. I had developed a new way of looking at the political world, especially with regard to women. All the writers, anchors, and politicians just looked like a deck of cards to me. In my mind I took the deck, placed it between my thumb and forefinger, and jettisoned the cards into the air. I didn’t look to see where the cards fell because I had already walked through the political looking glass. I was in the wilderness. I felt alone, and it was just a little bit cold.
But in reality, I was with the eighteen million voters who stayed with her, those cracks in the male ceiling of the Oval Office. We could all see differently now, after having experienced that campaign. Hillary’s presence and words were a powerful message to girls and women, and that is why she didn’t quit. And as the mother of a daughter, that is one of the reasons I didn’t quit her. To be sure, Hillary was fighting to win, but she also knew that those who came after her would need the memory and the images in order to move forward. As with her Beijing speech, she was ahead of the curve, but this was a rougher road.

The accusations that were hurled at Hillary were hurled at all of us. We were all called “bitter clingers,” “vagina voters,” “working class,” “old,” and “bitches” right along with her, and suffered the same dismissal she did. And some of the people who threw out those slurs were feminists.

Hillary’s 2008 campaign is now a snapshot in the collective cultural memory we all share. These memories help us form our identities as individuals and as citizens. Boys and men have the totality of presidential cultural memory reflected to them in the United States: Franklin D. Roosevelt holding up his hat, Dwight Eisenhower with arms held aloft, and John F. Kennedy with Marilyn Monroe. The history of male presidents is the gendered bedrock of power upon which we form our national identity, and women, without similar memories, sense their lack of power.

I grieved when Hillary lost the nomination. The night after she won in South Dakota, one of the last primary states in a campaign already lost, I dreamt about her. In the dream, Hillary was in the White House. I was with many women in a room waiting to meet with her. When it was my turn, Hillary stood in front of me. I held out my arms as if to receive something. She placed several Middle Eastern shawls and fabrics into my empty arms. I knew women had made them. I took them.
Several years later after the election, I finished my film about the women’s liberation movement and started to speak about how important it is to remember that movement and include it in our cultural memory. If we had had a cultural memory about female leaders in 2008, Hillary may not have been seen as an interloper in the male Oval Office. As soon as I released my film in 2013, I received an invitation to screen it in Islamabad, Pakistan, as a guest of the International Islamic University. Was I afraid to go? Yes. But after having lived through the 2008 presidential campaign, complete with its gendered rhetoric and undercurrent of dismissiveness of women, I knew how important it was for me to go. Hillary gave me the strength, and I traveled alone. I screened my film and spoke to an amazing group of Pakistani women who were in the midst of shaping feminism in their own country and wanted to learn more about American feminism. Later, with several of these Pakistani women, I went shopping in one of their open markets. I bought some beautiful fabrics, and as I took them in my arms, I remembered the dream and remembered that Hillary also had been to Pakistan. That dream wasn’t about me personally. Nor was the 2008 campaign just about the loss of my preferred candidate. There was a bigger picture developing here, and it was an image of Americans connecting with women in faraway places from the symbolic power of a woman in the Oval Office in a way that can’t happen if we elect another man to the White House. I know Hillary is a big part of this picture. I can see that clearly now because the lights are turned up brightly. Excerpted from "Love Her, Love Her Not: The Hillary Paradox," edited by Joanne Cronrath Bamberger. Copyright © 2015. Reprinted by permission of She Writes Press.Hillary Clinton and her myriad personal and political experiences have made me a braver person than I used to be. 
Looking back on the 2008 primary campaign for the Democratic presidential nomination, I can connect the events that ultimately changed how I see myself as a political person. As a volunteer on Hillary’s campaign, for eight months, I made numerous phone calls to super-delegates and members of the National Organization for Women. Until then, I’d never followed Hillary’s life or career too closely. I only became a volunteer after I saw the sexism that was thrown her way by fellow Democrats as she campaigned; that sexism prompted me to learn more about her as a person and as a candidate. I didn’t realize at the time how strongly sexism would ultimately shape Hillary’s campaign and its outcome, as well as my view of the political world. At the time, I was working on a documentary film about the women’s liberation movement and had a full-time job in the film industry as a visual effects editor. My daughter was five years old; the time demands of motherhood were still new for me. Nevertheless, I managed to schedule my campaign volunteer work during lunch breaks and in the few hours I had after I got home. It was the first time I ever volunteered for a presidential political campaign. Was it hard? Yes. But it was transformational. It was explained to me once in the early days of my visual effects work on feature films that although the work seemed difficult and abstract, one day I would understand its technical complexity in a simple way. The metaphor used was the light bulb being turned on in a darkened room. One day the switch would flip to “on” and I would fully understand. My growing awareness of how sexism impacted Hillary’s presidential campaign wasn’t achieved as easily as a switch turning on; it was a slow rotation, as with a dimmer switch. With each incremental turn, a cultural undercurrent that involved women and presidential politics was illuminated. It changed me profoundly. The first turn of that political dimmer switch happened in my kitchen in 2007. 
I was watching Hillary’s online announcement to the nation that she was running for president. In the video, sitting on a couch and looking into the camera, she said, “I’m beginning a conversation with you, with America.” I had never heard of anyone announcing a presidential run in such an understated way, and it felt awkward. There was no man on a stage with his wife dutifully standing next to him to project an image of family. There was no mention of God. It was just her, alone, and she wanted to talk about our country. She wanted to “chat.” The image of her looking back at me was unique. That metaphorical dimmer switch turned up one millimeter. The next event was in January of 2008. It was the evening of the first contest of the primary season, the much-anticipated Iowa caucus. (Iowa, at this point, was one of only four states that had never elected a woman to any national office.) I was watching Chris Matthews on the MSNBC news show Hardball, as I always did. Hillary had come in third in the caucus, and it was a shocker, as she had been expected to win. Matthews was full of bravado as he questioned whether Hillary should stay in the race. That hit me in my gut. Why would a pundit suggest she should quit at the start of the primaries? After all, both Bill Clinton and George W. Bush failed to win the coveted Iowa contest when they ran for the presidency. I didn’t remember people calling for them to quit the race. Another millimeter. Then, on the heels of the Iowa loss, came her New Hampshire win. That night, I got my daughter out of bed, and we watched Hillary’s victory speech together. I said to her, “She is going to be our next president.” That was odd. I had never said those words before. I could feel that dimmer switch turning up another millimeter. But something happened the next day on Hardball that would continue to happen on many shows after every state Hillary won.
Chris Matthews looked glum on the news panel of reporters and commentators who were discussing Hillary’s New Hampshire win, when he remarked, “the reason she may be a front-runner is her husband messed around. That’s how she got to be senator from New York.” He said it confidently, as if he were declaring a fact, somehow sure that a woman could never win a presidential primary or a U.S. Senate race on her own merit without voter sympathy about a cheating husband. This dislike and, oftentimes, hatred for her by liberal progressives was something for which I was naively unprepared. I expected the venom to be delivered by Republicans and conservative pundits, but not by her supposed allies. The sexualization of women who dare enter the male halls of authority is a common tactic to suppress female ambitions for the White House. And men aren’t the only ones who resort to it. When the liberal Air America radio show host Randi Rhodes called Hillary a “big fucking whore” and another Air America host, Stephanie Miller, continually referred to Hillary as “Mrs. Clinton” instead of Senator Clinton, these subtle and not-so-subtle attacks succeeded in doing two things: They relegated Hillary to a mere sexual being, and they erased her substantive political experience, both as United States senator and as First Lady. Examples of that abound. Conservative MSNBC host Tucker Carlson said in July of 2007, “She scares me. I cross my legs every time she talks.” When discussing Hillary’s performance at one of the debates, MSNBC commentator Mike Barnicle said Hillary’s attitude made her “... [look] like everyone’s first wife standing outside a probate court.” This created a bonding moment for the all-male panel as they laughed at that image, which compared a serious presidential candidate to an annoying spouse. Another millimeter. The Hillary hate was also marketed.
When the Hillary Clinton nutcracker came along in 2008, it was advertised with the feature of “serrated stainless steel thighs that, well, crack nuts.” That, coupled with Tucker’s revelation, should have clued me in that Oval Office politics was a men’s clubhouse with a “No Girls Allowed” sign hanging on the door. At the time, I was a daily listener to National Public Radio and a reader of progressive blogs, but their common use of sports analogies awakened my senses to a different way of interpreting political coverage. NPR correspondents would open news shows with biased lines about Obama inching closer to Hillary’s delegate count, with phrases such as, “He’s within striking distance.” Daily Kos bloggers wrote articles about the state primary dates with themes that evoked a boxing match: “... the DNC put a stop to these contests, thwarting her ability to land a knockout punch here.” Why care about sports analogies? Because they are common in our traditionally male presidential campaign history. Reporters use sports symbolism to cover a race more easily; policy issues are complex, and contests are simple. When women enter races, they are expected to be one of the guys and participate in the language of sports, but that is a man’s game. Men have been building campaign traditions since the founding of this country. Boiling the primary race down into a sports contest of sorts repudiated Hillary’s solid experience—a major difference between her and Obama—and suggested that, to win, she needed to “be one of the boys.” I could see that a powerful male context was flowing through American presidential politics. The dimmer switch revealing how male-oriented our political system is was turning up toward full but wasn’t yet on all the way. However, the building blocks of our nation’s politics were in greater view for me.
When my progressive friends offered that their line in the sand with Hillary was her Senate vote for the Iraq War Resolution, I noted that there had been no similar line in the sand for John Kerry in 2004 or for Senator Joe Biden when he was nominated as Obama’s vice presidential running mate in 2008. Both of these men voted for the Iraq War. This revelation was usually met with silence. The dimmer switch was a millimeter away from full. I also came to realize that a presidential nominee has to be likable and, alternately, an aggressor. This is easier for men to portray than for women because of historical archetypes. For the first time we saw a First Lady—the most traditional of political mother images—run for president, and people had to take the Norman Rockwell ideal of a fatherly leader of our nation who sometimes reluctantly declares war and replace it with a motherly image of a woman. In Hillary’s case, she was both a mother and senator who voted for the Iraq War Resolution. This is a combination of two archetypes, caregiver and ruler. For many, this created chaos. With all the gendered criticism of her, as well as a reluctance to acknowledge her experience and qualifications, I stopped listening to Air America, NPR, and Chris Matthews. I un-bookmarked the Daily Kos. Within a few months I even discontinued my cable service. I had to re-think everything I believed about partisan politics. I found the hatred for Hillary interesting because I thought that progressives would have been proud of Clinton’s experience as First Lady, notably her speech to the Fourth World Conference on Women in 1995 in Beijing. 
At that now-famous event, Hillary made a high-profile speech about global women’s rights that included the now oft-quoted line, “Human rights are women’s rights and women’s rights are human rights.” She spoke passionately about how, even though many people would try to silence women speaking out on the human rights of women and girls, freedom of speech on these issues was extremely important. Prior to her trip, the White House was nervous about China’s reaction to her speech, especially as she singled out that country’s silencing of women. She ignored their fear and proceeded with her plan. Her speech sent positive shock waves across the globe and was met with tremendous applause by both liberals and conservatives. Looking back now, we can see how forward-thinking her speech was on policy issues for women. It was the perfect blend of two images: leader and mother. A few people angrily asked me why Clinton didn’t quit the race for president, as they thought she was standing in the way of Obama. I think what they really wanted to know was why I wasn’t quitting Hillary. Now the dimmer switch was turned up to full because I had to answer this for myself. And this is where my shift in understanding took place. It was June of 2008, and it was the end of the primary race. My old hallmarks of progressivism that formed part of my identity had dissolved, and I saw partisan politics more objectively. I had developed a new way of looking at the political world, especially with regard to women. All the writers, anchors, and politicians just looked like a deck of cards to me. In my mind I took the deck, placed it between my thumb and forefinger, and jettisoned the cards into the air. I didn’t look to see where the cards fell because I had already walked through the political looking glass. I was in the wilderness. I felt alone, and it was just a little bit cold.
But in reality, I was with eighteen million voters who stayed with her, those cracks in the male ceiling of the Oval Office. We could all see differently now after having experienced that campaign. Hillary’s presence and words were a powerful message to girls and women, and that is why she didn’t quit. And as a mother of a daughter, that is one of the reasons I didn’t quit her. To be sure, Hillary was fighting to win, but she also knew that we needed the memory and the images to move forward for those who came after her. Like her Beijing speech, she was ahead of the curve, but this was a rougher road. The accusations that were hurled at Hillary were hurled at all of us. We were all called “bitter clingers,” “vagina voters,” “working class,” “old,” and “bitches” right along with her and suffered the same dismissal as she did. And some of the people who threw out those slurs were feminists. Hillary’s 2008 campaign is now a snapshot that is a part of our collective cultural memory from past events that we all share. These memories help us form our identities as individuals and as citizens. Boys and men have the totality of presidential cultural memory reflected to them in the United States: Franklin D. Roosevelt holding up his hat, Dwight Eisenhower with arms held aloft, and John F. Kennedy with Marilyn Monroe. The history of male presidents is the gendered bedrock of power upon which we form our national identity, and women, without similar memories, sense their lack of power. I grieved when Hillary lost the nomination. The night after she won in South Dakota, one of the last primary states in a campaign already lost, I dreamt about her. In the dream, Hillary was in the White House. I was with many women in a room waiting to meet with her. When it was my turn, Hillary stood in front of me. I held out my arms as if to receive something. She placed several Middle Eastern shawls and fabrics into my empty arms. I knew women had made them. I took them. 
Several years after the election, I finished my film about the women’s liberation movement and started to speak about how important it is to remember that movement and include it in our cultural memory. If we had had a cultural memory about female leaders in 2008, Hillary might not have been seen as an interloper in the male Oval Office. As soon as I released my film in 2013, I received an invitation to screen it in Islamabad, Pakistan, as a guest of the International Islamic University. Was I afraid to go? Yes. But after having lived through the 2008 presidential campaign, complete with its gendered rhetoric and undercurrent of dismissiveness of women, I knew how important it was for me to go. Hillary gave me the strength, and I traveled alone. I screened my film and spoke to an amazing group of Pakistani women who were in the midst of shaping feminism in their own country and wanted to learn more about American feminism. Later, with several of these Pakistani women, I went shopping in one of their open markets. I bought some beautiful fabrics, and as I took them in my arms, I remembered the dream and remembered that Hillary also had been to Pakistan. That dream wasn’t about me personally. Nor was the 2008 campaign just about the loss of my preferred candidate. There was a bigger picture developing here, and it was an image of Americans connecting with women in faraway places from the symbolic power of a woman in the Oval Office in a way that can’t happen if we elect another man to the White House. I know Hillary is a big part of this picture. I can see that clearly now because the lights are turned up brightly. Excerpted from "Love Her, Love Her Not: The Hillary Paradox," edited by Joanne Cronrath Bamberger. Copyright © 2015. Reprinted by permission of She Writes Press.

Published on November 14, 2015 14:30

“This is the first direct hit on music”: Bono reacts to Paris terror attack on Eagles of Death Metal concert

Saturday and Sunday evening U2 concerts in Paris have been canceled in the wake of Friday's terror attacks, in which 129 people were killed and more than 350 injured by Islamic State attackers across several sites in the city. The band was scheduled to perform at Bercy Arena this weekend, about three miles from the Bataclan, the venue where 118 people were killed by the end of a deadly hostage siege that took place during an Eagles of Death Metal concert. Rolling Stone reports that U2 frontman Bono spoke to Irish radio DJ Dave Fanning by phone today and said that the band had been rehearsing when the news of the attacks broke, and that while the decision to cancel the shows was not the band's, they are supportive: "It's up to the French authorities and the city to decide when we can go back." Bono then said the band's first thoughts are with the victims, especially those at the Bataclan for the concert. "If you think about it, the majority of victims last night are music fans. This is the first direct hit on music that we've had in this so-called War on Terror or whatever it's called," Bono told Fanning. "It's very upsetting. These are our people. This could be me at a show. You at a show, in that venue." Read more at Rolling Stone.

Published on November 14, 2015 13:39

Donald Trump’s callous response to Paris terror attack: Victims should have been armed

Leave it to blowhard presidential contender Donald Trump to deliver one of the most vile responses to yesterday's horrific terror attacks in Paris, which killed 129 people and injured hundreds more. Speaking in Beaumont, Texas, on Saturday, the GOP frontrunner blamed gun control for the scale of the casualties, echoing comments he made following the Charlie Hebdo attack in January. CNN reports:

Donald Trump said Saturday that the terrorist attacks in Paris "would've been a much different situation" if the city had looser gun laws.

"When you look at Paris -- you know the toughest gun laws in the world, Paris -- nobody had guns but the bad guys. Nobody had guns. Nobody," Trump said at a rally here. "They were just shooting them one by one and then they (security forces) broke in and had a big shootout and ultimately killed the terrorists."

"You can say what you want, but if they had guns, if our people had guns, if they were allowed to carry --" Trump said, pausing as the crowd erupted into raucous applause, "-- it would've been a much, much different situation."

Watch Trump's remarks below, via CNN:

Published on November 14, 2015 13:34

Blame it on the baby boomers: Yes, pretty much everything

Manners are the happy ways of doing things . . . ’tis the very beginning of civility,—to make us, I mean, endurable to each other. —Ralph Waldo Emerson, The Conduct of Life

As civilization began, it became apparent that the human race was imperfectly suited for it. Living together took some work. All these years later, there are innumerable examples to show that we’re still not terribly good at being civilized, from the student commuter who dashes in front of an elderly woman to claim a seat on the subway to how societies treat their poor and unemployed. The unchanging reality is that people tend to think of themselves first, yet the task of coexistence is made easier if they don’t. And so efforts arose many years ago to teach folks how to get along. The concept of grace—refined ease of movement and manner, as a way of pleasing, assisting, and honoring others—wove through this endeavor. Indeed, the term getting along itself, in the sense of being on harmonious terms, implies graceful behavior. It carries a hint of a dance, a peaceable duet, or the falling-in-step impulse that horses have with one another, which helps make them manageable. Grace and manners, the general principles of social behavior, have historically been entwined; each adds luster to the other. To trace the development of grace through time, where grace isn’t specifically mentioned, I’ve looked for an emphasis on the art of getting along. By that I mean manners that are aimed at harmonious interactions and creating a climate of warmth and appreciation, as opposed to formalities about fish forks and introductions, which are in the more detail-oriented domain of etiquette. Some of the world’s most influential books have been instruction manuals on the art of getting along, or what we’ve come to know as the social graces.
These include the oldest writings of the ancient era, the runaway best sellers of the Renaissance, and the must-reads of American colonists, revolutionaries, and early twentieth-century strivers with an eye for elegance and civilized living. Yet instruction in grace mysteriously dropped out of our lives a few decades ago. Well, “mysteriously” isn’t quite right. There is a pendulum swing in the history of manners, when one era comes up with rules and they grow more and more strict until another generation says, oh, just forget about it—this is ridiculous. And grace gets thrown out for being an act, insincere, phony. “We have the residue now, with well-meaning parents who say to their children, ‘Just be yourself,’” said Judith Martin, when I asked her why the social graces were in decline. Martin is the author of the internationally syndicated Miss Manners newspaper column and many books on etiquette. “What does that mean? Who would they be if they weren’t themselves? Parents don’t teach their children how to act out being glad for a present, or how to seem pleased to see someone they may not want to see. “Etiquette has long struggled with the opposing ideas of grace and naturalness, of appearing natural and being natural, which are two entirely different things,” she continued. This inherent paradox, of feeling one thing and saying another, leaves etiquette open to the charge of insincerity. “There is a disconnect in what you feel and what you ought to project, which is the opposite of sincerity. For example, the hostess who says, ‘Oh, don’t worry about it,’ when you’ve just broken her favorite lamp. Of course she cares about it, but the primary goal is putting the other person at ease. “People say etiquette is artificial. But what they really object to is the obviously artificial,” Martin said. “Yes, it is artificial and it’s often better than the raw expression of natural desires. 
Look at dance: Is human movement better when it’s totally untutored or is it better when you put thought and work into it?” Social grace, just like physical grace, requires work. That was the point of the conduct books from centuries past: to make it plain that correct behavior required effort and discipline. Being with people is an art like any other art, or a practice, if you will, just like cooking or riding a bicycle. The more you realize what smooths things over, what pleases people, and the more you want to be graceful and practice being graceful, the better and more convincing you will become. Grace will cease to be something you “act out.” But as with any learned activity, there are different degrees of polish here. There is the hostess who reacts to her broken lamp by saying, “Oh, don’t worry about it” through clenched teeth, making you feel terrible. And then there is one who reacts with grace, putting on a better act, perhaps. Maybe she’s a Meryl Streep, imperceptibly masking her true feelings with an Oscar-worthy portrayal of nonchalance. Or maybe she really hated that lamp and is glad it’s headed for the trash. Or maybe she is really and truly a happy-go-lucky angel on earth whose every impulse is upright and pure. It makes no difference to the embarrassed guest who just wants to be forgiven. He’s grateful for grace any way it comes. Grace lies in the manner in which the rules are followed, Martin says. “Do you follow etiquette rules to the letter, or do you make it seem as if they arise naturally from good feelings and it’s easy for you to say, ‘Oh, never mind, don’t worry about it’? It’s not easy for a dancer to leap into the air either, and we don’t see the bloody toes and the sweat from a distance. And in the same way, if she’s being graceful, we don’t see the hostess thinking, ‘Oh my gosh, this is going to cost me a fortune to fix.’” Let’s face it, if we all exposed our true feelings all the time, the world would be unbearable. 
Grace, as Martin put it, “is that covering through which we make the world pleasant.” And yet we’re in one of those extremes of the pendulum swing where honesty is overvalued and the brilliant act, the self-discipline, the training that produces grace has faded away. An accumulation of blows has led to its downfall, but they stem from a reaction against the overcomplication of everyday life that picked up strength in the 1950s and ’60s. The modern means of self-improvement turned from building up one’s character (a rather slow, internal, and never-ending process) to the far easier focus on things we can buy. Buying our way into the good life. With the surge in department stores and shopping malls, with ever-present advertising, with our voyeurism via television into the lives and possessions of others, shopping became the modern means of self-betterment. This was a 180-degree turn from the previous idea. America’s Founding Fathers, for example, were obsessed with inner self-improvement. Striving for “moral perfection,” a twenty-year-old Benjamin Franklin worked methodically to acquire a list of virtues, from silence and sincerity to tranquility and humility. He assessed himself each evening and tracked his progress on charts. John Adams, in a typical diary entry, resolved to become more conscientious and socially pleasant: “I find my self very much inclin’d to an unreasonable absence of mind, and to a morose, unsociable disposition. Let it therefore be my constant endeavor to reform these great faults.” But two hundred years on, such vestiges of a Puritan past had been swept aside by a greater interest in cars, appliances, and shiny hair. The spread of the suburbs after World War II, with their backyard weenie roasts, patios, and cheese dips, was also a way of escaping an overcomplicated, formal life. It encouraged a sportier, more casual lifestyle for a middle class newly freed from decades of deprivation. 
Add to that the great wave of Baby Boomers, born into prosperity and surrounded by products, a Me Generation showered with attention, not inclined to modesty, and little interested in the artifice of social graces and their required self-control. In them, the age-old tendency of the young to rebel against their elders attained an unprecedented critical mass. And with that came even more informality, more “be yourself” free rein. The courtesies of their parents’ era were a drag. Child-rearing practices were also changing. In the new, less formal times, manners instruction for children simply went out of style, and the subtleties of grace were deemed passé, or worse: elitist. Anything implying snobbery was swept aside by a growing middle class, the youth counterculture, and a surging progressive tide. Change was sorely needed, as the civil rights, antiwar, and women’s movements demonstrated. But it wasn’t only social institutions that were rocked. So was the cradle. A nation crawling with babies was hungry for advice, the simpler the better. The easygoing child-centered approach advocated by Benjamin Spock in his enormously influential, best-selling Common Sense Book of Baby and Child Care, which first came out in 1946, gave parents permission to forgo the feeding schedules and strict discipline of former times and simply enjoy their kids. Hugs were in, spankings were out. But if you’re tracking the demise of grace, you can find a few nicks and cuts in his pages. Since people like children with “sensibly good” manners, Spock writes, “parents owe it to their children to make them likable.” But he also put forth the view that “good manners come naturally” if a child feels good about himself. Yet self-esteem is not the answer to everything. In fact, some researchers blame the self-esteem movement of the 1980s for the rise in narcissism among college students today as compared with those of thirty years ago. 
Narcissists have a grandiose view of themselves but care little about others; the argument is that parents who fill their children’s ears with how special they are (as opposed to, say, how hard they work or how kind they are) create adults with little patience for those who don’t recognize their superiority. We’ve all encountered plenty of people, young and old, with high opinions of themselves and precious little grace. It is one thing to empower a child with self-worth and confidence and to guide her in becoming a good person. But children who are not taught to behave with consideration for others and to respect other people’s feelings will not develop empathy and compassion. While likable is a perfectly fine quality, it’s a low bar to set for parents. It refers only to how others view the child, and in a bland way at that. Being likable means you’re receiving something—someone’s approval. Compare it with agreeable, which is about giving. It’s other-directed, referring to getting along, being warm, supportive, and helpful, while diminishing the focus on yourself. “Be pretty if you can, be witty if you must, but be agreeable if it kills you!” declared the 1930s Home Institute booklet Charm. Interestingly, Spock’s view of the primacy of likability flips the long-standing Anglo-American notion, prevalent among the Puritans and up through the nineteenth and early decades of the twentieth centuries, that one builds character through service to others, whether God or your fellow man. In this older view, the less you fixate on yourself the better, apart from controlling unruly impulses. Putting priority on others is the right—and graceful—thing to do.

A Culture of Coarseness

What has most threatened grace is what I can only describe as a culture of coarseness. We’re insensitive to our effect on other people. We don’t think about how others feel when we shoot down their ideas in a meeting, when we court laughs at their expense, when we criticize them in front of colleagues. Or when we make it known how little they matter once someone more interesting comes along. I was having lunch with a colleague once when she saw a man she knew passing by on the sidewalk. Waving vigorously through the window to get his attention, she urged him to join us. But the moment he got to our table, before she’d had a chance to introduce us (I’m choosing to believe that was her plan), her cell phone rang. She’d placed it on the table in case this should happen, so of course she took the call, having long forgotten the conversation she’d interrupted by inviting in a guy off the street, and leaving me and a stranger in awkward silence while she also forgot about us. Our devices are draining us of grace. “We need to e-mail!” a friend I haven’t seen in a while calls over her shoulder, because there’s no time to talk. E-mail and texting are convenient, but they also crumple us up physically and make us unaware socially, closed off from those around us. Riding the subway can be like nursery school, what with the manspreaders who don’t want to share the bench they’re sprawling on with wide-open knees and a slump, and the woman who takes up two seats with all her bags and doesn’t much care if you have to stand. Or maybe she doesn’t notice you because she’s very busy texting, like the toy store owner sitting behind the counter who couldn’t be moved to help me find a birthday present for my nephew. Silly me, I thought that she was entering important data on her tablet; it was my savvier preteen daughter who detected instantly the gestures of a stealth texter. With the hours spent hunched over keyboards, no wonder we’re awkward when we get up. 
Hips tighten, necks droop, our backs round. I watch people walking and standing. Most of us sag in the front, with shoulders pitched forward and chests caving, probably from too much sitting and driving and not enough walking, or walking incorrectly. Our footfalls are heavy; we gaze at the ground or at what’s in our hands. We’ve lost the ability to carry ourselves with upright buoyancy and ease. Grace is not only the furthest thing from our minds, it’s beyond the reach of our bodies. Instead, we’re drawn to disgrace. No teaser is bigger Internet click bait than the one that promises bad behavior: “Mogul Throws Fit Over Spilt Champagne”; Lindsay Lohan gets kicked out of a hotel; Justin Bieber moons his fans on Instagram. Reality TV thrives on disgrace. Fans watch it for the awkward moments, for people to be told they’re fired, they suck, they’re the weakest link. The appeal of American Idol used to be Simon Cowell bullying a contestant who had volunteered himself for public shaming. Would we ever be so stupid? Of course not. Survivor competitors drag one another through the dirt, physically and verbally; the mothers on Dance Moms put the toddler antics of subway riders to shame. Viewers can puff themselves up in comparison, engage in some vicarious ribbing without responsibility. The glee of disgrace, of course, exists beyond TV. In May 2014, Evan Spiegel, CEO and founder of Snapchat, the ephemeral photo-sharing app, issued an apology after the release of e-mails he’d written to his frat brothers while attending Stanford. Those missives had cheerfully chronicled getting sorority girls (“sororisluts”) drunk and musing about whether he’d peed on his date. Typical frat boy fun, some said. Are we too easily outraged? Or are we numb to what is truly outrageous (torture, for starters), because we’re overoutraged? Internet outrage has become a fact of life, a ritual of righteous indignation practiced after the inappropriate tweet. 
Outrage is such a satisfying cycle: First there is a celebrity faux pas; then the offended take to Twitter, the defenders counterattack, the bloggers repost, a Facebook fight erupts, and after all the time invested in following this trail—trust me, even your respected local newspaper is following this trail—why, there’s a new dumb thing to get mad about. We’re in an environment of grabbing and taking: taking advantage, taking control, taking for oneself. Grace, by contrast, is associated with giving. The three Charites of Greek mythology, you’ll recall, are the givers of charm, beauty, and ease. In so many fields of activity—sports, entertainment, business—success isn’t just winning, it’s crushing. Total domination is the desired image to project. Power is valued over grace; taking is celebrated. Giving is considered a lesser quality, even a weakness. These are the days of category-killing control and sensory bombardments by any means necessary. It’s as if society at large has been captivated by the steroid aesthetic of today’s sports. Asked by business analysts if he was going to retire at sixty-five, Boeing CEO Jim McNerney said no, despite it being company custom, and by way of explanation—offered to people he wanted to impress, no less—he chose to depict himself as a monster. “The heart will still be beating, the employees will still be cowering,” he said. “I’ll be working hard. There’s no end in sight.” This prompted another memorable public apology. Yet McNerney’s original phrasing was telling, right up to his last words. There’s no end in sight. Perpetual power: Why give it up if you’re on a roll? Why give up anything if you’re in a position to take? If those down the rungs have anything to relinquish—if they can be made to cower, to give back benefits and raises and job security—then that must be done, because it can be done. 
Bigger may be better, but gigantic is best, whether it’s profits, or the wedding of Kanye West and Kim Kardashian, or the tech effects of a Hollywood blockbuster. (Just look at how the intimate, human-scale charm of The Wizard of Oz gave way to the massive 3-D spectacle of Oz the Great and Powerful, with its CGI landscape, booming soundtrack, explosions, and strained seriousness.) In all of this, being compassionate and humble, generous and considerate, elegantly restrained rather than a show-off, at ease instead of in-your-face—in short, being graceful—seems rather behind the times. “Go out of your way to do something nice for somebody—it will make your heart warm,” urged a 1935 guide, Personality Preferred! How to Grow Up Gracefully. This book, like others of its era, took a holistic view of grace as a way of being that one acquired through habits of the body, mind, and spirit. “Grace isn’t just a set of behaviors you dust off and display on special occasions,” author Elizabeth Woodward explained to her young readers. “It’s how you carry yourself every day.” Woodward, an editor at Ladies’ Home Journal, wrote her book after getting hundreds of thousands of letters from young women seeking advice. Before the upheavals in the mid-twentieth century, growing-up advice to young people, such as Woodward’s book, generally followed a course set in antiquity. Making one’s way in the world was seen as an art, something to be practiced and perfected. It was in some ways like a lifelong dance, with rules and steps and choreography, as well as the need for rehearsal. This art of living incorporated not only what people said and how they behaved at dinner or in the parlor, but how they moved in many ways, large and small. Control of the body through posture and proper body language has long been a part of “conduct books.” In How to Grow Up Gracefully and publications like it, for example, it is essential to the graceful life. 
Excerpted from "The Art of Grace: On Moving Well Through Life" by Sarah L. Kaufman. Copyright © 2015 by Sarah Kaufman. Reprinted by permission of W.W. Norton & Co.

Manners are the happy ways of doing things . . . ’tis the very beginning of civility,—to make us, I mean, endurable to each other. —Ralph Waldo Emerson, The Conduct of Life

As civilization began, it became apparent that the human race was imperfectly suited for it. Living together took some work. All these years later, there are innumerable examples to show that we’re still not terribly good at being civilized, from the student commuter who dashes in front of an elderly woman to claim a seat on the subway to how societies treat their poor and unemployed. The unchanging reality is that people tend to think of themselves first, yet the task of coexistence is made easier if they don’t. And so efforts arose many years ago to teach folks how to get along. The concept of grace—refined ease of movement and manner, as a way of pleasing, assisting, and honoring others—wove through this endeavor. Indeed, the term getting along itself, in the sense of being on harmonious terms, implies graceful behavior. It carries a hint of a dance, a peaceable duet, or the falling-in-step impulse that horses have with one another, which helps make them manageable. Grace and manners, the general principles of social behavior, have historically been entwined; each adds luster to the other. To trace the development of grace through time, where grace isn’t specifically mentioned, I’ve looked for an emphasis on the art of getting along. By that I mean manners that are aimed at harmonious interactions and creating a climate of warmth and appreciation, as opposed to formalities about fish forks and introductions, which are in the more detail-oriented domain of etiquette. Some of the world’s most influential books have been instruction manuals on the art of getting along, or what we’ve come to know as the social graces. 
These include the oldest writings of the ancient era, the runaway best sellers of the Renaissance, and the must-reads of American colonists, revolutionaries, and early twentieth-century strivers with an eye for elegance and civilized living. Yet instruction in grace mysteriously dropped out of our lives a few decades ago. Well, “mysteriously” isn’t quite right. There is a pendulum swing in the history of manners, when one era comes up with rules and they grow more and more strict until another generation says, oh, just forget about it—this is ridiculous. And grace gets thrown out for being an act, insincere, phony. “We have the residue now, with well-meaning parents who say to their children, ‘Just be yourself,’” said Judith Martin, when I asked her why the social graces were in decline. Martin is the author of the internationally syndicated Miss Manners newspaper column and many books on etiquette. “What does that mean? Who would they be if they weren’t themselves? Parents don’t teach their children how to act out being glad for a present, or how to seem pleased to see someone they may not want to see. “Etiquette has long struggled with the opposing ideas of grace and naturalness, of appearing natural and being natural, which are two entirely different things,” she continued. This inherent paradox, of feeling one thing and saying another, leaves etiquette open to the charge of insincerity. “There is a disconnect in what you feel and what you ought to project, which is the opposite of sincerity. For example, the hostess who says, ‘Oh, don’t worry about it,’ when you’ve just broken her favorite lamp. Of course she cares about it, but the primary goal is putting the other person at ease. “People say etiquette is artificial. But what they really object to is the obviously artificial,” Martin said. “Yes, it is artificial and it’s often better than the raw expression of natural desires. 
Look at dance: Is human movement better when it’s totally untutored or is it better when you put thought and work into it?” Social grace, just like physical grace, requires work. That was the point of the conduct books from centuries past: to make it plain that correct behavior required effort and discipline. Being with people is an art like any other art, or a practice, if you will, just like cooking or riding a bicycle. The more you realize what smooths things over, what pleases people, and the more you want to be graceful and practice being graceful, the better and more convincing you will become. Grace will cease to be something you “act out.” But as with any learned activity, there are different degrees of polish here. There is the hostess who reacts to her broken lamp by saying, “Oh, don’t worry about it” through clenched teeth, making you feel terrible. And then there is one who reacts with grace, putting on a better act, perhaps. Maybe she’s a Meryl Streep, imperceptibly masking her true feelings with an Oscar-worthy portrayal of nonchalance. Or maybe she really hated that lamp and is glad it’s headed for the trash. Or maybe she is really and truly a happy-go-lucky angel on earth whose every impulse is upright and pure. It makes no difference to the embarrassed guest who just wants to be forgiven. He’s grateful for grace any way it comes. Grace lies in the manner in which the rules are followed, Martin says. “Do you follow etiquette rules to the letter, or do you make it seem as if they arise naturally from good feelings and it’s easy for you to say, ‘Oh, never mind, don’t worry about it’? It’s not easy for a dancer to leap into the air either, and we don’t see the bloody toes and the sweat from a distance. And in the same way, if she’s being graceful, we don’t see the hostess thinking, ‘Oh my gosh, this is going to cost me a fortune to fix.’” Let’s face it, if we all exposed our true feelings all the time, the world would be unbearable. 
Grace, as Martin put it, “is that covering through which we make the world pleasant.” And yet we’re in one of those extremes of the pendulum swing where honesty is overvalued and the brilliant act, the self-discipline, the training that produces grace has faded away. An accumulation of blows has led to its downfall, but they stem from a reaction against the overcomplication of everyday life that picked up strength in the 1950s and ’60s. The modern means of self-improvement turned from building up one’s character (a rather slow, internal, and never-ending process) to the far easier focus on things we can buy. Buying our way into the good life. With the surge in department stores and shopping malls, with ever-present advertising, with our voyeurism via television into the lives and possessions of others, shopping became the modern means of self-betterment. This was a 180-degree turn from the previous idea. America’s Founding Fathers, for example, were obsessed with inner self-improvement. Striving for “moral perfection,” a twenty-year-old Benjamin Franklin worked methodically to acquire a list of virtues, from silence and sincerity to tranquility and humility. He assessed himself each evening and tracked his progress on charts. John Adams, in a typical diary entry, resolved to become more conscientious and socially pleasant: “I find my self very much inclin’d to an unreasonable absence of mind, and to a morose, unsociable disposition. Let it therefore be my constant endeavor to reform these great faults.” But two hundred years on, such vestiges of a Puritan past had been swept aside by a greater interest in cars, appliances, and shiny hair. The spread of the suburbs after World War II, with their backyard weenie roasts, patios, and cheese dips, was also a way of escaping an overcomplicated, formal life. It encouraged a sportier, more casual lifestyle for a middle class newly freed from decades of deprivation. 
Add to that the great wave of Baby Boomers, born into prosperity and surrounded by products, a Me Generation showered with attention, not inclined to modesty, and little interested in the artifice of social graces and their required self-control. In them, the age-old tendency of the young to rebel against their elders attained an unprecedented critical mass. And with that came even more informality, more “be yourself” free rein. The courtesies of their parents’ era were a drag. Child-rearing practices were also changing. In the new, less formal times, manners instruction for children simply went out of style, and the subtleties of grace were deemed passé, or worse: elitist. Anything implying snobbery was swept aside by a growing middle class, the youth counterculture, and a surging progressive tide. Change was sorely needed, as the civil rights, antiwar, and women’s movements demonstrated. But it wasn’t only social institutions that were rocked. So was the cradle. A nation crawling with babies was hungry for advice, the simpler the better. The easygoing child-centered approach advocated by Benjamin Spock in his enormously influential, best-selling Common Sense Book of Baby and Child Care, which first came out in 1946, gave parents permission to forgo the feeding schedules and strict discipline of former times and simply enjoy their kids. Hugs were in, spankings were out. But if you’re tracking the demise of grace, you can find a few nicks and cuts in his pages. Since people like children with “sensibly good” manners, Spock writes, “parents owe it to their children to make them likable.” But he also put forth the view that “good manners come naturally” if a child feels good about himself. Yet self-esteem is not the answer to everything. In fact, some researchers blame the self-esteem movement of the 1980s for the rise in narcissism among college students today as compared with those of thirty years ago. 
Narcissists have a grandiose view of themselves but care little about others; the argument is that parents who fill their children’s ears with how special they are (as opposed to, say, how hard they work or how kind they are) create adults with little patience for those who don’t recognize their superiority. We’ve all encountered plenty of people, young and old, with high opinions of themselves and precious little grace. It is one thing to empower a child with self-worth and confidence and to guide her in becoming a good person. But children who are not taught to behave with consideration for others and to respect other people’s feelings will not develop empathy and compassion. While likable is a perfectly fine quality, it’s a low bar to set for parents. It refers only to how others view the child, and in a bland way at that. Being likable means you’re receiving something—someone’s approval. Compare it with agreeable, which is about giving. It’s other-directed, referring to getting along, being warm, supportive, and helpful, while diminishing the focus on yourself. “Be pretty if you can, be witty if you must, but be agreeable if it kills you!” declared the 1930s Home Institute booklet Charm. Interestingly, Spock’s view of the primacy of likability flips the long-standing Anglo-American notion, prevalent among the Puritans and up through the nineteenth and early decades of the twentieth centuries, that one builds character through service to others, whether God or your fellow man. In this older view, the less you fixate on yourself the better, apart from controlling unruly impulses. Putting priority on others is the right—and graceful—thing to do.


Published on November 14, 2015 13:30

Your brain cells can now be turned on and off like a bedside lamp: A new discovery in optogenetics

Scientific American

Optogenetics is probably the biggest buzzword in neuroscience today. It refers to techniques that use genetic modification of cells so they can be manipulated with light. The net result is a switch that can turn brain cells off and on like a bedside lamp. The technique has enabled neuroscientists to achieve previously unimagined feats, and two of its inventors—Karl Deisseroth of Stanford University and the Howard Hughes Medical Institute and Ed Boyden of the Massachusetts Institute of Technology—received a Breakthrough Prize in the life sciences on November 8 in recognition of their efforts. The technology can remotely control motor circuits—one example is having an animal run in circles at the flick of a switch. It can even label and alter memories that form as a mouse explores different environments. These types of studies allow researchers to firmly establish a cause-and-effect relationship between electrical activity in specific neural circuits and various aspects of behavior and cognition, making optogenetics one of the most widely used methods in neuroscience today.

As its popularity soars, new tricks are continually added to the optogenetic arsenal. The latest breakthroughs promise to deliver the biggest step forward for the technology since its inception: researchers have devised ways of broadening optogenetics to enter into a dynamic dialogue with the signals moving about inside functioning brains.

The term optogenetics usually refers to the control of neurons. Researchers insert a gene for a light-sensitive protein into cells, and the cells then produce that protein on their surfaces. When these cells are later exposed to light, channels open and positively charged sodium ions rush inside, causing the cell to “fire” a “spike,” sending an electrical signal to other cells. 
The most commonly used proteins are “channelrhodopsins,” originally discovered in algae, but there is also a protein from a bacterium found in Egyptian salt lakes that has the opposite effect: negative chloride ions, rather than positive ones, rush into the cell, which prevents it from firing. Researchers can thus use these two actuator proteins to switch neurons on and off using light. This can be achieved via fiber-optic cables, so researchers can manipulate neurons in freely moving animals and observe the effects on behavior. The genes are delivered by various forms of genetic manipulation. Different genes are turned on, or expressed, in different types of cell, so the gene is accompanied by a special genetic sequence, called a promoter, that is only active in specific cell types, thus ensuring the protein is only produced in the desired target.

More generally, optogenetics refers to any method for communicating with cells using genetics plus optics—and that can mean observing cell activity, not just turning a neuron on or off. Nongenetic approaches, such as fluorescent dyes whose brightness increases with cell activity, have been used previously, but they lack the precision of targeting a particular type of cell. A new way to watch what’s happening in cells uses the same genetic targeting methods used to switch circuits off and on, except now indicator proteins are integrated into selected cells via genetic tweaks. The indicators generally consist of a protein sensitive to cell activity linked to a fluorescent protein, so they light up in response to a cell’s firing. Combining these genetically targeted optical readouts with the standard armamentarium of tools for controlling cell activity unlocks the full potential of optogenetics. The combined technique allows researchers to hold two-way conversations with populations of selected neurons using only pulses of light. 
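The switching logic just described can be sketched in a few lines: a leaky integrate-and-fire neuron with a channelrhodopsin-like excitatory drive gated by "blue" light and a stronger inhibitory drive gated by "yellow" light. All names and parameter values here are illustrative placeholders, not physiological measurements.

```python
# Toy sketch of light-gated control of a neuron. A leaky
# integrate-and-fire cell receives a depolarizing current while "blue"
# light is on (channelrhodopsin-like) and a stronger hyperpolarizing
# current while "yellow" light is on (inhibitory-opsin-like).
# All numbers are illustrative, not physiological.

def simulate(blue_on, yellow_on, steps=200, dt=1.0):
    """Run a simple Euler integration; return the number of spikes."""
    v_rest, v_thresh, tau = -70.0, -50.0, 20.0   # mV, mV, ms
    g_exc, g_inh = 2.0, 4.0                      # arbitrary drive strengths
    v, spikes = v_rest, 0
    for _ in range(steps):
        i_light = (g_exc if blue_on else 0.0) - (g_inh if yellow_on else 0.0)
        v += dt * ((v_rest - v) / tau + i_light)  # leak toward rest + light drive
        if v >= v_thresh:                         # threshold crossed: spike, reset
            spikes += 1
            v = v_rest
    return spikes

print(simulate(blue_on=True, yellow_on=False))   # blue light: repeated spiking
print(simulate(blue_on=False, yellow_on=False))  # no light: silent
print(simulate(blue_on=True, yellow_on=True))    # inhibition dominates: silent
```

Turning the blue flag on makes the cell fire repeatedly; turning it off, or adding the stronger inhibitory drive, silences it, which is the bedside-lamp behavior the article describes.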
Once various technical difficulties are overcome, researchers will be able to carry out this dialogue with single neurons in real time, allowing a level of interaction with awake, functioning brains that has not been possible before. At the recent Society for Neuroscience meeting in Chicago, several leading researchers spoke about “all-optical interrogation of neural circuits” and co-wrote an accompanying review in The Journal of Neuroscience. They outlined the challenges involved and described pioneering work to overcome these obstacles. The approach has the potential to shed light, in literal terms, on the relationship between brain activity, on one hand, and cognition, behavior and emotion, on the other. This goal fits well with the U.S. BRAIN Initiative, which aims to encourage the development of new tools for exploring the link between neural signals and cognition.

One of the speakers and co-authors was Deisseroth, the Stanford neuroscientist and practicing psychiatrist whose work with Boyden developing the original channelrhodopsin-based photosensitive actuators earned the pair their recent Breakthrough Prize. Deisseroth's research has always been conducted with an eye on psychiatry. This new work focuses on overcoming the limitations of existing technical methods.

Two main types of genetically inserted indicators are commonly used for these all-optical setups. Calcium indicators exploit the fact that when neurons fire, calcium channels open and calcium levels rise. The rising calcium deforms a calcium-sensitive protein that is linked to a fluorescent protein, which then emits light. The main problem is speed. “Calcium signals are slow, they last one second or so, and the brain is a little faster than that,” says Thomas Knöpfel, chair of Optogenetics and Circuit Neurosciences at Imperial College London, who didn't speak at the session. 
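Knöpfel's point about speed can be made concrete with a toy model: suppose each spike adds one unit of indicator fluorescence that then decays exponentially with a one-second time constant (the "one second or so" above). Two spikes only 100 ms apart then blur into a single slow transient. The function and numbers are illustrative only, not a real indicator model.

```python
import math

def fluorescence(spike_times_ms, t_ms, tau_ms=1000.0):
    """Toy calcium-indicator signal: one exponentially decaying
    transient per spike, summed over all spikes up to time t."""
    return sum(math.exp(-(t_ms - s) / tau_ms)
               for s in spike_times_ms if s <= t_ms)

spikes_ms = [100.0, 200.0]  # two spikes only 100 ms apart

# Just before the second spike, the first transient has barely decayed,
# so the second event rides on top of it instead of standing alone.
print(fluorescence(spikes_ms, 199.0))  # close to the first spike's peak of 1.0
print(fluorescence(spikes_ms, 201.0))  # the two transients have merged
```

With a one-second decay, the trace never returns to baseline between the two spikes, which is why calcium readouts smear fast firing that a voltage signal would resolve.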
Furthermore, calcium levels can change without neurons firing, and some important changes that don't actually make a neuron fire never alter calcium levels at all. That is because calcium is a proxy for the signal researchers are really interested in: voltage. Knöpfel has been developing genetically encoded voltage indicators (GEVIs) for 20 years. The main problem, compared with calcium indicators, is that the signals are weaker and harder to detect. This is only exacerbated by the faster signals, which require shorter exposure times. The signals also tend to be noisier.

Another challenge is observing or stimulating cells deep inside the brain. Traditional one-photon microscopy suffers from poor depth penetration and image quality—photons are absorbed and scattered by tissue. Two-photon microscopy overcomes this using near-infrared light. The longer-wavelength light penetrates tissue, but because each photon carries less energy, two must strike a protein to excite it, hence the name. This has the advantage that only proteins in the tiny focal spot of the beam are stimulated, but it also means that, when trying to stimulate a neuron, few channels are activated, which may not be enough to trigger a spike.

There are two ways to solve this issue. One is scanning lasers, which quickly sweep a beam across a target (whether one cell or several), activating many channels sequentially. The other is parallel approaches, which use holographic techniques to shape a beam into the required pattern, illuminating the whole target at once. This method can even produce three-dimensional illumination patterns that stimulate cells at different depths. Its main advantage, though, is speed. “Applications that require precise control over spike timing are better using parallel approaches,” says Valentina Emiliani of the neurophotonics laboratory at Paris Descartes University, the review's lead author, who presented her group's holographic work at the conference. 
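The depth confinement of two-photon excitation mentioned above follows from a simple scaling argument: one-photon absorption grows linearly with light intensity, while two-photon absorption needs two photons arriving together and so grows with the square of the intensity. The beam profile below, 1/(1 + z^2), is an assumed toy falloff for illustration, not real optics.

```python
def intensity(z):
    """Relative beam intensity at distance z from the focus (toy model)."""
    return 1.0 / (1.0 + z * z)

def excitation(z, photons):
    """Excitation probability scales as intensity raised to the number
    of photons that must arrive together (1-photon vs. 2-photon)."""
    return intensity(z) ** photons

# At the focus (z = 0) both processes are maximal, but away from it the
# two-photon signal collapses much faster than the one-photon signal,
# confining excitation to the focal spot.
for z in (0, 1, 3):
    print(z, excitation(z, photons=1), excitation(z, photons=2))
```

At z = 3 the one-photon term has fallen to a tenth of its peak while the two-photon term is down to a hundredth, which is why out-of-focus tissue contributes so little.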
The biggest obstacle, though, is that both stimulating and recording activity with light causes problems if the wavelengths overlap. This is especially challenging because the proteins used as indicators must themselves be excited by light in order to emit light. “The compounds used for imaging and photo-stimulation have very overlapping spectra,” Emiliani says. “It's difficult to illuminate your preparation to do imaging, while making sure you don't also photo-stimulate.” Likewise, researchers must be careful that the indicator signals they record are not corrupted by the wavelengths used for stimulation. Much work in the field is therefore now focused on finding proteins whose wavelengths don't overlap. For instance, Harvard University biophysicist Adam Cohen and his team presented work, carried out with Ed Boyden's group at MIT, that combines a channelrhodopsin shifted toward blue wavelengths with a voltage indicator, called QuasAr, that emits near-infrared light. The groups have used the combination to stimulate neurons with blue light while monitoring in red in stem-cell cultures derived from people with Lou Gehrig's disease, and they plan to test it in live animals next. Once these challenges are overcome, all-optical techniques could revolutionize neuroscience by allowing researchers to simultaneously control and monitor activity precisely, down to single spikes in single neurons or large ensembles, as experimental animals move freely about. "This approach will open a whole new range of experiments," says Michael Häusser, a neuroscientist at University College London, who co-chaired the conference session alongside Emiliani.
“Unlocking the full potential of optogenetics requires going beyond targeting genetically defined cell types, to targeting cells according to functional properties, rather than just genetic identity.” In other words, rather than passively monitoring, or running experiments in which the pattern of stimulation must be carefully planned beforehand, these innovations will let researchers adapt how they stimulate cells depending on how the cells are behaving. “If you can tailor stimulation to activity patterns, you can do the manipulation on the fly,” Häusser explains. “For instance, during a decision-making experiment, if you can track the activity of ensembles of neurons in real time, you can influence behavior more effectively by manipulating ensembles as they form.” “These are two game-changing advances, compared to what we could do before: targeting functional ensembles and manipulating activity in real time,” Häusser says. “Ultimately these approaches could help define the neural codes used in the brain to drive behavior.”
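The closed-loop idea Häusser describes, stimulating an ensemble only when its members co-activate, can be sketched as a simple per-frame decision rule. Everything here is a hypothetical illustration: the ensemble indices, threshold, and `stimulate` callback are invented stand-ins for a real imaging pipeline and holographic stimulator, not any published system's API.

```python
import numpy as np

ENSEMBLE = [3, 7, 12]      # hypothetical indices of ensemble neurons
CO_ACTIVE_THRESHOLD = 2    # members that must fire together to trigger

def closed_loop_step(frame_activity, stimulate):
    """Decide, for one imaging frame, whether to photostimulate.

    frame_activity: boolean array, True where a neuron fired this frame.
    stimulate: callback standing in for the holographic stimulator.
    """
    active = sum(bool(frame_activity[i]) for i in ENSEMBLE)
    if active >= CO_ACTIVE_THRESHOLD:
        stimulate(ENSEMBLE)   # reinforce the ensemble as it forms
        return True
    return False

# Usage with fake data: a frame where two ensemble members co-fire.
frame = np.zeros(20, dtype=bool)
frame[[3, 7]] = True
log = []
triggered = closed_loop_step(frame, log.append)
print("stimulated:", triggered, "targets:", log)
```

The point of the sketch is only the control flow: read out activity each frame, and let the stimulation decision depend on what the cells just did, rather than on a schedule fixed before the experiment.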

Continue Reading...

 •  0 comments  •  flag
Share on Twitter
Published on November 14, 2015 12:00

Bill Maher’s moral superiority: Grandstanding on Islam and political correctness while tragedy in Paris unfolds

“I’m not demonizing, I’m characterizing,” Bill Maher said of the entire religion of Islam on Friday evening, in what appeared to be complete sincerity. Maher is a professed liberal comedian with a show called “Real Time” on HBO, and yesterday his show hosted the first live major political reaction to the coordinated terrorist attacks in Paris. At the time "Real Time" was filmed—10 p.m. on the East Coast—the identities and affiliations of the terrorists who killed over 120 civilians in Paris were not known. Rumor had it that they were acting on behalf of the Islamic State, commonly known as ISIS — Wolf Blitzer was saying "if it's ISIS" early in CNN's breaking news coverage. In spite of the chaos and terror, France 24, the primary source of French news for those of us outside of France, was incredibly restrained. Speculation was ruthlessly curtailed; assumptions and confabulations were sidelined. Other discussions may have been happening elsewhere, but the primary feed of French news to English-speaking networks—so primary, indeed, that CNN, CBS, and the BBC were simulcasting from it—was a deliberate and calm livestream of reportage. Maher nodded to the ambiguity but added, given the use of AK-47s and suicide vests, "It's probably not the Amish." This morning, it was reported that ISIS had claimed responsibility for the attacks. Once again, Bill Maher was right. It is the rightness of Bill Maher that, to anyone with a more developed sense of compassion, ends up being the most frustrating. Anyone that morally superior should be wrong all the time, just so the wide grin of their unflappable arrogance might be rightfully deflated. But Maher's show—caught up in the semantics of demonizing versus characterizing—reflects exactly what is happening in the rest of the world right now.
Bill Maher may sound entirely reductive when he dismisses the existence of moderate Muslims, or conflates ISIS and very extreme practices of Islam, like female genital mutilation, with the entirety of the religion. But he is vocalizing the same dismissal and conflation that has characterized much of Western media since 2001—albeit not always in quite so many words. His ongoing cross-examination of Islam is the same kind of prolonged trial that the media subjects Muslims to around the world. Within minutes of the reported attacks, grandstanding conservatives on Twitter were shouting about refugees, immigration, Hillary Clinton's culpability, and how college students should stop complaining, because the only time anyone really infringes on your rights is when an Islamic terrorist is wearing a suicide vest. Maher puts Islam on trial because everyone else is doing it. He's right; he's just, in the immortal words of "The Big Lebowski," also an asshole. The comedian didn’t deviate much from his pre-written set to address the events in Paris—which were still unfolding as the taping went on, a tricky line to walk for any showman. (He sang a few bars of "La Marseillaise" to open the show.) But in the sit-down interview with Daily Beast writer Asra Nomani, Paris came up again and again. Nomani is an activist who was raised Muslim and now campaigns for radical reform within the religion; her angle, specifically, is raising the status of women in Islam. With Maher, Nomani ended up repeating and verifying all of his talking points, in what read like an exercise in ego. Following yesterday's terrorist attacks—in Beirut as well as Paris—it felt painfully insensitive, a dismissal of the real human tragedy of the terrorists' actions in favor of proving to each other who was more right about it. And being right is a weird and complicated thing to be in the aftermath of inexplicable, unimaginable horror.
There is something laudable, to a degree, about the way that Maher identifies the most corrosive issues in liberal politics and shapes them into some kind of progressive agenda; jokes aside, he manages to identify and confirm the worst fears of the audience, bleeding out the righteous indignation and turning it into some semblance of talking points. What's scary is that those talking points—that agenda—exist mostly to allow Maher and his audience to be freed from culpability for the ills of the world. Only idiots would take religion so seriously; only idiots would kowtow to political correctness; only idiots would vote Republican. Maher is obsessed with being right, and sure, he often is. Maybe we would all be better off if none of us, for example, believed in a higher power and/or subscribed to organized religion. But here, in this universe, many of us do. In a line of thinking that, it turns out, dovetails rather well with what ISIS wants to happen, Nomani and Maher repeatedly posited that there are no moderate Muslims. Maher recounted a letter he received from a man in Saudi Arabia, who told him that all moderate Muslims are now in jail. "I don't understand liberals who don't understand this man," he said. Maher's closing statements expressed similar bafflement at his chosen audience. After six minutes of telling white people they have things really good—in terms of incarceration rates, personal wealth, and standard of living—he made a quip about black women, joking that the reason they have less individual savings than white women is because they went and spent it all on a weave. The audience gasped more than laughed. “Fuck you, you politically correct assholes,” Maher said, laughing. “One joke! One joke about the blacks." I think Maher confuses compassion with idiocy. Compassion is a quality that has nothing to do with how smart or how right you are. 
It's a quality that is at the root of not wanting to make generalizations, and at the root of wanting to say things that do not horrifically offend other systematically oppressed people. I fully believe that Maher doesn't understand those well-meaning liberals, those politically correct assholes. I would just rather be one of them, I think, than to merely be right; I would like to be able to understand another point of view, from time to time. And especially on a day like yesterday, I would like to be able to feel compassion.


Published on November 14, 2015 11:45