Helen H. Moore's Blog, page 1006

August 23, 2015

I’m black, but I’m uncomfortable around black people

It happened. I failed the “black” test. My hair stylist and I were chatting while she was taking a break from retightening my locs. I made a funny quip, and she extended her palm so that we could partake in the standard Black American handshake. In what was most likely the longest three seconds in the universe, I stared at her hand in befuddlement, trying to figure out what she was doing. By the time I realized that this was the handshake, it was too late. I tried to recover with some weird amalgamation of a fist bump and a high five, but the damage had been done. I had revealed myself to be the Carlton to her Fresh Prince.

I replayed the scene over and over in my head during my walk to the train. How could I have been so oblivious to an obvious cultural norm? This set off a mini existential crisis where I came to one of my greatest philosophical epiphanies: I’m uncomfortable around black people. This is a peculiar realization being that I am also a black person.

But you see, my stylist embodies a certain Harlem black cool I’ve always been told (by white people) that I lack. Every time I walk into the black barbershop where she does hair, I feel like I’m going to be “found out.” In my mind when other black people see me, they’re thinking: “She may look black, but she’s not black black, if you know what I mean.”

Where does this discomfort come from? And why do I think of Blackness as a test I am doomed to fail?

Like most psychological problems, it all began in my childhood, specifically the eight years I spent living in all-white towns in rural Wisconsin. If there was one phrase I heard more than “nigger,” it was “You’re not black.” Talk about irony.

Sometimes it was phrased as a “compliment,” meaning you’re one of the good black people. But other times it was meant so white people, whose sole interaction with black culture came through the distorted lens of racist media, could assert their own twisted version of blackness over me.

“I’m blacker than you because I know more Tupac songs than you.”

“You’re not black. Your lips aren’t even that big.”

“You’re not even that black. Look, my ass is fatter than yours.”

“I know so many white girls that can gangsta walk better than you.”

“You’re not black, you can’t even dance!”

It didn’t surprise me that Rachel Dolezal truly thought she was black. I’ve long known that, for many white people, being black is simply checking off a list of well-worn stereotypes.

I always brushed off those comments, because I knew I was black enough to be called “nigger.”  I was black enough that white people stared at me everywhere I went in those lily-white towns. And I was black enough to be accused of stealing during shopping trips.

But if you hear something enough, it can seep into your unconscious and start to guide your decisions. Somewhere along the way I started believing that I wasn’t black enough, whatever that meant. This is the clusterfuck of all realizations: Racism made me uncomfortable around my own people. Ain’t that some shit?

And it even affected my college experience. I never applied to any historically black colleges because I thought everyone would make fun of me because my black wasn’t cool enough. I was more comfortable with the thought of being around white people, where my blackness was for sure going to be denigrated in one form or another, than I was with the thought of being around my own people. By that time I had already accepted racism as a staple of life, but the thought of possibly being rejected by people that looked like me was too much to bear.

Recently I was hanging out with a friend who was born and raised in Harlem. For me she represents the epitome of black cool and I envy that she grew up around black people her entire life. She told me that because of her alternative interests, namely metal music, she was accused of “acting white” by her high school peers.

No black person has ever outright accused me of not being black enough, while that’s all she ever experienced as a teenager. Our childhoods couldn’t have been any more different, but we both grappled with having our own blackness invalidated by superficial parameters.

In the foreword for the book “Black Cool: One Thousand Streams of Blackness,” Henry Louis Gates, Jr. writes: “There are 40 million black people in this country, and there are 40 million ways to be black … I do not mean to suggest that we are all of us in our own separate boxes, that one black life bears no relation to another. Of course not. We are not a monolith, but we are a community.”

It’s taken some time, but now I’m aware that there is no “black test” and that, even though I’m more Carlton than Fresh Prince, my blackness is still valid. My hair stylist doesn’t see me as some racial imposter. To her, I’m just some weirdo who doesn’t know how to do a proper handshake. Resisting the temptation to police my own blackness and the blackness of others has been a gradual process, but a necessary one.

And who knows what I’ve missed out on? How many friends I could’ve made, how many organizations I didn’t join out of fear. For years I isolated myself from the community that Henry Louis Gates, Jr. talks about, keeping potential sources of emotional support at arm’s length. And with new hashtags popping up every day, strong emotional support systems are needed more than ever.

White supremacy takes on many forms. It’s most visible as the daily physical assault on black lives. But we shouldn’t underestimate the psychological effects of something as seemingly simple as how we define what it means to be black.

Published on August 23, 2015 17:00

You can’t make “Game of Thrones” on a YouTube budget: Why “it’s the best of times and the worst of times” for prestige TV

One subject that’s developed a kind of consensus around it – something fanboys and sophisticates, couch potatoes and cineastes, optimists and pessimists alike can agree on – is the superiority of today’s television. The business changed after “The Sopranos” in a way that allowed all kinds of good things to happen, especially on cable and streaming services, where shows as different as “The Wire,” “Orange Is the New Black,” “Breaking Bad,” “Mad Men” and “Game of Thrones” have dealt with serious social problems, showcased first-rate acting, looked at history in fresh ways, and accomplished all kinds of other storytelling feats that TV has rarely done as well. Some of these shows have been compared to serious novels and the great work of cinema’s auteurs. Brett Martin, who wrote the definitive book – “Difficult Men” – on what his subtitle calls TV’s “creative revolution,” says things seem to still be going in the right direction. “It’s a lot of talented people doing a lot of great work,” he told Salon. “It’s giving money to people with talent and vision.” But can the golden age last forever? Some observers wonder. At the recent television press tour, FX Networks CEO John Landgraf warned that we may be reaching peak TV, with the television business “in the late stages of a bubble. We’re seeing a desperate scrum — everyone is trying to jockey for position. We’re playing a game of musical chairs, and they’re starting to take away chairs.” The stock prices for television and media companies have fallen substantially in recent weeks, and insiders worry about cord-cutting, un-bundling, online viewing and piracy. All of this sounds abstract, but the ability of networks to fund ambitious shows – most of them with very high production costs – has to do not just with a creative revolution, but an economic one. People pay for television in a way they rarely do for other forms of culture, and when they stop, so will the flow of great programming. We spoke to Robert Levine, author of “Free Ride: How Digital Parasites are Destroying the Culture Business and How the Culture Business Can Fight Back,” about why TV has thrived, what makes HBO different, and the threats to the next generation of “Game of Thrones” or “Mad Men.”

Let’s start out with the fact that television seems like the most successful cultural form these days. How has it done better – made more money, employed creative people, improved at the level of quality – in a way that movies, books, music and other genres haven’t?

The reason TV has done the best is because of the way it’s sold. If you think of the way books or movies or music is sold, the dominant mode was always – and still is – that you buy what you want to see. In TV, you buy all of it. If you [previously] bought four albums, you just then buy two. It was tricky, until recently – and still is – to buy half as much cable because it’s packaged. People complain about it: I pay a lot for ESPN and I’m not sure I’ve ever turned it on. But that created an incentive for [cable companies] to create shows with a lot of appeal for a small audience. You’ve had upward competition instead of down. Many of the people running entertainment corporations came from the television divisions: It’s a reflection of how dominant TV became. These were once movie studios who picked up television channels: Now they’re entertainment conglomerates fueled by television. And movies are a small subset of that. Television drives revenue and profit.

Let’s break it down for a minute to one show. HBO has had an enormous popular and critical hit with “Game of Thrones.” What needed to go right, during this precarious period for other kinds of culture, for that show to be made? It’s got an enormous production budget by the standards of 20 years ago.

You’re not buying HBO by the episode. You’re not even buying it by the show. Their primary method of doing business is you buy all of HBO. That gives them an incentive – like the larger incentive for cable – they have to produce something a small number of people love, not something a large number of people like. I would pay for HBO just for “Game of Thrones”; they’re sort of in the business of taking moonshots. Or “True Detective” – a lot of people feel very strongly about it, a lot of people don’t like it… But they don’t have to produce something that a lot of people will watch so they can sit through ads. They have to produce something that a few people might love. That’s the model that drove “Game of Thrones.”

That niche model, then, is very different from the way TV worked in the ’50s, ’60s, ’70s.

Some of these things that seemed like a niche became much bigger: Take “The Walking Dead”… I don’t think people thought it would become so successful. In the ’50s, ’60s, and ’70s, it was pretty simple: x was the number of people watching, y is what you charged for an ad, x times y was the money you made. You had to get as many people to watch as possible. Remember, too, the setting was different: When I was growing up, we had one TV in the den. What did we watch? It wasn’t what my dad liked best or what I liked best, but what we could all agree on. It wasn’t the lowest, but the broadest common denominator: What will the whole family tolerate? Now there’s another screen, or you can watch it later. You get more different kinds of choices. There’s more good TV, more bad TV; there’s an explosion of niches. It’s not just that TV got smarter: TV got more everything.

So it was important, for “Game of Thrones” to be possible, that it was on a network that had a lot going on, and that people couldn’t use in an a la carte way. If you got HBO, you were paying a lot of money for a lot of programming: Some of them might work, some might not. But all of them helped fund the other, make either possible.

That’s true to some extent on every channel. But on other channels, there’s more need to justify every show: Show x needs to bring in y to sell z ads… HBO is more of a coherent whole, so there’s more of an urge to do prestige projects.

Same with Showtime and some of the other channels?

And same with Netflix and Amazon. And some of the other cable channels. HBO is now competing with Netflix, so it has to deliver more quality content than Netflix – that’s a great competition. Amazon feels like it’s in the same place, and that’s gonna get a lot more TV.

What’s next, then?

Right now, it’s the best of times and the worst of times. Over the last decade or more, you’ve had a bunch of incredible incentives to develop a lot of interesting TV for specific audiences. That’s resulted in a lot of people rushing into that business. When HBO had “The Sopranos” and “Sex and the City,” there was nothing else like them. Now there’s a lot of sophisticated shows. But people only have so much time. If you look at the comments made by the head of FX – I don’t know if we’ve reached peak television, but we’ve reached peak, quote unquote, quality television. There are only so many shows I can watch where I have to keep track of dozens of characters. And all those shows are considered more and more important; these expensive shows have good secondary markets – online, DVDs, overseas, airplanes, hotels, you name it. But how many of those can people watch? Increasingly, you see quality shows with expensive development budgets get ignored in a way that might not have been five or 10 years ago. “Masters of Sex” is better than it is popular. “Sense8” is supposed to be really good; I’ve not gotten a chance to watch it yet. I think FX is doing a lot of interesting stuff. But it can be hard to get enough people to watch. At the same time, you have a phenomenon where more and more people are watching this stuff online -- stuff you’re not paying extra for like FX and AMC. If you’re watching them online, they may not be making that much money on them. It’s the best of times because the current model is robust. But it’s the worst of times because we’ve probably reached peak TV. People are watching shows that don’t make those shows that much money. You saw a lot of downward movement in media stocks this week. Look at Comedy Central: “Inside Amy Schumer” is a really cool show that a lot of people are watching, but a lot of them are watching it on YouTube – that doesn’t fund the watching of more shows. I don’t think anyone has really figured out what to do about that.

Is that the kind of thing that drove the stock decline?

Broadly speaking – that the cable model that seemed so strong and so good [may be weak]. A lot of younger people are not subscribing. And you can watch a lot of stuff online – perfectly legally in a lot of cases. That doesn’t bring in a lot of revenue. There may be a way to produce “Amy Schumer” for an online audience, maybe. Definitely you could not do “Game of Thrones” that way.

So what happens when HBO in five years is getting two-thirds the revenues it is now? What happens to the expensive, high-end shows that they deliver?

The future may be pretty good for HBO. The question is, Will it be as good for Showtime [and others]? I have Netflix, I have cable, I have Amazon. That’s a lot of different TV. And there’s Hulu… How many of these things are people gonna subscribe to? Besides the money, what a pain in the ass. People might want one or two things and see anything else as a hassle.

So what does that mean for the sophisticated shows that people have gotten accustomed to? Does that winnowing pose a threat to them?

Right now it’s a winner-take-all market: You’re gonna see people risking stuff. There’s that show “Vinyl,” produced by Mick Jagger and Martin Scorsese: People are gonna throw a lot of things against the wall to see what sticks. What’s gonna happen is what happened to movies: People will say, The really hard part is the marketing, let’s try to base this on a new factor. That needn’t get you “Ant-Man” -- “Better Call Saul” has a built-in audience. We’re already seeing some of this: The two biggest new AMC shows are “Better Call Saul” and “Fear the Walking Dead,” because they have a built-in audience. That doesn’t have to lead you to mediocrity, but there’s a lot of pressure… People thought “Tyrant” was a good show on FX, or this Denis Leary rock ‘n’ roll show. But it’s a lot easier to do a spinoff of something. When you have a rush into a profitable business, you tend to get an emphasis on marketing. You can spend a lot of money, or you can ride on things people know. Does Turtle from “Entourage” get his own show? Remember “Joanie Loves Chachi”? It’s not like the characters were so unbelievably fascinating.
It was, “Hey, we need to launch 10 new shows this year. The other networks are launching 10 new shows. What can we do to set ourselves apart? We can adapt something… Or find a story people already know and keep telling it.” At some point, people will stop making high-end expensive shows.

So there’s so much stuff out there competing for people’s time, we might get a “Game of Thrones” – that had a fanbase from the books – but we might not get “Mad Men” or “The Wire,” which don’t have immediate name recognition.

It’s not that we won’t get it. But it will be harder. And even if you do everything right, you can have business problems.

Why?

People are watching stuff online. If you have to do these shows on a YouTube budget, it’s gonna be really hard.

Published on August 23, 2015 15:00

The science of forgiveness: “When you don’t forgive you release all the chemicals of the stress response”

The Burn Surgeon: How Anger Can Impede Healing

In 1978, Dr. Dabney Ewin, a surgeon specializing in burns, was on duty in a New Orleans emergency room when a man was brought in on a gurney. A worker at the Kaiser Aluminum plant, the patient had slipped and fallen into a vat of 950-degree molten aluminum up to his knees. Ewin did something that most would consider strange at best or the work of a charlatan at worst: He hypnotized the burned man. Without a swinging pocket watch or any other theatrical antics, the surgeon did what’s now known in the field of medical hypnosis as an “induction,” instructing the man to relax, breathe deeply, and close his eyes. He told him to imagine that his legs—scorched to the knees and now packed in ice—did not feel hot or painful but “cool and comfortable.” Ewin had found that doing this—in addition to standard treatments—improved his patients’ outcomes. And that’s what happened with the Kaiser Aluminum worker. While such severe burns would normally require months to heal, multiple skin grafts, and maybe even lead to amputation if excessive swelling cut off the blood supply, the man healed in just eighteen days—without a single skin graft.

As Ewin continued using hypnosis to expedite his burn patients’ recoveries, he added another unorthodox practice to his regimen: He talked to his patients about anger and forgiveness. He noticed that people coming into the ER with burns were often very angry, and not without reason. They were, as he put it, “all burned up,” both literally and figuratively. Hurt and in severe pain due to their own reckless mistake or someone else’s, as they described the accident that left them burned, their words were tinged with angry guilt or blame. He concluded that their anger may have been interfering with their ability to heal by preventing them from relaxing and focusing on getting better. “I was listening to my patients and feeling what they were feeling,” Ewin told me. “It became obvious that this had to be dealt with. Their attitude affected the healing of their burns, and this was particularly true of skin grafts. With someone who’s real angry, we’d put three or four skin grafts on, but his body would reject them.” Whenever a patient seemed angry, Ewin would help them forgive themselves or the person who hurt them, either through a simple conversation or through hypnosis.

Ewin, now eighty-eight and semiretired after practicing surgery and teaching medical hypnosis at the Tulane University School of Medicine for more than thirty years, became interested in hypnosis while he was a young doctor training under the legendary Dr. Champ Lyons, who pioneered the use of penicillin and treated survivors of the famous Cocoanut Grove nightclub fire in Boston in 1942. As Ewin learned to stabilize patients and conduct skin grafts, he wondered about an intriguing practice that he’d learned of from his great uncle. As an independently wealthy “man of leisure” in Nashville, this uncle had dabbled in hypnosis. He even held séances, which had become so popular in the late 1800s that First Lady Mary Todd Lincoln held them in the White House to attempt to reach the spirit of her dead son. (President Abraham Lincoln reportedly attended.) Many of the most popular séance leaders were eventually exposed as frauds exploiting the grief-stricken, but Ewin’s uncle found another forum for hypnosis that was less controversial than hypnotizing an audience into believing that dead friends were speaking to them.
He hypnotized the patients of surgeon friends before they went under the knife in order to minimize their pain. (This was before anesthesia was widely used.) Ewin took a few hypnosis courses to find out more. “I figured it couldn’t hurt,” he told me in his friendly New Orleans drawl when I reached him at home by phone. Once he started trying hypnosis on his burn patients, he noticed a difference immediately. If he could reach them within half an hour of the injury, the hypnotic suggestions of “coolness and calm” seemed to halt the continued burning response of the skin that usually occurs for twelve to twenty-four hours, leading to speedier recoveries. (While there are no empirical studies of hypnosis on burn patients and Ewin’s data is anecdotal, multiple studies do show that hypnosis can alleviate symptoms and improve medical outcomes in various scenarios, from asthma and warts to childbirth and post-traumatic stress disorder.)

Once Ewin began helping his patients forgive, he noticed even more improvement. “What you’re thinking and feeling affects your body,” he would explain to his patients, using the analogy of something embarrassing causing someone to blush. “What you’re feeling will affect the healing of your skin, and we want you to put all your energy into healing.” At this point, he would learn how the victim had unthinkingly opened a blast furnace without turning it off, or how the workmen at a construction site had repeatedly told the boss about a dangerously placed can of gasoline, to no avail. “I’d do hypnosis with them and help them forgive themselves or the other person,” Ewin said. “I’d say, ‘You can still pursue damages through an attorney. You’re entitled to be angry, but for now I’m asking you to abandon your entitlement and let it go, to direct your energy toward healing, and turn this over to God or nature or whoever you worship. It’s not up to you to get revenge on yourself or someone else. When you know at a feeling level that you’re letting it go, raise your hand.’ Then I’d shut up, they’d raise their hand, and I’d know that skin graft was gonna take.”

Ewin taught other burn doctors what he discovered, and has received letters from colleagues in burn units around the world thanking him for helping them achieve faster recovery times for their patients.

The Investor Turned Research Patron: How Forgiveness Hit Mainstream Science

Like Dabney Ewin, John Templeton was a son of the South, a man of letters who came of age during the Depression and combined his success with less mainstream pursuits. Born to a middle-class family in Winchester, Tennessee, in 1912, Templeton managed to put himself through Yale after the 1929 stock market crash and became a Rhodes Scholar at Oxford. He launched his career on Wall Street by taking the “buy low, sell high” mantra to the extreme, borrowing money at the onset of World War II to buy one hundred shares each in 104 companies selling at one dollar per share or less, including 34 companies that were in bankruptcy. He reaped a healthy profit on all but four. Templeton entered the mutual funds business in the fifties, eventually selling his Templeton Funds to the Franklin Group in 1992.
Money magazine called him “arguably the greatest global stock picker of the century.” Yet Templeton was equally passionate about spirituality, morality, and science, and how the scientific method could increase our understanding of life’s “Big Questions"—questions about the nature of consciousness and the role that love and creativity, compassion and forgiveness, play in all areas of human life. In 1987, Templeton founded the John Templeton Foundation, dedicated to funding scientific research “on subjects ranging from complexity, evolution, and infinity, to creativity, forgiveness, love, and free will.” With the motto “How little we know, how eager to learn,” Templeton sought research grantees who were “innovative, creative, and open to competition and new ideas.” Templeton announced the Campaign for Forgiveness Research in 1997, a funding initiative for scientists in multiple disciplines who were interested in taking forgiveness out of the purview of religion and using rigorous scientific protocol to determine its effects on the body and mind. Spearheading the campaign was Dr. Everett Worthington, a psychology professor at Virginia Commonwealth University. One of the first psychologists to create therapeutic tools using forgiveness, he came to the topic through personal tragedy: His elderly mother was bludgeoned to death by an intruder, and, in part because of her death, his brother committed suicide. Struggling with rage and grief, Worthington switched his focus from marriage counseling to forgiveness. He designed a research framework for the Campaign for Forgiveness Research, Archbishop Desmond Tutu became a cochair for the campaign, and the Templeton Foundation provided a $5 million grant. Between 1998 and 2005, the foundation, along with thirteen partners including the Fetzer Institute, a Michigan-based nonprofit that funds research and educational projects focused on love and forgiveness, dedicated $9.4 million to 43 scientific studies on the health impacts of forgiveness. Whereas before, Worthington and a few other researchers were alone in their pursuits (and most of their research was aimed at affirming their own therapeutic models), the Campaign for Forgiveness Research took a traditionally religious concept and placed it firmly on the scientific landscape. In addition to funding researchers directly, the campaign sparked dialogue and interest in the broader scientific community. While in 1998 there were 58 empirical studies on forgiveness in the research literature, by 2005, when the campaign concluded, there were 950. Throughout the process, Templeton was highly engaged. Even into his eighties, he was known to walk waist-deep in the surf for an hour near his Bahamas home each morning before sitting down to read grant proposals. When he died at ninety-five, he was lauded by both the business and scientific communities. The Wall Street Journal called him the “maximum optimist,” whose confidence in rising stocks paid off and whose philanthropy left an enduring legacy. 
The leading scientific journal Nature wrote, “His love of science and his God led him to form his foundation in 1987 on the basis that mutual dialogue might enrich the understanding of both.” While it’s up for debate whether the research Templeton funded has enriched our understanding of God, it certainly has enriched our understanding of forgiveness, demonstrating that what was traditionally seen as a religious ideal is actually an important skill for anyone, whether atheist, agnostic, or believer, who seeks to live a healthy, happy life.

The Science of Forgiveness

One of the researchers who participated in the Campaign for Forgiveness Research was Dr. Robert Enright, a developmental psychologist at the University of Wisconsin–Madison. Enright began contemplating forgiveness back in the mid-eighties. As a Christian, he’d been raised on Jesus’ teachings about tolerance and forgiveness. He asked himself: Could forgiveness help patients in a clinical setting? In spite of skeptical colleagues who ridiculed him for applying science to something so “mushy” and “religious,” he designed forgiveness interventions for therapy and studied their psychological and physiological impacts.

He began by developing therapies aimed at helping elderly women to forgive those who had wronged them in the past, and to help victims of abuse and incest to understand their tormentors without justifying the abusers’ actions. His initial findings were encouraging. His first study, which compared women undergoing forgiveness therapy with a control group who underwent therapy for emotional wounds without a forgiveness focus, found that the experimental group improved more in emotional and psychological health measures than the control group. It was published in the journal Psychotherapy in 1993. Afterward, Enright honed his therapeutic forgiveness tools, from helping people develop empathy—the ability to understand and share the feelings of another—toward aggressors, to learning to forgive and accept themselves, and tested them on a range of groups. Among battered women and “parental love–deprived college students,” for instance, those subject to forgiveness therapy showed more improvement in emotional and psychological health than control groups who received therapy without a forgiveness focus.

Enright’s forgiveness model has four parts: uncovering your anger, deciding to forgive, working on forgiveness, and discovery and release from emotional prison. All take place through therapist-patient dialogue. Uncovering anger means examining how you’ve both avoided and dealt with it, and exploring how the offense and resulting anger has changed your health, worldview, and life in general. The next phase, deciding to forgive, involves learning about what forgiveness is and what it’s not, acknowledging that the ways you’ve dealt with your anger up until now haven’t worked, and setting the intention to forgive. Next, working on forgiveness entails confronting the pain the offense has caused and allowing yourself to experience it fully, then working toward developing some level of understanding and compassion for the offender.
The final phase includes acknowledging that others have suffered as you have and that you’re not alone (for some, this means connecting with a support group of people who have endured a similar experience), examining what possible meaning your suffering could have for your life (learning a particular life lesson, perhaps contributing to one’s strength or character, or prompting one to help others), and taking action on whatever you determine to be your life purpose. Since developing that therapy model and pioneering the first studies, Enright and his colleagues have found positive results in drug rehabilitation participants (less anger, depression, and need for drugs compared to the control group receiving standard therapy), victims of domestic violence (decreased anxiety, depression, and post-traumatic stress disorder relative to the control group), and terminally ill cancer patients (more hope for the future and less anger than the control group). When it comes to determining the existence of a causal relationship between forgiveness and physical health, Enright says the most definitive study he has done was conducted with a team of researchers on cardiac patients. Published in 2009 in the journal Psychology & Health, their analysis found that when cardiac patients with coronary heart disease underwent forgiveness therapy, the rate of blood flow to their hearts improved more than that of the control group, which received only standard medical treatment and counseling about diet and exercise. “It wasn’t that they were cured—these were patients with serious heart problems,” Enright says. “But they were at less risk of pain and sudden death.” Those results echo studies by another Templeton grantee, Charlotte Witvliet, a psychology professor at Hope College; and Sonja Lyubomirsky, a psychology professor at the University of California, Riverside, and author of numerous books on happiness, which found that people who forgive more readily have fewer coronary heart problems than those who hold grudges. Perhaps the most comprehensive body of evidence showing links between forgiveness and health focuses on mood, says Dr. Frederic Luskin, the cofounder of the Stanford Forgiveness Project, an ongoing series of workshops and research studies at Stanford University. Researchers who measure emotional and psychological health outcomes following therapy that includes forgiveness are quantifying patients’ levels of anger, anxiety, and depression, concluding in multiple studies that forgiveness elevates mood and increases optimism, while not forgiving is positively correlated with depression, anxiety, and hostility. Like Enright, Luskin has developed ways to teach forgiveness in various places and with various groups, including war-ravaged populations in countries such as Northern Ireland and Sierra Leone, and he asserts that anyone—from jilted spouses to widows who have lost husbands to terrorism—can heal. Luskin developed a weeklong “forgiveness training” delivered in a group setting. In it, he leads participants through a series of discussions and exercises. The first steps involve teasing apart what he calls “your grievance story,” which is usually formed by taking something personally that wasn’t necessarily personal, and then blaming someone for your feelings. 
His argument is that when you blame someone for how you feel instead of holding them to account for their actions, you keep yourself stuck in victimhood and inaction (resenting your ex for her drinking and destructive behavior, for instance, instead of just seeking a restraining order). Luskin has participants “find the impersonal in the hurt” by realizing how many other people have experienced a similar offense or disappointment and how common it is, as well as acknowledging that most offenses are committed without the intention of hurting anyone personally. (If your mother yelled at you, for example, she likely did so not because her goal was to hurt your feelings and forever damage your self-confidence, but because she was stressed or afraid.) This doesn’t negate that often there is a personal aspect to an offense, Luskin says, but it can lessen the pain and blame.

“When you don’t forgive you release all the chemicals of the stress response,” Luskin says. “Each time you react, adrenaline, cortisol, and norepinephrine enter the body. When it’s a chronic grudge, you could think about it twenty times a day, and those chemicals limit creativity, they limit problem-solving. Cortisol and norepinephrine cause your brain to enter what we call ‘the no-thinking zone,’ and over time, they lead you to feel helpless and like a victim. When you forgive, you wipe all of that clean.”

One of the main areas funded by the Templeton grant was the neuroscience of forgiveness. Around the time of the award, functional MRI, or fMRI, scanners were becoming increasingly common and sparking new discoveries in a variety of areas. The machines enable neuroscientists to capture images of people’s brains in action to observe blood flow and see which brain components are activated in which situations. In 2001, Dr. Tom Farrow of the University of Sheffield in the United Kingdom used fMRI scanners to conduct the first scientific study of the “functional anatomy” of forgiveness.

Using ten subjects, he had each person climb into his laboratory’s fMRI scanner and asked them to answer a series of questions designed to evoke empathy and forgiveness. The empathy-related questions asked participants to consider potential explanations for someone’s emotional state (if your boss is unusually quiet or withdrawn, for instance, is it more likely that her child was expelled from school or that her child was caught shoplifting?), while the forgiveness-related questions asked people to evaluate which crimes they considered more forgivable (a neighbor who recently lost his job getting arrested for assaulting his girlfriend or for assaulting his boss?). Farrow and his team found that empathy and “forgivability judgments,” basically contemplating whether a certain action deserves forgiveness, activate various parts of the frontal lobe, which is associated with problem-solving and reason. In contrast, a researcher named Dr. Pietro Pietrini at the University of Pisa in Italy showed in a 2000 fMRI study that anger and vengeance inhibited rational thinking and caused high activity in the amygdala, which is involved in the fight-or-flight response. Anger and rage, then, impede reason, but the tasks involved in the complex process of forgiveness activate the more recently evolved parts of our brain, such as the prefrontal cortex and posterior cingulate, which are concerned with problem-solving, morality, understanding the mental states of others, and cognitive control of emotions.
Having cognitive control means inhibiting impulsive reactions fueled by rage and hatred toward a wrongdoer. This can be done through thought, such as by devising a new, less upsetting interpretation of a painful event. When it comes to being hurt, this can mean viewing an infraction as less personal than you thought, or developing an understanding of someone’s actions by considering his point of view. Psychologists call this “reframing” a painful memory. It’s a key part of both Enright’s forgiveness therapy and Luskin’s forgiveness training. Taking things less personally is something I realized would benefit me and reduce a lot of my suffering. In my new relationship with Anthony, for instance, I would sometimes feel hurt when he teased me about something, whether my penchant for driving under the speed limit or the time I left a steak to thaw on the counter and his hundred-pound dog easily ate it. When I realized that he didn’t mean to hurt my feelings and was just making a good-natured joke, I was less likely to take offense and get upset. Another way to reframe is to consider a range of possible points of view that led someone to act a certain way. This makes it more difficult to blame and demonize that person and continue generating the same level of resentment as you did before. I once spent days feeling resentful about a former editor’s criticism about a story—which I thought was harsh and took personally. When a colleague suggested that what he said likely came from a deep commitment to accuracy and excellence, I let it go and felt a lot better. A third way to reframe is to consider what constructive learning, meaning, or opportunity may have resulted from an offense and the suffering it caused. For Azim, that was the opportunity to work with youth and prevent violence, and for my more mundane example about editorial feedback, it was a lesson about being more diligent in checking my facts and considering my approach to a story. Thanks to fMRI scanners, we can now identify the parts of the brain that make this sort of reframing practice possible. In one study, Farrow focused on two groups of people who struggle with empathy and, by extension, forgiveness: schizophrenics and people suffering from post-traumatic stress disorder. Both showed inhibited activity in the areas of the brain involved in forgiveness processes such as empathy and viewing another person’s perspective. But after ten weeks of therapy that included the discussion and practice of forgiveness (and antipsychotic drugs for the schizophrenics), those brain areas’ functions improved. While Farrow didn’t use a control group to isolate and test the therapeutic forgiveness intervention specifically, the findings confirm the earlier evidence of so-called forgiveness areas in the brain, and show that psychological treatments such as cognitive behavioral therapy can improve this aspect of brain function. In a separate experiment, Pietrini asked ten participants to lie in the scanner and consider a fictional scenario in which they were wronged and then forgave. As with the prior studies, both the dorsal prefrontal cortex (involved in cognitive control) and the posterior cingulate (involved in understanding the mental states of others) lit up on the screen. But a third part was also involved: the anterior cingulate cortex, which mediates the perception and the suppression of moral pain (such as the feeling of being wronged). Pietrini’s interpretation? Forgiveness could be viewed as a sort of painkiller for moral distress. 
When Pietrini presented his findings at a 2009 conference, he described them as evidence that forgiveness likely evolved as a way to overcome pain and alleviate suffering, and that even though it involves parts of the brain responsible for reason, it also requires a counterintuitive and, some would argue, irrational choice: “You wronged me, but I forgive you, anyway.” “A great deal of evidence converges suggesting that forgiveness is a positive, healthy strategy for the individual to overcome a situation that otherwise would be a major source of stress from a psychological and neurobiological point of view,” he wrote to me in an e-mail. “The fact that forgiving is a healthy resolution of the problems caused by injuries suggests that this process may have evolved as a favorable response that promotes human survival.” From "TRIUMPH OF THE HEART: Forgiveness in an Unforgiving World" by Megan Feldman Bettencourt. Published by arrangement with Avery, an imprint of Penguin Publishing Group, a division of Penguin Random House LLC. Copyright © 2015 by Megan Feldman Bettencourt.

Published on August 23, 2015 13:00

Robots are coming for your job: We must fix income inequality, volatile job markets now — or face sustained turmoil

Global warming in and of itself isn’t a problem. After all, life on earth has survived numerous cycles of cooling and heating. The real problem with global warming is how quickly it happens. If there isn’t enough time for living things (including us) to adapt, rapid changes in climate, not to mention more volatile weather patterns, can sow havoc. The consequences of catastrophic climate change can reverberate for centuries as species suffer horrific losses of their habitat, leading to mass extinctions. The impact of technological change on our labor markets works the same way. As long as change is gradual, the markets can respond. Too fast, and it’s chaos. And as with my particular environmental preferences, it creates winners and losers. The likely accelerating effect of recent advances in artificial intelligence on technological change is going to roil our labor markets in two fundamental ways. The first is the simple truth that most automation replaces workers, so it eliminates jobs. That means fewer places for people to work. This threat is easy to see and measure— employers roll in a robot and walk a worker to the door. But sometimes change is less visible. Each new workstation may eliminate the need for one-fifth of a salesperson, or free Skype calls may allow you to work more productively at home one day a week, deferring the need for that new hire until next quarter. If this happens slowly, the resulting improvements in productivity and reduced cost eventually create wealth, stimulating job growth that compensates for the losses. The growth may be directly in the newly improved enterprise, as lower prices and better quality increase sales, creating a need to hire more workers. Or it may be in distant parts of the economy where the customers who no longer need to pay as much for some product or service decide to spend the money they saved. If new drilling technologies cause natural gas prices to drop, there’s more left over from your paycheck to save for that sailboat you’ve got your eye on. But the second threat is much more subtle and difficult to predict. Many technological advances change the rules of the game by permitting businesses to reorganize and reengineer  the  way they operate. These organizational and process improvements often make obsolete not only jobs but skills. A teller may get laid off when a bank installs ATMs; the improved service creates a need to hire network engineers but not tellers. Even if the bank ultimately expands its total workforce, the tellers remain out of luck. Weavers can eventually learn to operate looms, gardeners to service lawnmowers, and doctors to use computers to select the right antibiotics—once they accept that synthetic intellects are superior to their own professional judgment. But learning the new skills doesn’t happen overnight, and sometimes the redundant workers simply aren’t capable of adapting—that will have to wait for a new generation of workers. For an example of labor market transformation that we have weathered successfully, consider agriculture. As recently as the early 1800s, farms employed a remarkable 80 percent of U.S. workers. Consider what this means. Producing food was by far the dominant thing people did for a living, and no doubt this pattern had been typical since the invention of agriculture about five thousand years ago. But by 1900, that figure had dropped in half, to 40 percent, and today it’s only 1.5 percent, including unpaid family and undocumented workers. 
Basically, we managed to automate nearly everyone out of a job, but instead of causing widespread unemployment, we freed people up for a host of other productive and wealth-producing activities. So over the last two centuries the U.S. economy was able to absorb on average about a half-percent loss of agricultural job opportunities each year without any obvious dislocations.

Now imagine that this had happened in two decades instead of two centuries. Your father worked on a farm, and his father before him, as far back as anyone could remember. Then a Henry Ford of farming revolutionized the entire industry in what seemed like a flash. The ground shook with the rumble of shiny new plows, threshers, and harvesters; the air was thick with the smell of diesel. Food prices plummeted, and corporations bought up farmland everywhere with the backing of deep-pocketed Wall Street financiers. Within a few years, your family’s farm was lost to foreclosure, along with every possession except the family Bible. You and your five brothers and sisters, with an average third-grade education, found your skills of shoeing horses, plowing straight furrows, and baling hay utterly useless, as did all of your neighbors. But you still had to eat. You knew someone who knew someone who operated one of the new machines twelve hours a day in return for three squares, who supposedly got the job in Topeka, so you moved to one of the vast tent cities ringing the major Midwestern cities in the hope of finding work—any kind of work. Before long, you got word that your parents sold the Bible to buy medicine for your youngest sister, but she died of dysentery anyway. Eventually you lost track of your other siblings. The 1 percent who still had jobs lived in tiny tract houses and barely got by, but they were nonetheless the envy of the rest—at least they had a solid roof over their heads. Each day, you waited in line outside their gated communities hoping for a chance to wash their clothes or deliver their bag lunches. Rumors spread that the daughters of the storied entrepreneur who changed the world had used his vast fortune to build a fabulous art museum made of crystal in a small town in Arkansas. But all this was before the revolution. After that, things got really bad.

I’m going to argue that a similarly tectonic shift looms ahead, though doubtlessly less dramatic and more humane. Forged laborers will displace the need for most skilled labor; synthetic intellects will largely supplant the skilled trades of the educated. When initially deployed, many new technologies will substitute directly for workers, getting the job done pretty much the same way. But other innovations will not only idle the workers; they will eliminate the types of jobs that they perform. For example, consider the way Amazon constantly adapts the stock patterns in its warehouses. If a person were to do the warehouse planning (as in many more traditional companies), products might be organized in a logical and comprehensible way—identical items would be stored next to each other, for example, so when you needed to pick one, you knew where it was. But a synthetic intellect of the sort Amazon has built isn’t subject to this constraint. Like items can be located next to others that are frequently shipped with them, or on any shelf where they fit more compactly. To the human eye, it looks like chaos—products of different sizes and shapes are stacked randomly everywhere—which is why this type of warehouse organization is known as chaotic storage.
But a synthetic intellect can keep track of everything and direct a worker to exactly the right place to fulfill an order far more efficiently than a human organizer could. A side effect of introducing this innovation is that it reduces the training and knowledge required of warehouse workers, making them more susceptible to replacement by forged laborers. These employees no longer have to be familiar with the location of products on the shelves; indeed, it would be near impossible to do so in such a haphazard and evolving environment. Having first simplified the skills required to get the job done, Amazon can now replace the workers that roam the warehouse floor picking those orders. This is likely why the company bought the robotics company Kiva Systems, reportedly for $775 million, in 2012. This is a single example of a profound shift that synthetic intellects will cause in our world. The need to impose order—not only for warehouses but for just about everything—is driven by the limitations of the human mind. Synthetic intellects suffer no such constraint, and their impact will turn tidiness to turmoil in many aspects of our lives. Our efforts to tame our intellectual and physical domains into manicured gardens will give way to tangled thickets, impenetrable by us.

When most people think about automation, they usually have in mind only the simple replacement of labor or improving workers’ speed or productivity, not the more extensive disruption caused by process reengineering. That’s why some jobs that you might least expect to succumb to automation may nonetheless disappear. For instance, studies often cite jobs that require good people skills or powers of persuasion as examples of ones unlikely to be automated in the foreseeable future. But this isn’t necessarily the case. The ability to convince you that you look terrific in a particular outfit is certainly the hallmark of a successful salesperson. But why do you need that person when you can ask hundreds of real people? Imagine a clothing store where you are photographed in several different outfits, and the images are immediately (and anonymously, by obscuring your face) posted to a special website where visitors can offer their opinion as to which one makes you look slimmer. Within seconds, you get objective, statistically reliable feedback from impartial strangers, who earn points if you complete a purchase. (This concept is called “crowdsourcing.”) Why put your faith in a salesperson motivated by commission when you can find out for sure?

Reflecting these two different effects of automation on labor (replacing workers and rendering skills obsolete), economists have two different names for the resulting unemployment. The first is “cyclical,” meaning that people are cycling in and out of jobs. In bad times, the pool of people who are between jobs may grow, leading to higher unemployment. But historically, as soon as the economy picks up, the idled workers find new jobs. Fewer people are unemployed and for shorter periods of time. This works just like the housing market: in a slow market, there are more houses available and the ones on the market take longer to sell. But when the market turns around, this excess inventory is quickly absorbed. I was surprised to learn just how much turnover there is in the U.S. labor market. In 2013, a fairly typical year, 40 percent of workers changed jobs. That’s a very fluid market. By contrast, less than 4 percent of homes are sold each year.
So when we talk about 8 percent unemployment, it doesn’t take long for small changes in the rates of job creation and destruction to soak that up, or conversely to spill more people out of work. The other type of unemployment is called “structural,” which means that some group of unemployed simply can’t find suitable employment at all. They can send out résumés all day long, but no one wants to hire them, because their skills are a poor match for the available jobs. The equivalent in the housing market would be if the types of houses available weren’t suitable for the available buyers. Suddenly couples start having triplets instead of single kids and so need more bedrooms, or people start commuting to work in flying cars that can take off only from flat rooftops, while most houses have pitched roofs. As you can see from my fanciful examples, the factors that change the desirability of housing don’t usually change very fast, so builders and remodelers have plenty of time to adapt. But this isn’t true for automation because the pace of invention and the rate of adoption can change quickly and unpredictably, shifting the character of whole labor market segments far more rapidly than people can learn new skills—if they can be retrained at all. We’re buffeted about by these fickle winds precisely because they are hard to anticipate and virtually impossible to measure. Economists and academics who study labor markets  have a natural bias toward the quantifiable. This is understandable, because to credibly sound the alarm, they must have the hard data to back it up. Their work must stand up to objective, independent peer review, which basically means it must be reduced to numbers. But as I learned in business, spreadsheets and financial statements can capture only certain things, while trends that resist reduction to measurement often dominate the outcome. (Indeed, there’s an argument to be made that the troublesome and unpredictable business cycles plaguing our economy are largely driven by the fact that returns are easily quantified, but risks are not.) I can’t count the number of meticulously detailed yet bogus sales projections I’ve seen bamboozle management teams. At work I sometimes felt my most important contribution as a manager was anticipating that which had yet to manifest itself in quantifiable form. But talking about the overall labor market, unemployment statistics, or the aggregate rate of change obscures the reality of the situation because the landscape of useful skills shifts erratically. The complexity of this web of disappearing labor habitats and evolving job ecosystems resists analysis by traditional mathematical tools, which is why attempts to quantify this whole process tend to bog down in reams of charts and tables or devolve into hand-waving. Luckily I’m not bound by these same professional constraints, so fasten your seat belt for a quick tour of the future. My approach will be to look at some specific examples, then attempt to reason by analogy to get a broader picture. Let’s start with retail—the largest commercial job market, as determined by the U.S. Bureau of Labor Statistics (BLS). The BLS reports that about 10 percent of all U.S. workers are employed in retailing, or approximately 4.5 million people. To analyze trends, let’s use salespersons as a proxy for the whole group. The BLS projects that this labor force, which stood at 4.4 million in 2012, will grow by 10 percent to 4.9 million over the next ten years. 
But this is based on current demographic trends, not a qualitative analysis of what’s actually going on in the industry. To get a sense of what’s really going to happen, consider the effect on employment of the transition from bricks-and-mortar stores to online retailers. A useful way to analyze this is to use a statistic called revenue per employee. You take the total annual revenue of a company and divide it by the number of employees. It’s a standard measure of how efficient a company is, or at least how labor-efficient. Average revenue per employee for Amazon (the largest online retailer) over the past five years is around $855,000. Compare that to Walmart (the largest bricks-and-mortar retailer), whose revenue per employee is around $213,000—one of the highest of any retailer. This means that for each $1 million in sales, Walmart employs about five people. But for the same amount of sales, Amazon employs slightly more than one person. So for every $1 million in sales that shift from Walmart to Amazon, four jobs are potentially lost. Now, both companies sell pretty much the same stuff. And Walmart does a good portion of its sales online as well, so the job loss implied by the shift to online sales is understated. And neither company is standing still; both are likely to grow more efficient in the future. Excerpted from "Humans Need Not Apply: A Guide to Wealth and Work in the Age of Artificial Intelligence" by Jerry Kaplan, published by Yale University Press. Copyright © 2015 by Jerry Kaplan. Reprinted by permission of the publisher. All rights reserved.
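For readers who want to check the revenue-per-employee comparison in the excerpt above, here is a minimal Python sketch that simply restates Kaplan’s arithmetic. The dollar figures are the rounded averages quoted in the excerpt, not freshly verified numbers, and the helper names are just illustrative.

```python
# Sketch of the revenue-per-employee arithmetic described in the excerpt.
# Revenue per employee = total annual revenue / number of employees;
# the excerpt quotes the per-employee figures directly, so we start there.

def jobs_per_million(revenue_per_employee: float) -> float:
    """Roughly how many employees stand behind each $1 million in sales."""
    return 1_000_000 / revenue_per_employee

walmart_rpe = 213_000   # approx. revenue per employee cited for Walmart
amazon_rpe = 855_000    # approx. revenue per employee cited for Amazon

walmart_jobs = jobs_per_million(walmart_rpe)  # ~4.7, i.e. "about five people"
amazon_jobs = jobs_per_million(amazon_rpe)    # ~1.2, i.e. "slightly more than one"

print(f"Walmart: {walmart_jobs:.1f} jobs per $1M in sales")
print(f"Amazon:  {amazon_jobs:.1f} jobs per $1M in sales")
# The difference (~3.5, rounded up to four in the excerpt) is the number of
# jobs potentially lost for each $1M in sales that shifts from Walmart to Amazon.
print(f"Difference: {walmart_jobs - amazon_jobs:.1f} jobs per $1M shifted online")
```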

Published on August 23, 2015 11:00

Meet the Tea Party’s evangelical quack: David Barton is Glenn Beck’s favorite “historian”

The popular dissemination of Reconstructionist ideas is evident in the framing and language used by people in the religious right, if you have an ear for it. I think of this as analogous to the way in which a New Englander can hear the difference between a Maine accent and a Boston one, or how a Southerner can tell if a speaker is from North Carolina or South Carolina; it is subtle, but it is undeniably there. There is perhaps no better example of Christian Reconstructionist influence on the broader culture than the work of Tea Party “historian” David Barton. Barton does not explicitly identify as a Christian Reconstructionist, and Christian Reconstructionists would not claim him as one of their own. Barton does have ties to several Reconstructionist groups, including the Providence Foundation; he occasionally cites the work of Rousas Rushdoony and promotes views on race and slavery that are rooted in Rushdoony. While Barton doesn’t use the language of theonomy or postmillennialism, as we will see, he speaks of dominion, biblical law, the necessity of bringing every area of life under the lordship of Christ, and the sphere sovereignty of biblically ordained institutions. He embraces the whole range of political views advocated by Reconstructionists, from the right to life and creationism to more narrowly held positions on issues such as the history of slavery and opposition to the Federal Reserve System. As we shall see, the approach to history that has made Barton famous is rooted in Rushdoony’s biblical philosophy of history.

Barton was born in 1954, raised in Aledo, Texas, and graduated from public high school in 1972, the same year his parents started a house church with Pentecostal leanings. By 1974 the church had moved into facilities that now also house the Christian school they started in 1981, as well as Barton’s organization, Wallbuilders. After high school, Barton attended Oral Roberts University, where he received a degree in religious education in 1976. Upon returning home, he became principal of Aledo Christian School until, a decade later, as he tells it in an interview, God led him to his first book by showing him the connection between the Supreme Court decisions on prayer and Bible reading and “plummeting” academic achievement scores and “soaring” student crime and immorality:
In July 1987, God impressed me to do two things. First, I was to search the library and find the date that prayer had been prohibited in public schools. Second, I was to obtain a record of national SAT scores . . . spanning several decades. I didn’t know why, but I somehow knew that these two pieces of information would be very important.
The result was his America: To Pray or Not to Pray, which is a statistical analysis of the “volume of prayers being offered” overlaid with data on a number of social problems, to compare the “prayer years” with the “post-prayer years.” According to Barton, the drop in prayer was so dramatic that its impact was felt not just in the schools but in every aspect of our national life. Barton seemed unaware of the notion that correlation is not causation.

A self-styled historian with no real academic credentials, Barton went on to build an extensive collection of primary source documents from America’s founding era and write several “Christian American history” books arguing that the founding fathers intended America to be a Christian nation and advocating a Christian reading of the Constitution they wrote. This work has shaped a generation of Christian school and homeschool students. Though his work has been roundly rejected by scholars, Barton claims to be a “recognized authority in American history and the role of religion in public life.” For example, an amicus brief filed by Wallbuilders in McCollum v. California Department of Corrections and Rehabilitation claims Barton works as a consultant to national history textbook publishers. He has been appointed by the State Boards of Education in states such as California and Texas to help write the American history and government standards for students in those states. Mr. Barton also consults with Governors and State Boards of Education in several states, and he has testified in numerous state legislatures on American history. Examples include a 1998 appointment as an advisor to the California Academic Standards Commission and a 2009 appointment as a reviewer in the Texas Board of Education’s effort to revise the state’s social science curriculum. In each case, Barton was one of three conservative “outside experts” appointed to review the curriculum to ensure that children learn that America was founded on biblical principles. As “experts” they sought changes to the curriculum to ensure that Christianity was presented “as an overall force for good—and a key reason for American Exceptionalism, the notion that the country stands above and apart.” Indeed, when Barton invoked his position as a curriculum consultant on Jon Stewart’s Daily Show, Stewart asked for whom he had done this work, and Barton refused to name anyone, saying “if they don’t name names then I don’t.”

In 2005 Barton was included in Time magazine’s list of the twenty-five most influential evangelicals, but it was his association with Fox News’ Glenn Beck, who called him the most important man in America, that catapulted him into another level of influence. By 2011 Barton could boast that Republican primary presidential candidates Newt Gingrich and Michele Bachmann consulted him. Bachmann even invited him to speak to her congressional Tea Party Caucus on the history of the Constitution. Mike Huckabee infamously said that “every American should be forced to listen to Barton at gunpoint.”

Barton’s presentation style makes on-the-spot critical engagement difficult. He jumps, at lightning speed, from one piece of data to another, interpreted through his “biblical” framework; he creates a barrage of information, tied to small pieces of familiar truth and rooted in an apparently vast collection of primary documents. Barton is one of the very best examples of the way in which the Tea Party is about much more than taxes, and he’s been at the center of its rise.
In addition to being promoted by Glenn Beck, he travels the country presenting his Constitutional Seminars and selling materials promoting his views to churches, civic organizations, Christian schools, and Christian homeschoolers. Barton’s work has been the subject of extensive critique by bloggers, reporters, and other critics, some of whom are scholars publishing peer-reviewed critiques, but, for the most part, scholars have not devoted a lot of attention to debunking his claims. Beginning in about 2011, two conservative Christian professors from Grove City College, Warren Throckmorton, professor of psychology, and Michael Coulter, professor of humanities and political science, published a critique of Barton’s The Jefferson Lies entitled Getting Jefferson Right: Fact Checking Claims about Our Third President. The book was well received by scholars, and the authors’ credentials as conservative Christians undermined Barton’s defense that criticism of his work was ideological rather than factual. The Jefferson Lies was withdrawn by its publisher. One might expect that, under the weight of such resounding rejection, Barton would disappear into obscurity. Yet Barton’s supporters remain as devoted as before. Criticism from scholars (whether Christian or not) is dismissed as liberal, socialist, and even pagan. Discredited in the larger culture, Barton remains influential in the conservative Christian subculture.

Barton and the Constitution

In 2011, Barton’s radio program Wallbuilders Live carried a three-part series on the Constitution and “the principles of limited government” that illustrated well how he draws the conclusions he does regarding what the Constitution meant to the founders. The spectrum of activists calling themselves “constitutionalists”—including Barton but ranging from avowed Reconstructionists to Tea Partiers who claim their movement is solely about taxes and limited government—read the Constitution in the context of the Declaration of Independence to invoke the authority of the Creator in an otherwise godless document. The first of Barton’s three-part series lays out exactly how this works. Many look at the US Constitution and see little mention of religion and wonder how conservative Christians can insist that it is a template for a Christian nation. But Barton is careful to speak, instead, of our “original national founding documents.” For Barton and his followers, the Declaration of Independence, though never ratified and carrying no legal authority, has the same status as the Constitution. Indeed, in their view, the Constitution can only be read in the context of the Declaration:
Go back to our original national document, our original founding document, the Declaration of Independence. In the first forty-six words . . . they tell us the philosophy of government that has produced America’s exceptionalism . . . two points immediately become clear in that opening statement of our first national government document. Number one, they say there is a divine creator, and number two, the divine creator gives guaranteed rights to men . . . there’s a God and God gives specific rights to men.
Barton asserts that the founders believed there were a handful of unalienable rights, the most important of which are life, liberty, and property. He occasionally acknowledges that the language in the Declaration is slightly different (life, liberty, and the pursuit of happiness), but he argues that the pursuit of happiness is grounded in property, making the two terms interchangeable. He more often uses the term “property.” These rights are understood to come directly from God, and the purpose of government (and therefore the Constitution the founders wrote) is limited to securing those rights. According to Barton, in language that became common Tea Party rhetoric, an inalienable right is “a right to which we are entitled by our all-wise and all-beneficent creator; a right that God gave you, not government.” Any other perceived rights, not understood as coming from God, cannot be legitimately protected by the civil government. This is the very point of criticism made of Supreme Court nominee Elena Kagan by Herb Titus, described earlier.

Rooted in the three-part division of authority popularized by Rushdoony and the Reconstructionists, Barton argues that the Bible (which he believes the founders read in the same way he does and upon which he believes they based the Constitution) limits the jurisdiction of civil government. That life, liberty, and property are “among” the God-given rights that Barton finds in the Declaration left room for the articulation of more rights derived from God to be “incorporated” into the Constitution, most clearly in the Bill of Rights, which he calls “the capstone” to the Constitution. “They said, we’re going to name some other inalienable rights just to make sure that government does not get into these rights . . . When you look at the ten amendments that are the Bill of Rights, those are God-granted rights that government is not to intrude into.”

He then offered some unique interpretations of the rights protected in the first ten amendments. The First Amendment religion clauses, for Barton, become “the right of public religious expression.” The Second Amendment right to keep and bear arms is, according to Barton, “what they called the biblical right of self-defense.” The Third Amendment prohibiting the coerced quartering of soldiers is the biblical and constitutional protection of “the sanctity of the home.” Finally, all the protections against unjust prosecution in the Fifth Amendment are reduced to the protection of “the right to private property.”

While the “limited government” enshrined in the Constitution protects basic rights given by God and precludes government from doing anything not within the purview of its biblical mandate, it also, according to Barton, prohibits abortion. Barton says that, according to the founders, the first example of “God-given inalienable rights is the right to life.” Barton claims that when the founders invoked the God-given right to life they intended to prohibit abortion. He claims that “abortion was a discussed item in the founding era.” As evidence he says, “as a matter of fact we have books in our library of original documents—observations on abortion back in 1808,” and that “early legislatures in the 1790s were dealing with legislation on the right to life on the abortion issue.” But Barton gives no examples and provides no references to any evidence. After this slippery claim, he goes on at length with quotes from founders on the right to life, none of which mention abortion.
“They understood back then that abortion was a bad deal and that your first guaranteed inalienable right is a right to life. Consider how many other founding fathers talked about the right to life.” In another example of slipperiness, he quotes founder James Wilson: “Human life, from its commencement to its close, is protected by the common law. In the contemplations of law, life begins when the infant is first able to stir in the womb.” Realizing that this won’t do the work of banning abortion from conception, Barton redefines the question, moving the focus from the development of the fetus to what the mother “knows”:

Very simply, he [Wilson] says as soon as you know you’re pregnant, as soon as you know there’s life in the womb, that life is protected by law. That’s where the technology difference is, we can know that there’s life in the womb much earlier today than what they knew back then. But the point is the same there: as soon as you know there’s a life there, it’s protected.

But this is not what Wilson said, and Barton’s argument gets worse. In his view this understanding of the right to life is a bellwether for a number of other issues that are at the top of the religious right’s agenda: “Our philosophy of American exceptionalism is very simple: there is a God, he gives specific rights, [and] the purpose of government is to protect the rights he’s given.” If someone is “wrong” on “life issues,” they’re likely to be wrong on the right to self-defense (the right to own guns), the sanctity of the home (his interpretation of what it means to not have soldiers in your house), private property (his reading of the rights of the accused culminating in the protection against eminent domain), and “the traditional marriage issue” (for which he makes no connection to the founders or the Constitution). Barton’s interpretation doesn’t even resemble a close reading of the text with an eye toward the founders’ intentions—or any coherent application of the value of limited government—yet he successfully frames it as such in populist discourse.

In 2011, the Ninth Circuit Court rejected an appeal challenging the policy of the California Department of Corrections and Rehabilitation that allows only leaders of “five faiths” (Protestant, Catholic, Muslim, Jewish, and Native American) to serve as paid chaplains (McCollum v. CDCR). The ruling turned not on the legitimacy of the claim that the policy unconstitutionally favors some religions over others but rather on whether McCollum (a Pagan minister) had standing to bring the case. An amicus brief filed by Wallbuilders in support of the CDCR to privilege the “five faiths” provides a glimpse into how Barton reads the Constitution. For him the Constitution represents a consensus—as though there is a singular view that can be attributed to “the founders.” Barton’s style of reading the Constitution is modeled on his style of reading the Bible, which he also treats as a coherent document that can be read from start to finish to yield a clear, undisputed, objective meaning, instead of a collection of fragmented texts written over a very long period of time in different cultures, assembled into larger texts, then chosen from an even larger collection of texts in a political process, translated from ancient languages, and finally interpreted in different ways by different communities.
Every stage of that process continues to be profoundly disputed by scholars, and there is always an interpretative framework (albeit all too often an unrecognized one) underlying any reading of it. While the US Constitution is a newer document, and it is therefore somewhat less difficult to discern its meaning(s), the fact remains that it is the product of hard-fought compromise among leaders, bound in time and culture, who profoundly disagreed with each other. There is no reason to believe they thought they were writing a sacred text to which all subsequent generations of Americans were bound by a process that amounts to divining a singular “intent.”

The argument Barton made in the brief, moreover, illustrates a second important point. He is being disingenuous when he insists he just wants everyone to have the opportunity to practice their religion freely. In his appearance on the Daily Show, he defended the practice of Christian religious observance in otherwise secular contexts when the majority wants it by saying that a Muslim-majority community should be able to make “Sharia law” the law of the land. There was a significant outcry from his anti-Muslim supporters, and he backtracked on the point in a subsequent episode of Wallbuilders Live. In this brief, however, he argued that only those religions that fit with what he thinks the founders meant by “religion” should be protected. Protected religion is either Christianity alone or perhaps the larger category of monotheism—Barton asserts that rights of conscience don’t extend to atheists either (and by implication also not to Buddhists and Hindus): “whether this Court agrees that ‘religion’ meant monotheism or believes that it meant Christianity . . . it is clear that atheism, heathenism, and paganism were not part of the definition of ‘religion.’” Barton has argued against the free exercise rights of Muslims, as have other religious right promoters of Islamophobia, claiming Islam is “not a religion.” Indeed, the term “religion” does have a complicated history, and it has often been used (or denied) to legitimize dominance of one group over another. Initially Africans were said to be “without religion,” legitimizing their enslavement, and, in another example, Native Americans were considered “without religion” to justify taking their land. Barton’s brief is important because it made explicit that which he often tries to deny: that only Christianity (and maybe Judaism) is protected under his reading of the Constitution.

Barton on the Free Market and Socialism

On another segment of Wallbuilders Live, Barton and co-host Rick Green discussed the effort by the Obama administration to prohibit Internet service providers from charging for service based on usage (known as Net Neutrality), arguing that it violates biblical economics and is “socialist.” It’s easy to dismiss that charge as nothing more than demagoguery, but, in fact, the discussion illustrates what they mean by socialism and, ultimately, how they understand freedom. Both points trace directly back to Rushdoony. Most of us understand socialism as a system that limits private ownership of property and in which power (political and economic) is centralized in the state; Tea Party accusations that any policy they oppose is “socialist” seem, at best, like hyperbole.
But in Barton’s view, any move away from what he sees as an unfettered free market, any regulation or involvement on the part of government, is a move toward socialism—and of course he thinks that private ownership and free markets are biblically sanctioned. Net Neutrality prohibits ISPs from charging for Internet service based on usage. This seems straightforward to Barton and Green: “what they mean is we’re not going to let you choose who you need to charge more to.”

Maybe more interesting, though, is the subsequent exchange between Rick Green and his “good friend” Texas congressman Joe Barton, who was sponsoring legislation to overturn the Obama administration’s Net Neutrality regulation. Joe Barton tried to explain Net Neutrality and, in the process, revealed important aspects of how such people understand freedom in entirely economic terms.

Joe Barton says that we cannot regulate the Internet, it should be open and free. Democrats’ definition of Net Neutrality is we want to give FCC the authority to tell people who actually provide the Internet what they can and can’t do with it. Now, what people like yourself and myself mean [by freedom] is no government interference; it’s pretty straightforward. Republicans and conservatives have always tried to keep the Internet totally free.

But of course they have not tried to keep it totally free, except in one very narrow economic sense. They certainly do not mean “free” in a way that includes broadly available access, because that’s socialism; “redistribution of wealth through the Internet . . . this is socialism on the Internet.” Nor do they mean free regarding content, as David Barton made explicit when he returned to the conversation at the end of the show saying, “We’re not suggesting moral license, we don’t want to have obscenity, pornography, child pornography . . . You still have moral laws to follow.” Economic freedom is nearly absolute, but it is still subordinate to moral law.

At the height of the debate over the federal budget and the Tea Party demands that Congress not raise the debt ceiling during the summer of 2011, David Barton and company tackled the question posed by the “religious left” in the budget debate: What would Jesus cut? They devoted an entire episode of Wallbuilders Live to the question: “Why Do People Think Government’s Role Is to Take Care of the Poor?” The episode is promoted with the assertion that “The role of the government is not to exercise mercy, but to exercise justice. It is improper for government to take care of the poor. That is up to us, as individuals.” With guest Michael Youssef, who had recently written on his blog about the application of the Bible to government spending and the poor, David Barton and Rick Green invoked the framework for limited biblical jurisdiction developed and promoted by Rushdoony. They claimed that the Bible has “205 verses about taking care of the poor” and asserted that “only one is directed to government,” which requires no more than that the poor be “treated fairly in court.” Barton and Green employ Rushdoony’s framework of three God-ordained spheres of authority and the view that any action on issues outside those responsibilities is tyrannical and socialist. The responsibility to take care of the poor is limited to families and churches. As we have seen, Rushdoony, Gary North, David Chilton, George Grant, and others have written on this topic. One of the more accessible places to find their view is in George Grant’s volume in Gary North’s Biblical Blueprints Series.
Barton and Green borrow from them to assert that taking care of the poor is not the job of the government. Charity is up to individuals, families, and churches. Moreover, it should not be extended to everyone. The architects of the framework on which Barton bases his view are quite clear: biblical charity may extend to the four corners of the earth, but only to those who are in submission to biblical law as it is articulated by the Reconstructionists.

Barton on Race

David Barton is also the popularizer of a revisionist history of race in America that has become part of the Tea Party narrative. Drawn in part from the writings of Christian Reconstructionists, that narrative recasts modern-day Republicans as the racially inclusive party, and modern-day Democrats as the racists supportive of slavery and postemancipation racist policies. Barton’s website has included a “Black History” section for some time. Like Barton’s larger revisionist effort to develop and perpetuate the narrative that America is a Christian nation, the “Republicans-are-really-the-party-of-racial-equality” narrative is not entirely fictive. Some historical points Barton makes are true; but he and his biggest promoter, Glenn Beck, manipulate those points, remove all historical context, and add patently false historical claims in order to promote their political agenda. Barton appeared regularly on Beck’s show to disseminate his alternative reading of African American history, carrying with him, as he does, what he claims are original documents and artifacts that he flashes around for credibility.

In June of 2010 I traveled to central Florida to attend a Tea Party event sponsored by the Florida chapter of Beck’s “9–12 Project” held at a Baptist church (with a Christian school that was established in the late 1970s). The church sanctuary was decked in patriotic trimmings, including eight big flags on the wall, bunting all over the altar area, and a collection of small flags on the altar itself. As I waited for the event to begin, I overheard people talking about homeschooling and David Barton’s work on “America’s Christian Heritage,” all while Aaron Copland’s “Fanfare for the Common Man” played over the sound system. For those unconvinced of the religious dimensions of the Tea Party movement, the strain of it exhibited here was indistinguishable from the church-based political organizing efforts of the religious right dating back at least to the 1980s. As each local candidate spoke, it was clear how profoundly conservative, Republican, and Christian (in the religious right sense of Christian) this gathering was.

The event was promoted as a response to charges of racism in the Tea Party movement. The banner at the entrance to the event read: “9–12 Project: not racist, not violent, just not silent anymore.” The pastor of the church introduced the meeting, the Tea Party–supported candidates for local office spoke, and all invoked “Christian American history” and the “religion of the founders.” The “9–12 Project” refers both to post-9/11 America (when “divisions didn’t matter”) and to the “nine principles and twelve values” of the group, initiated and promoted by Beck. The “principles” are a distillation of those in The Five Thousand Year Leap, a 1981 book by Cleon Skousen, which was referenced repeatedly by speakers at the event.
The book has long been a favorite for Christian schools and homeschoolers and among Reconstructionists despite the fact that Skousen is a Mormon (perhaps because he is also a strong advocate of the free-market Austrian School of economics). I was surprised to learn that Skousen’s book was enjoying a resurgence in popularity as a result of Beck’s promotion and is available in a new edition with a preface by Beck. The fight over the degree to which America was “founded as a Christian nation” is important in that it is a fight over our mythic understanding of ourselves. That is, it is a fight over the narratives through which Americans construct a sense of what it means to be American and perpetuate that sense through the culture and in successive generations.

The main speaker, chosen to counter the charges of racism made against the Tea Party movement, was an African American, Franz Kebreau, from the National Association for the Advancement of Conservative People of all Colors (NAACPC). The event was in a more rural part of Florida than where I live, and I passed a number of Confederate flags on my way there. I expected an all-white crowd making arguments about “reverse discrimination,” libertarian arguments against violations of state sovereignty (especially with the Civil Rights Act), and maybe even some of the “slavery wasn’t as bad as people say” arguments. What I found surprised me. Kebreau gave a detailed lecture on the history of slavery and racism in America: a profoundly revisionist history. In Kebreau’s narrative, racism is a legacy of slavery, but it was a socially constructed mechanism by which people in power divided, threatened, and manipulated both blacks and whites. Many of the pieces of historical data he marshals in favor of this thesis are not unfamiliar to those who have studied this aspect of American history, but they are probably not as well known among Americans in general: some slave owners were black, not all slaves were black, black Africans played a huge role in the slave trade, and very few Southerners actually owned slaves. While at least most of these points are undeniably true, they were presented with a specific subtext: to lend credence to the view that contemporary critics of racism make too much of America’s history of slavery. In this view, it is Democrats who are primarily responsible for fostering racism to solidify power. Southern Democrats opposed civil rights laws, voting rights, integration, and so on. Northern Democrats fanned racial tensions by promoting social programs that made African Americans dependent on government. Race-baiting demagogues like Jesse Jackson and the Reverend Al Sharpton perpetuate the divisions today.

In August of 2010, Beck held his Restoring Honor Rally, bringing many Tea Party groups—Tea Party Patriots, Freedom Works, 9–12 Project, Special Operations Warrior Foundation, and others—together at the Lincoln Memorial. While Beck initially promoted the event as a nonpolitical effort to return to the values of the founders, he claims he only realized later that he scheduled it on the anniversary of Martin Luther King Jr.’s “I Have a Dream” speech. He suggested that while he did not realize the significance of the date, “God might have had a hand” in the coincidence. Beck was criticized for both his timing and his crediting the Almighty.
Beck fancies himself a contemporary King, “reclaiming the civil rights movement,” and while he was widely mocked for drawing this parallel, it was less recognized that he did it on a foundation laid by David Barton and his revisionist history, which relies in no small part on the work of Rushdoony. In his essay “The Founding Fathers and Slavery,” Barton quotes extensively from the writings of the founders and claims that many of them were abolitionists. He maintains that the overwhelming majority of the founders were “sincere Christians” who thought American slavery was “unbiblical,” blamed England for imposing the institution on the colonies, and set in motion processes to end it. Scholars dispense with these claims. According to Diana Butler Bass, “It was nearly universally accepted by white Christian men that the Bible taught, supported, or promoted slavery and it was rare to find a leading American intellectual, Christian or otherwise, who questioned the practice on the basis that it was ‘unbiblical.’ Some intellectuals thought it was counter to the Enlightenment.” Historian Mark Noll argues that the reverse of Barton’s view with regard to the British is correct: evangelicals in the Church of England, not in America, argued that slavery violated the Bible. Again, according to Bass, “the American biblical argument against slavery did not develop in any substantial way until the 1830s and 1840s. Even then, the anti-slavery argument was considered liberal and not quite in line with either scripture or tradition.” Another essay on Barton’s website, “Democrats and Republicans in Their Own Words: National Party Platforms on Specific Biblical Issues,” compares party platforms from 1840 to 1964—the period before Southern Democrats who blocked civil rights legislation began switching to the Republican Party. In Barton’s narrative, the modern Republican Party is the party more favorable to African Americans because the Republicans led the fight against slavery and for civil rights from the formation of the Republican Party as the “anti-slavery party” and the “election of Abraham Lincoln as the first Republican President,” to the Emancipation Proclamation, the Thirteenth and Fourteenth Amendments, the passage of civil rights laws during Reconstruction, and the election of blacks to office. Barton writes that while the Democratic Party platform was defending slavery, “the original Republican platform in 1856 had only nine planks— six of which were dedicated to ending slavery and securing equal rights for African-Americans.” Democrats, on the other hand, supported slavery, and they then sought to ban blacks from holding public office and to limit their right to vote via poll taxes, literacy tests, grandfather clauses, and general harassment and intimidation, and they established legal segregation under Jim Crow laws. Barton takes issue with the claim that “Southerners” fought for racist policies, because “just one type of Southern whites were the cause of the problem: Southern racist whites.” Rather, he argues (missing the logical inconsistency), we should lay the responsibility for racism at the feet of Democrats: Current writers and texts addressing the post-Civil War period often present an incomplete portrayal of that era . . . To make an accurate portrayal of black history, a distinction must be made between types of whites. 
Therefore, [it would be] much more historically correct— although more “politically incorrect”—were it to read: “Democratic legislatures in the South established whites-only voting in party primaries.” Because he says very little about contemporary Democrats, it’s clear that Barton’s purpose is to connect them with racist Southern Democrats, while completely ignoring the relationship of modern-day Republicans with racism. Most glaringly, the Republican “Southern strategy” is entirely missing from Barton’s account of the parties’ political strategies with regard to race. From the Johnson administration through the Nixon and Reagan campaigns, Republican strategists effectively used race as a “wedge issue.” Southern Democrats would not support efforts by the national party to secure civil rights for African Americans. By focusing on specific racial issues (like segregation), Republicans split off voters who had traditionally voted for Democrats. The contemporary “states’ rights” battle cry at the core of the conservative movement and Tea Party rhetoric is rooted in this very tactic. Barton and Beck want to rewrite American history on race and slavery in order to cleanse the founding fathers of responsibility for slavery and, more importantly, blame it and subsequent racism on Democrats. But Barton’s rewriting of the history of the founding era and the civil rights movement alone doesn’t quite accomplish that. He has to lower the bar even more and make slavery itself seem like it wasn’t quite as bad as we might think. And for that, he turns to Stephen McDowell of the Reconstructionist-oriented Providence Foundation. Wallbuilders’ website promotes a collection of “resources on African American History.” Much of the material is written by Barton himself, but one of the essays is McDowell’s, drawn almost entirely from Rushdoony’s work in the early 1960s. McDowell’s discussion of slavery, written in 2003, comes directly from Rushdoony’s The Institutes of Biblical Law. McDowell attributes his views to Rushdoony and uses precisely the language that Rushdoony used as early as the 1960s. Rushdoony’s writings on slavery are often cited by his critics. Rushdoony did argue that slavery is biblically permitted. While criticizing American slavery as violating a number of biblical requirements, he also defended it in his writings. By promoting McDowell, and by extension Rushdoony, Barton promotes a biblical worldview in which slavery is in some circumstances acceptable. This worldview downplays the dehumanization of slavery by explicitly arguing that God condones it in certain circumstances. McDowell writes that, while it was not part of “God’s plan” from the beginning, “slavery, in one form or another (including spiritual, mental, and physical), is always the fruit of disobedience to God and His law/ word,” meaning that the slave is justifiably being punished for his or her disobedience. McDowell argues that slavery is tightly regulated, though not forbidden, in the Bible, and that American Southern slavery was not “biblical” slavery because it was race-based. Following Rushdoony, he argues that there are two forms of biblically permissible slavery: indentured servitude, in which “servants were well treated and when released, given generous pay,” and slavery, in which, in exchange for being taken care of, one might choose to remain a slave. 
Moreover, he maintains that the Bible permits two forms of involuntary slavery: criminals who could not make restitution for their crimes could be sold into slavery and “pagans, [who] could be made permanent slaves.” Of course, Rushdoony defines “pagans” as simply non-Christians. This means that slavery was/is voluntary only for Christians; non-Christians can be held in nonvoluntary perpetual slavery. Barton shares this understanding of the legal status of “pagans,” at least in terms of their rights under the First Amendment. McDowell is explicit that race-based kidnapping and enforced slavery are unbiblical. In fact, they are punishable by death. All this comes directly from The Institutes of Biblical Law.

McDowell argues, as did Rushdoony in the early 1960s, that while American slavery was not biblical slavery, neither was it the cause of the Civil War. The major point of dispute between North and South, they argue, was not slavery but “centralism”—that is, the increasing centralization of power in the federal government, an argument frequently echoed today by the states’ rights agitators and Tenth Amendment Tea Partiers. Although in one essay Barton parts company with Rushdoony and McDowell over the significance of slavery as a cause of the Civil War (Barton argues instead that slavery was a cause, in service of his argument that the present-day Republican Party is more racially inclusive than the Democrats), he nonetheless continues to promote, on his website, their view that slavery is biblical. The historical revisionism with regard to race in America that gained a hearing in the Tea Party (thanks to Glenn Beck and activists such as Franz Kebreau) is rooted in Barton’s and Wallbuilders’ writings, which have been deeply influenced by Rushdoony.

Excerpted from "Building God's Kingdom: Inside the World of Christian Reconstruction" by Julie Ingersoll. Published by Oxford University Press. Copyright 2015 by Julie Ingersoll. Reprinted with permission of the publisher. All rights reserved.

Published on August 23, 2015 09:00

The 12 most ludicrous ideas about women’s health from the GOP field

AlterNet

There are 17 Republican candidates for president that get the New York Times stamp of legitimacy. In a field like that, standing out is hard. The easiest way to catch media attention---and attract voters in the notoriously conservative Republican primary voting base---is to get competitively nutters. Which most of the candidates are doing, hard, when it comes to bashing reproductive health care. It’s impossible to really hand out lifetime achievement awards when it comes to the ugliest slams against reproductive health care. But here are the worst things they’ve said recently.

1) Mike Huckabee. Huckabee is a notorious spewer of sexist garbage, but his latest---defending the Paraguay government forcing a 10-year-old rape victim to have her rapist’s baby---is low even for him. “When an abortion happens, there are two victims,” he argued. “One is the child, the other is that birth mother, who often will go through extraordinary guilt years later when she begins to think through what happened, with the baby, with her.” Yes, he tried to argue he wants a 10-year-old to endure childbirth for her own good, lest she feel “guilt” over reneging on her Huckabee-prescribed duty to have babies for rapists. Not very convincing, that.

2) Scott Walker. During the Fox News GOP debate, Walker affirmed his support for forcing pregnant women to give birth, even if their doctors tell them it will kill them. He doubled down later in an interview with Sean Hannity, saying, “I’ve said for years, medically there’s always a better choice than choosing between the life of an unborn baby and the life of the mother.” It is true that you don’t have to choose, since Walker’s preference, doing nothing, tends to kill both a woman and her fetus. How that’s “pro-life”, however, remains a mystery.

3) Ben Carson. “It brings up a very important issue and that is do those black lives matter,” he told Fox News host Eric Bolling recently when discussing Planned Parenthood. “The number one cause of death for black people is abortion.” Undermining the Black Lives Matter movement while implying that black women are somehow race traitors because they control their own bodies? It’s a two-fer---maybe a three-fer---of the kind of viciousness that motivates the modern American right.

4) Rick Santorum. “It is not any more than the Dred Scott decision was settled law to Abraham Lincoln,” Santorum said, during the Republican debate, about a recent court decision legalizing same-sex marriage. “This a rogue Supreme Court decision.” “We passed a bill and we said, ‘Supreme Court, you’re wrong!’” he continued, citing a 2003 law he wrote that undermined Roe v Wade. Dred Scott v Sandford was a notorious 1857 case where the Supreme Court ruled that black people cannot be U.S. citizens. That’s right. Santorum was suggesting that denying black people their basic humanity is somehow the equivalent of letting women control their bodies or letting gay people marry for love.

5) Bobby Jindal. “Today's video of a Planned Parenthood official discussing the systematic harvesting and trafficking of human body parts is shocking and gruesome,” Jindal said in announcing an investigation of Planned Parenthood inspired by videos that have been repeatedly shown to be anti-choice hoaxes. Investigations into Planned Parenthood have found, no surprise, that there is no “trafficking of human body parts” going on.
Jindal has yet to weigh in on what other surgeries should be banned because they are “gruesome”, a word that can be used to characterize all of them.

6) Marco Rubio. Rubio’s argument on CNN for why women should not be allowed to remove unwanted embryos from their uteruses: “It cannot turn into an animal. It can’t turn into a donkey.” “Well, if they can’t say it will be human life, what does it become, then?” he added. “Could it become a cat?” All surgery, as well as tooth removal and hair brushing, removes living human cells, aka human life. It’s not donkey. It’s not cat. Human. We look forward to Rubio’s upcoming ban on dentistry on the grounds that human life is not cat life.

7) Carly Fiorina. Fiorina considered denying her daughter the HPV vaccine, even though nearly all sexually active people will get HPV at some point in their lives. "And she got bullied. She got bullied by a school nurse saying: 'Do you know what your daughter is doing?'" Fiorina complained at a campaign event. Sorry, Fiorina, but assuming that your kid will likely grow up and have sex one day is not bullying. Signaling to your kid that you expect her to be a lifelong virgin or risk cervical cancer? Now that’s what I’d call bullying.

8) Jeb Bush. Bush got a lot of negative attention for a campaign event where he said, “I'm not sure we need a half a billion dollars for women's health issues.” His attempt to “clarify” this, however, showed that he really does mean it. He proposes taking the money away from family planning clinics like Planned Parenthood and redirecting it to general service community health centers. Which is to say, to take away money earmarked for women’s health, forcing women to give up their gynecologists and go to general clinics instead, where they can expect longer wait times, less direct access to contraception and less access to specialized services.

9) Ted Cruz. When the hoax Planned Parenthood videos came out, Cruz floated a conspiracy theory accusing the media of censorship. “The mainstream media wants to do everything they can to hide these videos from the American people,” he argued. “And the reason is virtually every reporter, virtually every editor, virtually every person who makes decisions in the mainstream media is passionately pro-abortion.” In the real world, every major newspaper, cable news network, and many nightly news shows covered the videos. They also debunked the lies in the videos, though telling the truth is probably not what Cruz was hoping the “mainstream media” would do with these deceitful videos.

10) Donald Trump. Trump says a lot of foul things about women generally and reproductive health care generally, including calling Planned Parenthood an “abortion factory”. But he’s probably the candidate in the race who hates reproductive health care access the least, which is a sad statement about the state of the modern GOP.

11) Rand Paul. Paul has been pushing the idea of banning Medicaid patients from Planned Parenthood and redirecting them to already overcrowded general service clinics instead. “We’ve doubled the amount of money we put into women’s health care through government, and so it’s just an absurd argument to say we need Planned Parenthood,” he argued on Fox News last week. “It’s only about abortion.” In reality, 97 percent of Planned Parenthood’s services are not abortion and 0 percent of federal money goes to Planned Parenthood’s abortion services. Nor can women just go to a community health center.
When Texas defunded Planned Parenthood, there were over 63,000 fewer claims for birth control services. Community health centers try to pick up the slack, but it’s more than they can handle.

12) Chris Christie. Christie’s attempts to ingratiate himself with the religious right brought him to start defunding Planned Parenthood in New Jersey years ago. But his enthusiasm for preventing women from using contraception stops at his bedroom door. “I’m a Catholic, but I’ve used birth control, and not just the rhythm method,” Christie recently told a New Hampshire crowd. Birth control for me but not for thee? It’s probably what all these candidates, none of whom have Duggar-size families, actually practice. But Christie doesn’t get bonus points for honesty. After all, he didn’t admit that this was hypocrisy and continues to bash Planned Parenthood every chance he gets.

There are five other white guys in the race, all eager to dump on affordable contraception services and legal abortion. But, as of now, few have shown the vim to really stand out from the crowd in their tedious denunciations of reproductive health care technologies that, in the real world, are a normal part of everyday life. But give them time. It’s a long campaign season and the anti-woman competition is only just getting started.

Published on August 23, 2015 08:00

David Simon: Hyper-segregation is our national dynamic

ProPublica

David Simon’s new HBO miniseries “Show Me a Hero,” which premiered last Sunday, is the harrowing tale of a hopeless battle. Based on a nonfiction book of the same title – written by former New York Times reporter Lisa Belkin – the show dramatizes the real fight that took place 25 years ago in Yonkers, New York, after a federal judge ordered public housing projects to be built in the wealthier (and whiter) parts of the city. In an interview with ProPublica, David Simon discussed the legacy of the Yonkers crisis and what desegregation is all about. The transcript has been edited for clarity and length.

You said that you thought about this show many years ago. How has the project changed over time?

Very little, sadly. We optioned this book shortly after it came out [in 1999], and we were fairly certain that the dynamic of hyper-segregation was a national dynamic, that we were not just writing a story about Yonkers.

What do you mean by hyper-segregation?

White people, by and large, are not very good at sharing physical space or power or many other kinds of social dynamics with significant numbers of people of color. It’s been documented time and time again. There is a great book by Andrew Hacker called “Two Nations.” My God, it’s almost a quarter century old, but it is an incredible primer on just how specific the desire of white America is to remain in a hyper-majority.

The reason we wanted [Lisa Belkin’s] book was that Yonkers was a place where the housing department actually got the housing right. They didn’t overwhelm the neighborhood with a massive project or hundreds of walk-up units. They were trying to do scattered-site housing for the first time, which has been this quiet revolution in public housing. It works, it doesn’t destabilize neighborhoods. But you were dealing with people who were entrenched behind the same fears as previous generations….

This project kept getting bumped a little bit to the back burner but every time we bumped it, in talking about it with the HBO executives, we’d say, “You know what, look, it just happened to Baltimore.” They tried to do the same thing with scattered housing in eastern Baltimore County, and the white folks went batshit, batshit crazy. At every point, there was a new fresh example that the dynamic was still there, that the racial pathology was still intact. And I think it has only become more pronounced.

The show was greenlit before Ferguson, before Baltimore, before Charleston. If you had written the screenplay after these events, would you have changed anything?

No, no. First of all, “Show Me a Hero” is not about police violence. It’s certainly not about a white racist backlash against changing demographics, which is how I would characterize the Charleston or Lafayette shootings. Part of the implied power of the piece is we are taking you back 25 years and nothing has changed!

Lisa Belkin wrote an op-ed in The New York Times a few days ago saying she viewed Yonkers, at the time when she was doing her reporting, as a place of hope. She expected desegregation to happen around the United States as a result. That didn’t happen. The NAACP didn’t pursue the same cases anywhere else. Nor did the Justice Department because of horrible resources. Why do you think that happened?

Because of how blistering Yonkers was, how insanely volatile and irrational Yonkers was. You have to remember that this case was brought at the end of the Carter administration.
There wasn’t a single civil rights action filed by the Justice Department from 1980 to 1988 that mattered. Reagan effectively shut down the civil rights division of the DOJ. Then you had Clinton, who was doing everything he could during the Gingrich years to maneuver to the center. The reason you didn’t have aggressive use of this legal precedent under Clinton is the same reason you have those omnibus crime bills that filled up prisons as fast as we can construct them. Bill Clinton’s triangulation with the political center made things like fair housing prohibitive for his political priorities. We haven’t seen any movement on this in any presidential administration until the last two years of Obama. They sort of opened the books on all their data to basically encourage the use of the Fair Housing Act to do precisely what they did in Yonkers. But notice that this is coming in the last two years of the administration, and it’s coming as an administrative act.

In the show, no one really wants the housing either. The NAACP is already tired of the whole ordeal before any units were built.

You have to remember, they filed the case in [‘80]. It was litigated. They are now in 1987, and they can’t get the goddamn city to name a geographic site to build house No. 1... But the truth is, the 200 units were built. They are still there, and there has been no increase in crime in those neighborhoods as a result of it. There has been no substantive decrease in the housing values in Yonkers. There was a brief dip as there was some fundamental white flight, mostly surrounding the school desegregation portion of the civil rights suit, which is a whole other can of worms. The population in Yonkers is now probably about 56 percent white, 44 percent people of color, heavily Latino.

A lot of people, with a certain amount of unknowing racial malevolence, say “oh, look, it was 80–20 white, now it’s almost 50 percent people of color. See what happens? Look at all that white flight.” But in 25 years, the population of the New York metropolitan area has been transformed. What desegregation is about is not about keeping Yonkers 80–20 or 75–25. This is about the browning of America. We are becoming a less white country. The trick is, can we become more brown without destabilizing ourselves and without having gated white communities and ghettos?

Did you hear from current residents as you were filming?

We talked to all the people in the book, some of whom are still living in those townhouses. I mean, did I go take a poll of random people in Yonkers? No. But every time we set up and started filming, people would come over, and we talked to people who were unrelenting in their belief that what was done was illegal and that the judge had no right to do this and that it was an affront to their freedom and their liberty. They would come and tell us that, and we’d say “well, okay…”

Did anyone object to you filming at, for example, the actual city hall of Yonkers? Or that this could reopen wounds?

No, the mayor appeared with us in Yonkers. I’m sure there are people who didn’t want to see the story made at all. But, you know, I’m not used to making shows that everyone agrees with, so I wouldn’t know what that would feel like anyway.

With two episodes down, is there anything that viewers should remember, have in mind, when they watch the next two on Sunday?

I certainly don’t want to tell people what to watch until they watch it. Just that we were very true to the history.
This is all predicated upon a 40-year history of American government at the federal, state and local level using public money to purposefully hyper-segregate our society. Poor people didn’t end up all packed into housing projects in one square mile of Yonkers by accident. It was a plan. It was a plan in Chicago, in Baltimore, in Dallas and everywhere that took federal housing money since the 1930s. The records, the history of it is in plain sight. I have nothing but contempt for anybody who says that [the racial integration of Yonkers] was social engineering by this judge. Really? You want to parse it that way? What bullshit. The social engineering begins in the 1930s, with FHA mortgages and with the first public housing monies in the New Deal. Republicans and Democrats are both complicit. The idea that the social engineering starts at the moment that somebody might want to restore somebody to their full civil rights, 40 years into the rigged game. And that’s when you object? Sorry, that’s racist to begin your argument there.

Published on August 23, 2015 07:30

“It’s not just a party, it’s our life”: Jazz musicians led the way back to the city after Katrina — but what is this “new” New Orleans?

The spot where St. Ann Street dead-ends into North Rampart in New Orleans, the dividing line between the French Quarter and the Tremé neighborhood, was quiet on a Friday night in November 2005. Once alight with bulbs that spelled out “Armstrong,” the large steel archway that frames the intersection was dark, its white paint overtaken by rust. Beneath it, a thick, carelessly wound chain bound two iron gates, from which dangled a steel padlock. The whole assembly looked as if it was meant to secure some oversized bicycle rather than the entrance to a 32-acre city park modeled after Copenhagen’s Tivoli Gardens and named for trumpeter Louis Armstrong.

Armstrong Park was closed, had been since the flood that resulted from the levee failures following Hurricane Katrina. You could see Elizabeth Catlett’s bronze statue of Louis, trumpet in left hand, handkerchief in right, but only from a distance through bars. Armstrong in prison, it looked like. Or maybe this likeness of the trumpeter, who left New Orleans early in his career for fame and for good, was on the outside. Maybe all of us in New Orleans in late 2005, the locals who made their way back and those like me, who had shown up from afar, were locked away from something essential — a culture that has long defined this city and its inhabitants, and long helped its visitors find their true selves.

Aug. 29 will mark a decade since the 2005 disaster that we’ve come to know by the name Katrina — for the hurricane, a natural disaster — but that is more accurately understood as unnameable and unnatural, a failure of engineering and due diligence followed by a long wake of indifference or worse. The media coverage surrounding this 10th anniversary will likely constitute its own deluge, dominated by maudlin memories of catastrophe and self-righteous hype about recovery. The conflation of past misery and present cheerleading became clear in New Orleans Mayor Mitch Landrieu's June appearance on MSNBC’s “Morning Joe,” to promote the city’s campaign, “Katrina 10: Resilient New Orleans”: On-screen, stock 2005 footage showed fetid floodwaters and rescues in progress. Cut to Landrieu, in an interview, saying, “We’re doing great. We’re doing much better ... It’s a redemption, an incredible comeback story.”

Within all the hoopla to come, expect trumpets and trombones and tubas and second-line parades in progress. The storied jazz culture of New Orleans will again provide prominent B-roll for TV. That culture belongs in the foreground. That’s the story I’ve been tracking. Now, a decade in, I want to know if those who most need and want New Orleans jazz culture now find themselves, amid all the rebuilding, estranged from it or feeling as if it may yet slip away.

In October 2005, I wrote an essay for Salon in which I wondered whether musicians and other culture bearers of New Orleans would return to their devastated city at all. I worried over the prospect of a Disney-fied Crescent City, or whether the whole place would be turned into a museum piece. I asked if the culture born in New Orleans — which Ken Burns' PBS series “Jazz” famously cast as a signal of American values and virtues on the order of the Constitution — “still carried currency when it comes to the issues Katrina raised: identity, race, poverty and basic decency.”

In the long wake of the flood, the ranks of jazz musicians, the brass-band-led Sunday second-line paraders and the feathered-and-beaded Mardi Gras Indians — the key players of indigenous New Orleans culture — did not simply return.
They came back sooner and in greater numbers than the rest of the population. They led the way, and have maintained a vital sense of continuity. Louisiana State University sociologist Frederick Weil, who surveyed 6,000 New Orleans residents, told me, “By the standards of civic-engagement literature, the members of the Social Aid & Pleasure Clubs who sponsor weekly parades are ‘model citizens,’ scoring highest of any group. They are community leaders, supporting each other in times of need and providing concrete services.”

David Simon’s HBO series “Treme,” a fictional depiction of post-Katrina New Orleans, captured this truth. In its premiere episode, before a word of dialogue was uttered, a saxophonist licked and then adjusted his reed. Slide oil was applied to a trombone. Two black kids danced to a faint parade rhythm. An unseen trumpet sounded an upward figure, followed by a tuba’s downward groove. The scene re-created the first second-line parade following Katrina — a memorial for a local chef, Austin Leslie. Simon told me then, in 2010, “Apart from culture, on some empirical level, it does not matter if all New Orleans washes into the Gulf, and if everyone from New Orleans ended up living in Houston or Baton Rouge or Atlanta. Culture is what brought this city back … New Orleans is coming back, and it’s sort of done it one second-line at a time.”

"Initially, New Orleans jazz was a reflection of a way of life," clarinetist and professor Michael White told me back in 2006 at his office at Xavier University, while peering over a jagged pile that included the red notebook in which, during the weeks following Katrina, he jotted down the names and whereabouts of friends and colleagues. "It spoke of the way people walk, talk, eat, sleep, dance, drive, think, make jokes and dress. But I don't think America ever truly understood New Orleans culture, because the mind-set is so different here. So that whole tradition was hidden from most of America.”

When I first got to New Orleans after the flood I was stunned first by just how much had been destroyed, and then later by just how little I knew. I’d been writing about jazz for 20 years. Yet I was profoundly ignorant about what it means to have a living music, one that flows from and embeds everyday life — a functional jazz culture of the sort that once existed in cities throughout the United States but now is exclusive to New Orleans. Before I spent months at a time in the city, before I spent countless hours with the people who make and support New Orleans jazz culture, I knew about but had not yet meaningfully felt the link to something fundamentally African, transplanted via the enslaved who passed through much of this hemisphere, who drummed and danced in Congo Square, a stone’s throw from where that Armstrong statue now stands. And I had no clue what a tenuous proposition this culture represents: What it was, is and maybe always will be up against.

“There’s a feeling among many of us,” White told me in 2006, “that some of our older cultural institutions, like parades and jazz funerals, are in the way of progress and don’t fit in the new vision of New Orleans, that they should only be used in a limited way to boost the image of New Orleans, as opposed to being real, viable aspects of our lives.” Was New Orleans jazz culture welcomed back? Not exactly. If there’s a culture war going on in the city, that’s hardly news.
According to historian Freddi Williams Evans’ book “Congo Square: African Roots in New Orleans,” Congo Square was codified by an 1817 city ordinance that restricted drumming and African dances to a single spot. Skip to 1996, when a photograph of a protest march that ran in the Times-Picayune newspaper showed a teenage snare drummer wearing a sign: “I Was Arrested for Playing Music.” The past decade lends a new chapter to such conflicts.

In the years since the 2005 flood, tensions surrounding culture have flared. In 2007, a consortium of Social Aid and Pleasure Clubs defeated jacked-up city permit fees for their weekly brass-band-led second-line parades in federal court, on First Amendment grounds. ("Should the law not be enjoined," the complaint, filed on behalf of the consortium with the aid of the American Civil Liberties Union, stated, "there is very little doubt that plaintiff's cultural tradition will cease to exist.") Later that year, police busted up a memorial procession for a beloved tuba player in dramatic fashion, reigniting a long-standing fight over who owns the streets. This narrative unfolded despite the city’s pervasive use of these traditions to rekindle a tourism business that, in 2014, hosted 9.5 million visitors who spent $6.8 billion (a record for visitor spending). "We rose out of water and debris to lead the way back to the life that we love," said Bennie Pete, sousaphonist and leader of the Hot 8 Brass Band, a local favorite, at a public forum on such matters in 2008. "It's not just a party, it's our life. We can sugarcoat it all kinds of ways, but the city looks at us as uncivilized. And that's why they try to confine us."

During the past few years, as a yet undefined “new” New Orleans rubs up against whatever is left of the old one, brass bands have been shut down on their customary street corners. Music clubs have increasingly been hit with lawsuits and visited by the police responding to phoned-in complaints. A revival of rarely enforced ordinances regarding noise and zoning has met a fresh groundswell of activism. All this has happened in the context of swift gentrification of neighborhoods such as Tremé, long a hothouse for indigenous culture. The calls and responses of a storied musical tradition have often of late been drowned out by back-and-forth arguments over these matters. At issue recently have been noise ordinances, the implications of a new citywide Comprehensive Zoning Ordinance (especially as to where and when live music is allowed), and one particularly contentious item, Section 66-205, which states: “It shall be unlawful for any person to play musical instruments on public rights-of-way between the hours of 8:00 p.m. and 9:00 a.m.” (Never mind that tourists arrive with the precise expectation of hearing music played on the streets at night. Or that a city attorney had already declared that curfew unconstitutional.)

The battles during the past decade over what would get rebuilt and what wouldn’t, who could return and who couldn’t, have in large part now given way to debates over the shape and character of the “new” city. Those who remember the green dots on maps issued in January 2006 by then-Mayor C. Ray Nagin’s Bring New Orleans Back Commission — targeting certain hard-hit areas of New Orleans as future park space — know that the city’s future and character have a lot to do with how its spaces are zoned and used.
Amid the panic and fury of residents whose neighborhoods had been overlaid with those green dots, who had expected to return and rebuild, that 2006 map quickly met its demise. Yet many of its ominous implications have played out anyway through obstacles to rebuilding and land grabs. The wave of gentrification that has intensified in New Orleans during the past few years — especially as carried by what one writer referred to as “yurps” (young urban rebuilding professionals) — has been stunningly swift and dramatic. New Orleans has long held bohemian attraction: Now that allure is coupled with start-up cash. In any city, gentrification raises questions: What happens when those who build upon cultural cachet don’t want that culture next door?

The levees that failed represented an isolated confluence of chance, faulty design and neglect, and yet pointed to dangers lapping at all our shores; this story of an embattled culture is unexceptional in the sense that it suggests similar conflicts in other cities and common threats to our collective cultural identity. Yet in New Orleans, such concerns are underscored by a legitimately exceptional truth — a functional jazz culture that is, for many, elemental to daily life and social cohesion, and that the city’s Convention & Visitors Bureau website correctly claims “bubbles up from the streets.”

During a press conference at last year’s Jazz & Heritage Festival, Mayor Landrieu told me, “There is a way to organize culture without killing it.” Those words are either comforting or alarming, depending upon whom you ask. As David Freedman, the general manager of listener-supported WWOZ-FM (self-proclaimed “Guardians of the Groove”), told me, “An unintended consequence may be the death of spontaneous culture in New Orleans. Some may think this is good for tourism and development, but it is not good for the distinct musical traditions at the core of our identity.” As Alex Rawls, a veteran local music critic, said, “The Disneyfication of New Orleans that people talked about after Katrina was supposed to be quick and dramatic. The danger is not like that. If you take your hands off the wheel and let business interests rule, that sort of thing happens more gradually, almost without people noticing.”

“In New Orleans, the music community has arguably been in a cultural crisis for two or three generations,” explained clarinetist Evan Christopher, who moved to New Orleans more than 20 years ago. “We have staved off cultural annihilation by embracing fictions in harmony with the tourism machine and smiled upon by the ‘New Right’ and their fetish for nostalgia. Post-Katrina, our community's leadership was nowhere to be seen and before half of our city had returned, 80 percent of us came back with hat in hand. The utterance of ‘jazz,’ which should have represented a true strategy of transformation or an answer to revitalization, quickly became an empty slogan hung from street lamps.”

Beyond cultural policy is the stark reality facing the largely black communities that have long nurtured and still support the city’s indigenous culture. According to The Data Center, an independent research group based in New Orleans, there are 99,650 fewer black residents living in New Orleans now than in 2000. (The population is now 59 percent black, down from 66.7 percent in 2000.)
The Urban League of Greater New Orleans released data points from a forthcoming “State of Black New Orleans” report revealing that the number of black children younger than 18 living in poverty in the city grew by 6.5 percent from 2005 to 2013. (In 2013, more than 54,000 black children younger than 18 — 50.5 percent — were considered to be poor.) The Urban League’s statistics show widening inequity as well — an 18 percent increase in the gap between the median income of black households and white households in the city. Overall, more than 35 percent of black families in New Orleans now live below the poverty line.

“I think no one will disagree that there has been tremendous progress in New Orleans in many ways,” said Erika McConduit-Diggs, Greater New Orleans Urban League president. “You can tell that from bricks and mortar. But it’s more complex when you peel back layers and look at how African-American communities are faring. What was troubling for many residents before the storm is actually now worsened. We have relocated concentrated poverty.”

Lolis Eric Elie, a former New Orleans Times-Picayune columnist and co-producer of the documentary "Faubourg Tremé: The Untold Story of Black New Orleans," said, “My concern after the flood was that one catastrophic event might mean that this tradition that has endured would die out in one fell swoop. Now, my concern is that economic conditions make it increasingly hard for people to do these things because they require time, effort and money.”

The headline to a recent article in the New Orleans Advocate declared, “Katrina Scattered New Orleans’ Entrenched Social Networks Far and Wide.” Even those black residents who have returned to New Orleans now increasingly live in nearby suburban parishes, reporter Katy Reckdahl explained. “All of these departures have slashed at the city’s network of extended families,” she wrote, “the generations of children who stayed in the same neighborhoods, blocks or even houses one decade after the next.”

For Tamara Jackson, president of the Social Aid & Pleasure Club Task Force, a consortium of the organizations that sponsor second-line parades, “Our culture is the one thing that keeps us bonded and united. The second lines bring us together, in our old neighborhoods, for four hours at a time.” But she wondered aloud: “Is the 'new' New Orleans for native New Orleanians, or is it for tourists?”

Jordan Hirsch, who was the founding director of the nonprofit Sweet Home New Orleans, a grass-roots organization that supported indigenous culture in Katrina’s wake, told me, “In some ways, this geographic dispersal has compelled people to double down on these indigenous traditions. If you can’t walk out your door and participate the way you used to, well, you’ll work even harder to make it happen. Still, I don’t think we know yet how this will really work. The issues are sustainability and transmission of tradition, which used to come more naturally.” Complicating that picture is a near-total conversion of the New Orleans schools to a charter system, which buses students all over the city and may or may not continue a long tradition of school-band instruction.

New Orleans jazz culture was born in opposition to challenges, a subversion of racism and classism. In the Tremé neighborhood in 2007, a few nights after the police had shut down that memorial, the two musicians who had been arrested led another procession.
Glen David Andrews put down his trombone and sang "I'll Fly Away," as drummer Derrick Tabb snapped out beats on his snare. A tight circle surrounded the musicians, as a middle-aged black woman turned to the man next to her. "They say they want to stop this?" she asked softly. "They will never stop this."

Yet there’s a creeping and newfound sense of alienation. “We dragged this city back,” said the Hot 8’s Bennie Pete, “and now we’re being shown the door.” Not that new doors haven’t opened. “One of the things that changed for the good in New Orleans,” said Lolis Elie, “is an increased conversation about the culture.” In some ways, the issues surrounding New Orleans culture are more clearly defined right now, more out in the open; musicians and other cultural leaders may be a step closer to the bargaining table when it comes to city policies. That’s thanks in large part to the emergence of some grass-roots organizations. The Music and Culture Coalition of New Orleans (MACCNO), which began with lunchtime meetings in 2012 at a club owned by trumpeter Kermit Ruffins, seeks to “empower the New Orleans music and cultural community through collective self-representation advocating in the interests of cultural preservation, perpetuation and positive economic impact,” according to its website. It has served as a crucial source of information and advocacy. A board member of the newly formed nonprofit Crescent City Cultural Continuity Conservancy (C5) told me, “So much is changing so fast in New Orleans the cultural community is increasingly aware of the need to be visible advocates for aspects of the city's identity that may well be drowned in the sea change that defines New Orleans, circa 2015.”

I’ll be honest. I’m ambivalent at best about this whole anniversary thing. I recall, during the first anniversary of the flood, how one Lower Ninth Ward family stood by and watched as a TV anchorwoman stood, microphone in hand, in front of their devastated home: “The producer said he doesn’t want us in the picture,” the father told me, holding his baby in his arms. I’ll never forget a moment during second-anniversary events, in 2007. At a “World Cultural Economic Forum” hosted by Mitch Landrieu (then Louisiana’s lieutenant governor), Denis G. Antoine, ambassador to the U.S. from Grenada, said, "If we're talking about rebuilding New Orleans, we have to ask: Which New Orleans are we talking about? We have to talk about social values and an ancestral past. There is an anthropological aspect to the nurturing of a new New Orleans and this will help direct what is appropriate and what is not.” (Well said, I thought.) He went on: “New Orleans is a perception. When we talk about safety: How safe do you feel? It's not just about crime, it's about how safe do you feel to be you?"

When I returned to New Orleans to mark the fifth anniversary, the word “resilience” popped up nearly everywhere—in city-sponsored press conferences, and on signs tacked to lampposts that read: “Stop calling me RESILIENT. Because every time you say, ‘Oh, they’re so resilient,’ that means you can do something else to me.” A stone's throw from a just-restored Mahalia Jackson Theater for the Performing Arts was that statue of Louis Armstrong, bound by ropes and secured by sandbags amid torn-up concrete and weeds, its base rusted and damaged—the unfortunate consequence of a renovation project initiated by then-Mayor C. Ray Nagin that had gone sour.
Both statue and plaza have since been repaired, but in 2010 it seemed an apt image: In a city that has known devastation and government incompetence, can a celebrated homegrown culture once again find firm footing? I suppose I’m still wondering.

Larry Blumenfeld writes regularly about jazz and culture for the Wall Street Journal, and at blogs.artinfo.com/blunotes. His Salon piece, “Band on the Run in New Orleans,” was included in “Best Music Writing, 2008.” He will moderate a discussion, “Ten Years After: The State of New Orleans Music and Culture,” on Aug. 24 at Basin St. Station, in New Orleans. (The panel will be live-streamed by WWOZ-FM.)


“Something happened to me in Selma”: 50 years ago, a young white seminary student risked everything for the call of civil rights

The ceremonies that marked the 50th anniversary of the signing of the historic Voting Rights Act last Aug. 6 have not yet ended. Last weekend people from as far away as Alaska and the Virgin Islands returned to Selma, Alabama, to remember yet another tragedy that occurred half a century ago when, on Aug. 20, 1965, a 26-year-old Episcopal seminary student named Jonathan Daniels lost his life while fighting for civil rights.

Jon Daniels, of Keene, New Hampshire, was one of the thousands who answered Dr. Martin Luther King's call to join his campaign for voting rights in Selma following the assault on the Edmund Pettus Bridge on "Bloody Sunday," March 7, 1965. He lived with a black family, tutored young students and demonstrated for equal rights. Nine days later he returned to his studies in Cambridge, Massachusetts, but he was determined to make civil rights a part of his life. "Something happened to me in Selma ...," he later noted. "I could not stand by in benevolent dispassion any longer without compromising everything I know and love and value. The imperative was too clear, the stakes too high, my own identity was called too nakedly into question ... I had been blinded by what I saw here (and elsewhere), and the road to Damascus led, for me, back here."

He asked officials at the Episcopal Theological School to permit him to finish the semester in Selma. He would submit his academic assignments by mail while, at the same time, assisting the African-American community, who suffered so greatly from that tortured city's racial problems. The school allowed him to go to Selma as a representative of ESCRU, the Episcopal Society for Cultural and Racial Unity. He wore the badge with the letters "ESCRU" proudly on his chest.

That badge brought him trouble when he returned to Selma. While Daniels waited in line at the local post office, a man looked him over, his eye caught by the seminarian's collar and the ESCRU button. "Know what he is?" the man asked a friend. "Why, he's a white niggah." At first, Daniels was startled to see others turn to study him, obviously thinking he was one of the pariahs the segregationists called "outside agitators." But then Daniels felt not fear but pride, wishing he could announce, "I am indeed a 'white nigger.'" Later he reflected, "I wouldn't swap the blessings He has given me. But black would be a very wonderful, a very beautiful color to be."

On Palm Sunday, he and his colleagues led a group of African-Americans to Selma's St. Paul's Episcopal Church where, after some resistance, they were allowed to attend morning services -- the first church to be desegregated in Selma. But they were forced to sit in the church's last row and let white parishioners be the first to receive communion. Later Daniels was accosted in the street by a well-dressed man, perhaps a lawyer or a banker, who asked him: "Are you the scum that's been going to the Episcopal church?" Then he answered his own question: "S-C-U-M. That's what you are -- you and that nigger trash you bring with you." Daniels said he was "sorry" that the man was upset by having to share his church with the blacks who shared his faith but complimented him on how well he could spell.

In August, he joined activists from King's Southern Christian Leadership Conference (SCLC) and the Student Nonviolent Coordinating Committee (SNCC) who were working in "Bloody Lowndes County," so called because of the brutal treatment blacks received there since the end of Reconstruction.
Of the almost 6,000 blacks who were eligible to vote, none was registered prior to the advent of the civil rights movement in Alabama and, despite the efforts of activists who began working there early in 1965, not a single black participated in the Voting Rights March. Stokely Carmichael, SNCC's most charismatic organizer, who was building a civil rights movement in the county, aptly called Lowndes “the epitome of the tight, insulated police state.”

Carmichael's target on Aug. 14 was Fort Deposit, a Klan bastion and a town that treated its black population with special cruelty. Here the activists would lead local youth in a protest, the first in the town's history. FBI agents on the scene urged them to cancel the demonstration: a mob of white men armed with guns, clubs and bottles was gathering, and the Bureau refused to protect the protesters. They went forward anyway, picketing a cafe, a dry goods store and a grocery. The police soon appeared and arrested the activists, including Stokely Carmichael, Jon Daniels, four SNCC workers, a Catholic priest from Chicago named Richard Morrisroe and 17 local youths. They were charged with "resisting arrest and picketing to cause blood." The mob turned its fury on reporters observing events from the relative safety of two cars. The reporters fled after the mob attacked their cars with baseball bats and tried to pull them into the street.

Daniels and the others were taken to the newly built county jail in Hayneville, a sleepy little town of 400, whose town square was built around a 10-foot-tall monument dedicated to county men killed during the Civil War. Two months earlier, its courthouse was the site of the trial of Collie Leroy Wilkins, one of the Klansmen who had killed civil rights activist Viola Liuzzo, the Detroit homemaker and mother of five, who was shot while driving with a black colleague following the conclusion of the historic voting rights march from Selma to Montgomery on March 25. Townspeople resented the presence of outside reporters -- more than 50 Americans, Englishmen and Swedes -- who had invaded the town. The courtroom was filled with Klansmen who brought their wives and children to observe Imperial Klonsul Matthew Hobson Murphy defend one of their own. After a tumultuous trial, the jury was unable to reach a unanimous verdict so the judge declared a mistrial. Klansmen whooped, stamped their feet and clapped. "I'm glad that's over," a woman told Life magazine's John Frook. "Y'all can go back North now and let us have some peace and quiet."

Virginia Foster Durr, a writer and friend of Rosa Parks who had observed the trial, did not share the woman's optimism. She sensed "there was killing in the air." Most "of the people frighten me," she later wrote friends. "They are so insane and prejudiced." She feared that "killing would strike again, for the white people of Hayneville had condoned the killing, whatever they might say."

The Wilkins trial and the coming of the civil rights movement to Lowndes County also enraged a 54-year-old Hayneville native named Tom Coleman. The old Lowndes County courthouse was his second home; his grandfather had been county sheriff early in the 20th century and his father, superintendent of the county school system, once had an office there. So did his sister, who currently held that post, as well as the current circuit court clerk, who was married to his cousin.
Coleman spent time there every day, chatting with bailiffs and the court reporters and playing dominoes in the clerk’s office. Although lacking a formal education, Coleman eventually became the county’s chief engineer, whose work crews often included convict labor, which he supervised. One night in August 1959, a black prisoner turned violent and, arming himself with broken bottles, refused to surrender peacefully and moved toward Coleman, who fired his shotgun, killing the man. Since this was clearly a case of self-defense, Coleman was never charged. Indeed, he was treated by local police as a hero, a role he liked. He eventually became an unpaid “special deputy sheriff,” formed close friendships with Selma Sheriff Jim Clark and other law enforcement agents, and was proud that his son joined them by becoming a state trooper. He claimed never to have joined the Klan but he was active in the local White Citizens Council and, like the Klan, saw himself as a defender of the Southern way of life. For Coleman, those who challenged Southern mores, like Jon Daniels, were public enemy No. 1.

"The food is vile and we aren't allowed to bathe (whew!)," Jon Daniels wrote his mother on Aug. 17, his third day in jail. "But otherwise we are okay. Should be out in 2-3 days and back to work. As you can imagine, I'll have a tale or two to swap over our next martini." ESCRU had offered to provide the bail to free Daniels but he refused because it wasn't sufficient to cover his 19 colleagues also imprisoned. Then, suddenly, on Friday, Aug. 20, Daniels was informed that their bail had been waived and they were free to go. Hayneville's mayor, apparently fearing federal intervention, had ordered their release.

One of the first to learn the news was Tom Coleman, who was at the courthouse playing dominoes with friends. Believing that the notorious Stokely Carmichael was now on the loose and trouble was sure to ensue, Coleman got his 12-gauge automatic shotgun and went immediately to the local grocery, where he offered to protect Virginia Varner, a longtime friend who owned the Cash Store. Coleman was misinformed -- Carmichael's bail had been paid two days earlier and he was no longer in Hayneville.

At 3 o'clock, Daniels and his friends gathered outside the jail, happy to see that everyone had survived their squalid captivity without the customary beatings or worse. They asked the jailer to protect them but he refused, urging them to get off the county's property. So they began to walk toward the courthouse square a few blocks away. Obviously SNCC headquarters had not been informed of their release, so Willie Vaughn went in search of a phone. Spotting the Varner Cash Store with its large Coca-Cola signs, Daniels, Rev. Morrisroe and two African-American girls, Ruby Sales, 20, and Joyce Bailey, 19, walked toward the store to get a cold drink and something to eat. It did not occur to the tired and hungry activists that an interracial group, even one so young and harmless, might incite local racists.

Jon Daniels opened the screen door for Ruby Sales and the two came face to face with Tom Coleman. “[G]et out, the store is closed,” he yelled. "[G]et off this property or I’ll blow your god-damned heads off, you sons of bitches.” Daniels pulled Ruby Sales behind him and tried to talk to the angry man. He was polite and, with his clerical collar, did not look like someone who posed a threat. But, without further words, Coleman fired his 12-gauge automatic shotgun, blowing a hole in Daniels’ chest, killing him instantly.
Richard Morrisroe grabbed Joyce Bailey's hand and they turned to run, but Coleman fired again, hitting the priest in the back and side, seriously but not critically injuring him. After threatening to kill others who approached, Coleman put down his weapon, drove to the sheriff’s office and telephoned Al Lingo, Alabama’s safety director. “I just shot two preachers,” he told him. “You better get on down here.”

Daniels’ and Morrisroe’s friends held a rally later that night. “We’re going to tear this county down,” a saddened and angry Stokely Carmichael said. “Then we’re going to build it back brick by brick, until it’s a fit place for human beings.” Since March, four civil rights workers had been murdered — Jimmie Lee Jackson, Rev. Jim Reeb, Viola Liuzzo and now Jonathan Daniels. Soon Carmichael’s fury would result in the organization of a separate political party in Lowndes County: Its symbol was the black panther, its slogan, “Power for Black People.”

To the citizens of Lowndes County, Tom Coleman was a hero -- “a hell of a nice guy,” people said. County Solicitor Carlton Purdue said that Coleman “was like the rest of us. He’s strong in his feelings.” Tom Coleman and his family were “all good friends” of his, he told reporters who had returned to Lowndes County to cover another murder trial. “If [Daniels and Morrisroe] had been tending to their own business, like I was tending to mine, they’d be living and enjoying themselves." These attitudes may explain why the Lowndes County grand jury charged Coleman not with first- or second-degree murder and attempted murder in Morrisroe’s case, but with manslaughter and assault and battery. Alabama’s attorney general, Richmond Flowers, called the grand jury’s action “an abdication of … responsibility.”

Lowndes County justice proceeded as usual, oblivious to outsiders’ criticism. In fact, the more the national media attacked Southern customs, the more its citizens embraced them. When Flowers asked Judge T. Werth Thagard, who was trying the case, for a two-month postponement until Father Morrisroe, his chief eyewitness, had recovered sufficiently to testify, the judge rejected the motion and declared, “The trial of Tom Coleman will begin tomorrow.” Flowers refused to participate, so Thagard removed him and asked Carlton Purdue and Arthur Gamble to prosecute.

When the trial began on Sept. 27, the courtroom was packed with Klansmen — including Liuzzo's killers, Collie Leroy Wilkins, Eugene Thomas and William Orville Eaton. Defense witnesses testified that Daniels threatened Coleman with a switchblade knife while Morrisroe pulled a gun, so Coleman was merely protecting himself when he shot them. The jury rejected Ruby Sales' eyewitness testimony, finding the lies more persuasive. In his closing statement, defense attorney Joe Phelps said, “You know Tom Coleman and you know he had to do what he did,” while his co-counsel added: “God give us such men! Men with great hearts, strong minds, pure souls -- and ready hands!” Coleman had a God-given right “to defend himself and his lady.”

On Wednesday, Sept. 29, just two days after the trial began, the jury began its deliberations. The “trial watchers,” awaiting the jury’s decision, were “busily talking in huddles,” not about the verdict -- which was never in doubt -- but about tomorrow’s football game between the University of Alabama and “Ole Miss.” After about 90 minutes, the jury found Coleman not guilty of all charges.
Thagard thanked the jurors who, before heading to the clerk’s office to receive their stipend, walked over to Coleman and shook his hand. One asked, "We gonna be able to make that dove shoot now, ain’t we?” The NAACP called the jury’s verdict “a monstrous farce,” which encouraged “every Alabama bigot” to declare “open season on Negroes and their white friends.” The NAACP was right — Fort Deposit citizens were now seen driving cars with bumper stickers which read "Open Season."

Despite this legal travesty, Jon Daniels did not die in vain. ESCRU and the American Civil Liberties Union filed a lawsuit that attacked the county legal system for prohibiting women and blacks from serving on juries. That October, the Justice Department intervened, ordering county officials to produce all jury records since 1915, and the results showed evidence of racial and gender discrimination and other legal improprieties. In 1966, the Federal Court of Appeals in Montgomery, Alabama, declared state laws excluding women from juries unconstitutional as well as any practice that prevented African-Americans from jury service. "The decision revolutionized the jury system in [Lowndes] County," writes historian Charles Eagles. "[And] indirectly it also promised greater protection for civil rights workers by guaranteeing that blacks would serve on the juries that indicted and tried people like Tom Coleman."

Dr. Martin Luther King called Jon Daniels' sacrifice "one of the most heroic Christian deeds of which I have heard in my entire ministry." While countless African-Americans lost their lives in the South for simply being black or fighting for the right to vote, we should not forget their white allies, like Jon Daniels, who were jailed, beaten and murdered in the fight for civil rights.
That badge brought him trouble when he returned to Selma. Waiting in line at the local post office, a man looked him over, his eye caught by the seminarian's collar and the ESCRU button. "Know what he is?" the man asked a friend. "Why, he's a white niggah." At first, Daniels was startled to see others turn to study him, obviously thinking he was one of  the pariahs the segregationists called "outside agitators." But then Daniels felt not fear but pride, wishing he could announce, "I am indeed a 'white nigger.'" Later he reflected, "I wouldn't swap the blessings He has given me. But black would be a very wonderful, a very beautiful color to be." On Palm Sunday, he and his colleagues led a group of African-Americans to Selma's St. Paul's Episcopal Church where, after some resistance, they were allowed to attend morning services -- the first church to be desegregated in Selma. But they were forced to sit in the church's last row and let white parishioners be the first to receive communion. Later Daniels was accosted in the street by a well-dressed man, perhaps a lawyer or a banker, who asked him: "Are you the scum that's been going to the Episcopal church?" Then he answered his question: "S-C-U-M. That's what you are -- you and that nigger trash you bring with you." Daniels said he was "sorry" that the man was upset by having to share his church with the blacks who shared his faith but complimented him on how well he could  spell. In August, he joined activists from King's Southern Christian Leadership Conference (SCLC) and the Student Nonviolent Coordinating Committee (SNCC) who were working in "Bloody Lowndes County," so called because of the brutal treatment blacks received there since the end of Reconstruction. Of the almost 6,000 blacks who were eligible to vote, none was registered prior to the advent of the civil rights movement in Alabama and, despite the efforts of activists who began working there early in 1965, not a single black participated in the Voting Rights March. Stokely Carmichael, SNCC's most charismatic organizer, who was building a civil rights movement in the county, aptly called Lowndes “the epitome of the tight, insulated police state.” Carmichael's target on Aug. 14 was Fort Deposit, a Klan bastion and a town that treated its black population with special cruelty. Here they would lead local youth in a protest, the first in the town's history. FBI agents on the scene urged them to cancel the demonstration. An armed mob of white men armed with guns, clubs and bottles was gathering and the Bureau refused to protect the protesters. They went forward anyway, picketing a cafe, a dry goods store and a grocery. The police soon appeared and arrested the activists, including Stokely Carmichael, Jon Daniels, four SNCC workers, a Catholic priest from Chicago named Richard Morrisoe and 17 local youths. They were charged with "resisting arrest and picketing to cause blood." The mob turned their fury on reporters observing events from the relative safety of two cars. The reporters fled after the mob attacked their cars with baseball bats as they tried to  pull them into the street. Daniels and the others were taken to the newly built county jail in Hayneville, a sleepy little town of 400, whose town square was built around a 10-foot-tall monument dedicated to county men killed during the Civil War. 
Two months earlier, its courthouse had been the site of the trial of Collie Leroy Wilkins, one of the Klansmen who killed civil rights activist Viola Liuzzo, the Detroit homemaker and mother of five who was shot while driving with a black colleague following the conclusion of the historic voting rights march from Selma to Montgomery on March 25. Townspeople resented the presence of outside reporters -- more than 50 Americans, Englishmen and Swedes -- who had invaded the town. The courtroom was filled with Klansmen who brought their wives and children to observe Imperial Klonsul Matthew Hobson Murphy defend one of their own. After a tumultuous trial, the jury was unable to reach a unanimous verdict, so the judge declared a mistrial. Klansmen whooped, stamped their feet and clapped. "I'm glad that's over," a woman told Life magazine's John Frook. "Y'all can go back North now and let us have some peace and quiet."

Virginia Foster Durr, a writer and friend of Rosa Parks who had observed the trial, did not share the woman's optimism. She sensed "there was killing in the air." Most "of the people frighten me," she later wrote friends. "They are so insane and prejudiced." She feared that "killing would strike again, for the white people of Hayneville had condoned the killing, whatever they might say."

The Wilkins trial and the coming of the civil rights movement to Lowndes County also enraged a 54-year-old Hayneville native named Tom Coleman. The old Lowndes County courthouse was his second home; his grandfather had been county sheriff early in the 20th century, and his father, superintendent of the county school system, once had an office there. So did his sister, who currently held that post, as did the current circuit court clerk, who was married to his cousin. Coleman spent time there every day, chatting with bailiffs and the court reporters and playing dominoes in the clerk's office. Although he lacked a formal education, Coleman eventually became the county's chief engineer, supervising work crews that often included convict labor. One night in August 1959, a black prisoner turned violent and, arming himself with broken bottles, refused to surrender peacefully and moved toward Coleman, who fired his shotgun, killing the man. The shooting was deemed self-defense, and Coleman was never charged. Indeed, he was treated by local police as a hero, a role he liked. He eventually became an unpaid "special deputy sheriff," formed close friendships with Selma Sheriff Jim Clark and other law enforcement agents, and was proud that his son joined them by becoming a state trooper. He claimed never to have joined the Klan, but he was active in the local White Citizens' Council and, like the Klan, saw himself as a defender of the Southern way of life. For Coleman, those who challenged Southern mores, like Jon Daniels, were public enemy No. 1.

"The food is vile and we aren't allowed to bathe (whew!)," Jon Daniels wrote his mother on Aug. 17, his third day in jail. "But otherwise we are okay. Should be out in 2-3 days and back to work. As you can imagine, I'll have a tale or two to swap over our next martini." ESCRU had offered to provide the bail to free Daniels, but he refused because it wasn't sufficient to cover his 19 colleagues also imprisoned. Then, suddenly, on Friday, Aug. 20, Daniels was informed that their bail had been waived and they were free to go. Hayneville's mayor, apparently fearing federal intervention, had ordered their release.
One of the first to learn the news was Tom Coleman, who was at the courthouse playing dominoes with friends. Believing that the notorious Stokely Carmichael was now on the loose and trouble was sure to ensue, Coleman got his 12-gauge automatic shotgun and went immediately to the local grocery, where he offered to protect Virginia Varner, a longtime friend who owned the Cash Store. Coleman was misinformed -- Carmichael's bail had been paid two days earlier, and he was no longer in Hayneville.

At 3 o'clock, Daniels and his friends gathered outside the jail, happy to see that everyone had survived their squalid captivity without the customary beatings or worse. They asked the jailer to protect them, but he refused, urging them to get off the county's property. So they began to walk toward the courthouse square a few blocks away. SNCC headquarters evidently had not been informed of their release, so Willie Vaughn went in search of a phone. Spotting the Varner Cash Store with its large Coca-Cola signs, Daniels, Rev. Morrisroe and two African-American girls, Ruby Sales, 20, and Joyce Bailey, 19, walked toward the store to get a cold drink and something to eat. It did not occur to the tired and hungry activists that an interracial group, even one so young and harmless, might incite local racists.

Jon Daniels opened the screen door for Ruby Sales and the two came face to face with Tom Coleman. "[G]et out, the store is closed," he yelled. "[G]et off this property or I'll blow your god-damned heads off, you sons of bitches." Daniels pulled Ruby Sales behind him and tried to talk to the angry man. He was polite and, with his clerical collar, did not look like someone who posed a threat. But, without further words, Coleman fired his 12-gauge automatic shotgun, blowing a hole in Daniels' chest and killing him instantly. Richard Morrisroe grabbed Joyce Bailey's hand and they turned to run, but Coleman fired again, hitting the priest in the back and side, injuring him seriously but not critically. After threatening to kill others who approached, Coleman put down his weapon, drove to the sheriff's office and telephoned Al Lingo, Alabama's safety director. "I just shot two preachers," he told him. "You better get on down here."

Daniels' and Morrisroe's friends held a rally later that night. "We're going to tear this county down," a saddened and angry Stokely Carmichael said. "Then we're going to build it back brick by brick, until it's a fit place for human beings." Since March, four civil rights workers had been murdered — Jimmie Lee Jackson, Rev. Jim Reeb, Viola Liuzzo and now Jonathan Daniels. Soon Carmichael's fury would result in the organization of a separate political party in Lowndes County: Its symbol was the black panther, its slogan, "Power for Black People."

To the citizens of Lowndes County, Tom Coleman was a hero -- "a hell of a nice guy," people said. County Solicitor Carlton Purdue said that Coleman "was like the rest of us. He's strong in his feelings." Tom Coleman and his family were "all good friends" of his, he told reporters who had returned to Lowndes County to cover another murder trial. "If [Daniels and Morrisroe] had been tending to their own business, like I was tending to mine, they'd be living and enjoying themselves." These attitudes may explain why the Lowndes County grand jury charged Coleman not with first- or second-degree murder in Daniels' death and attempted murder in Morrisroe's case, but with manslaughter and assault and battery.
Alabama's attorney general, Richmond Flowers, called the grand jury's action "an abdication of … responsibility." Lowndes County justice proceeded as usual, oblivious to outsiders' criticism. In fact, the more the national media attacked Southern customs, the more its citizens embraced them. When Flowers asked Judge T. Werth Thagard, who was trying the case, for a two-month postponement until Father Morrisroe, his chief eyewitness, had recovered sufficiently to testify, the judge rejected the motion and declared, "The trial of Tom Coleman will begin tomorrow." Flowers refused to participate, so Thagard removed him and asked Carlton Purdue and Arthur Gamble to prosecute.

When the trial began on Sept. 27, the courtroom was packed with Klansmen — including Liuzzo's killers, Collie Leroy Wilkins, Eugene Thomas and William Orville Eaton. Defense witnesses testified that Daniels threatened Coleman with a switchblade knife while Morrisroe pulled a gun, so Coleman was merely protecting himself when he shot them. The jury rejected Ruby Sales' eyewitness testimony, finding the lies more persuasive. In his closing statement defense attorney Joe Phelps said, "You know Tom Coleman and you know he had to do what he did," while his co-counsel added: "God give us such men! Men with great hearts, strong minds, pure souls -- and ready hands!" Coleman had a God-given right "to defend himself and his lady."

On Wednesday, Sept. 29, just two days after the trial began, the jury began its deliberations. The "trial watchers" awaiting the jury's decision were "busily talking in huddles," not about the verdict -- which was never in doubt -- but about the next day's football game between the University of Alabama and "Ole Miss." After about 90 minutes, the jury found Coleman not guilty of all charges. Thagard thanked the jurors who, before heading to the clerk's office to receive their stipend, walked over to Coleman and shook his hand. One asked, "We gonna be able to make that dove shoot now, ain't we?"

The NAACP called the jury's verdict "a monstrous farce," which encouraged "every Alabama bigot" to declare "open season on Negroes and their white friends." The NAACP was right — Fort Deposit citizens were soon seen driving cars with bumper stickers that read "Open Season."

Despite this legal travesty, Jon Daniels did not die in vain. ESCRU and the American Civil Liberties Union filed a lawsuit that attacked the county legal system for prohibiting women and blacks from serving on juries. That October, the Justice Department intervened and ordered county officials to produce all jury records since 1915; the records showed evidence of racial and gender discrimination and other legal improprieties. In 1966, the Federal Court of Appeals in Montgomery, Alabama, declared state laws excluding women from juries unconstitutional, as well as any practice that prevented African-Americans from jury service. "The decision revolutionized the jury system in [Lowndes] County," writes historian Charles Eagles. "[And] indirectly it also promised greater protection for civil rights workers by guaranteeing that blacks would serve on the juries that indicted and tried people like Tom Coleman." Dr. Martin Luther King called Jon Daniels' sacrifice "one of the most heroic Christian deeds of which I have heard in my entire ministry."
While countless African-Americans lost their lives in the South for simply being black or fighting for the right to vote, we should not forget their white allies, like Jon Daniels, who were jailed, beaten and murdered in the fight for civil rights.

Published on August 23, 2015 06:00

Black America owes no forgiveness: How Christianity hinders racial justice

AlterNet

On the one-year anniversary of the killing of Michael Brown, an 18-year-old black teenager, by a (now confessed racist) white police officer named Darren Wilson in Ferguson, Missouri, Brown's mother, Lezley McSpadden, was asked if she forgave Darren Wilson for his cruel and wanton act of legal murder. She told Al Jazeera that she will "never forgive" Darren Wilson and that "he's evil, his acts were devilish."

Her response is unusual. Its candor is refreshing. Lezley McSpadden's truth-telling reveals the full humanity and emotions of black folks, and by doing so defies the norms which demand that when Black Americans suffer they do so stoically, and always in such a way that forgiveness for racist violence is a given, an unearned expectation of White America.

The expectation that black people will always and immediately forgive the violence done to them by the State, or by individual white people, is a bizarre and sick American ritual. The necropolis of black bodies in the Age of Obama provides many examples of the ritual. Less than a month after her son Samuel Dubose was executed by a thug cop, his mother, Audrey Dubose, was asked during a press conference if she forgave Ray Tensing. She answered, "I can forgive him. I can forgive anybody. God forgave us." After Dylann Roof massacred nine black Americans in a Charleston, South Carolina, church, their families were asked to forgive the white racist terrorist.

Rituals reinforce social norms, values, and beliefs. Rituals can empower some groups and individuals; rituals can also serve to weaken and oppress others. The ritual of immediate and expected black forgiveness for the historic and contemporary suffering visited upon the black community by White America reflects the complexities of the color line. Black Americans may publicly—and this says nothing of just and righteous private anger, upset, and desire for justice and revenge—be so quick to forgive white violence and injustice because it is a tactic and strategy for coping with life in a historically white supremacist society. If black folks publicly expressed their anger and lack of forgiveness at centuries of white transgressions, they could be, and were, beaten, raped, murdered, shot, stabbed, burned alive, run out of town, hanged, put in prison, locked up in insane asylums, fired from their jobs, stripped of their land, and kicked out of schools. Even in the post-civil rights era and the Age of Obama, being branded with the veritable scarlet letter of the "angry" black man or "angry" black woman can significantly reduce a person's life opportunities.

The African-American church is also central to the black American ritual of forgiveness. A belief in fantastical and mythological beings was used to fuel struggle and resistance in a long march of liberation and dignity against white supremacy, injustice, and degradation. The notion of "Christian forgiveness" as taught by the black church could also be a practical means of self-medication, one designed to stave off existential malaise, and to heal oneself in the face of the quotidian struggles of life under American Apartheid. Likewise, some used Christianity and the black church to teach passivity and weakness in the face of white terrorism because some great reward supposedly awaits those who suffer on Earth.
The public mask of black forgiveness and peace was also a tool used during the long Black Freedom Struggle as a means of demonstrating the honor, humanity, dignity, and civic virtue of black Americans -- a group who wanted only the civil rights they had already paid for in blood (and free labor).

The ritual of immediate and expected black forgiveness fulfills the expectations of the White Gaze and the White Racial Frame. A lack of empathy from White America towards Black America is central to the ritual: if white folks could truly feel the pain of black people (and First Nations, Hispanics and Latinos, and other people of color) in these times of meanness, cruelty, and violence, then immediate forgiveness would not be an expectation. Many white Americans actually believe that black people are superhuman, magical, and do not feel pain. This cannot help but factor into the public ritual of black people saying "I forgive" the violence visited upon them by white cops, paramilitaries, hate mongers, bureaucrats, and the State.

Whiteness is central here too. Whiteness imagines itself as benign, just, and innocent. Therefore, too many white people (especially those who have not acknowledged, renounced, and rejected white privilege) view white-on-black racial violence as some type of ahistorical outlier: not part of a pattern, merely a punctuation or disruption in American life, something not inherent to it, and thus not a norm of the country's social and political life. Here, the ritual of African-American forgiveness allows White America absolution and innocence without having to put in the deeds and necessary hard work for true justice, fairness, and equal democracy on both sides of the color line.

The black forgiveness ritual's heaviest anchor is white anxiety and fear. As I wrote in an earlier piece, White America is deeply terrified, and has been since before the Founding, of black righteous anger, and of the possibility that white people in this country would be held accountable for the actions done both in their name and for their collective benefit against black people. This ahistorical and delusional dread (when in fact white Americans are experts in the practice of collective violence against people of color; the reverse has never been true) was summoned in the antebellum period by worries of "slave revolts." It still resonates in the 20th and 21st centuries with white racial paranoia about "ghetto" or "black" riots, as well as the persistent bugaboo that is "black crime." When black people say "we forgive," it is a salve for those white worries and fears.

The absurdity and uniqueness of black Americans being expected to immediately forgive the crimes and harm done to them by white people is highlighted precisely by how (White) America, both as an aggregate and as individuals, is not burdened with such a task. One of the greatest privileges that comes with being "white" in America is the permission and encouragement to hold onto a sense of injustice, grievance, anger, and pain. Consider the following. The family of Kathryn Steinle—whose death is the macabre subject of Donald Trump's race-baiting obsession with "illegal" immigrants from Mexico—has not been publicly asked to forgive Francisco Sanchez, the man who killed her. The families of the children murdered by the gun-toting mass shooter Adam Lanza in Newtown, Connecticut, were not publicly asked during a press conference if they forgave the killer.
The families of the 70 people wounded and 12 killed by James Holmes in a Colorado movie theater were not asked during a press conference to publicly forgive him. And of course, the families of those killed on September 11, 2001, when agents of Al Qaeda attacked the United States, were not asked several weeks after the event if they forgave Osama bin Laden and his agents. More than ten years after that fateful morning when the United States was attacked by Al Qaeda—and an era of national derangement and perpetual war was ushered into being—there are survivors who still will not forgive those who wrought devastation onto their lives. Some of them shared with the National Catholic Review how they continue to nurture their anger:

Mr. Haberman admitted, "That's a tough one for me. When I sit in court with these guys, can I forgive them? I have a hard time. I mean, they don't want my forgiveness. I think justice is the word."

Dorine and Martin Toyen of Avon, Conn., lost their daughter Amy, 24, in the World Trade Center. She was engaged to be married. "Her whole life was taken away from her," said Ms. Toyen. "There is no way I could ever forgive them." Mr. Toyen concurred. "I want justice, not forgiveness," he said. "I'm still very bitter. Rage." If the accused "are found guilty, then I would have no qualms with the death penalty."

Ms. Noeth said the death penalty would be too easy. "The people that we lost suffered a lot more than that. I think they deserve as much pain as can possibly be inflicted on them."

If a reporter or other interviewer publicly asked the people who had their loved ones stolen from them on 9/11, at Newtown, or in Littleton whether they forgave the monsters who hurt them so deeply, said person would (rightfully) be derided, mocked, and likely fired. (White) Americans are not expected to forgive those who transgress against them. Black Americans who have lost their loved ones to police thuggery, violence, or other types of white-on-black racial terrorism and murder should be allowed the same latitude and freedom of expression and feeling. Of course, they are not—such a right exists outside the ritual that is Black America's expected forgiveness for all the racist grievances and wrongs suffered by it.

This public ritual is a performance. It gives the white American public what it expects while concealing the true and private feelings of many black Americans, a people who are not foolish or naïve enough to perpetually forgive, forget, and turn the other cheek when faced with perpetual abuse. Perhaps one day there will be a moment when a black American who has suffered unjust loss and pain will tell the reporter who immediately asks them, "Do you forgive the thug cop or racial terrorist who killed your unarmed child/friend/brother/sister/husband/wife?" and they will reply, "Hell no! Not now, not ever, and you can go fuck yourself for asking such a question." We are allowed to dream. Such a moment of honesty and sharing would be a true step forward for racial justice and respect across the color line, as opposed to the charade and Kabuki-like theater that now passes for the obligatory and weak "national conversation on race" that the American people have been repeatedly afflicted with in the post-civil rights era.

Published on August 23, 2015 05:00