Lily Salter's Blog, page 1007

August 24, 2015

The line-up for the first week of “The Late Show with Stephen Colbert” was just announced — and you’re going to want to see this

"The Late Show With Stephen Colbert" lineup is the gift that keeps on giving. Two weeks ago, it was announced that presidential candidate Jeb Bush would be joining George Clooney for the first (Sept. 8) episode -- at which point, Salon declared the lineup officially and "totally bonkers." Today, CBS has ushered in even more big, marquee-names for the first week of the show. That list includes actors, politicians and even tech CEOs. Below, the official roster for Sept. 8-11: September 8:  George Clooney, Republican presidential candidate Jeb Bush Musical performance: Jon Batiste and Stay Human September 9:  Scarlett Johansson, SpaceX and Tesla Motors CEO Elon Musk Musical performance: Kendrick Lamar September 10:  Uber CEO Travis Kalanick Musical performance: Toby Keith September 11: Amy Schumer, Stephen King Musical performance: Troubled Waters"The Late Show With Stephen Colbert" lineup is the gift that keeps on giving. Two weeks ago, it was announced that presidential candidate Jeb Bush would be joining George Clooney for the first (Sept. 8) episode -- at which point, Salon declared the lineup officially and "totally bonkers." Today, CBS has ushered in even more big, marquee-names for the first week of the show. That list includes actors, politicians and even tech CEOs. Below, the official roster for Sept. 8-11: September 8:  George Clooney, Republican presidential candidate Jeb Bush Musical performance: Jon Batiste and Stay Human September 9:  Scarlett Johansson, SpaceX and Tesla Motors CEO Elon Musk Musical performance: Kendrick Lamar September 10:  Uber CEO Travis Kalanick Musical performance: Toby Keith September 11: Amy Schumer, Stephen King Musical performance: Troubled Waters"The Late Show With Stephen Colbert" lineup is the gift that keeps on giving. Two weeks ago, it was announced that presidential candidate Jeb Bush would be joining George Clooney for the first (Sept. 8) episode -- at which point, Salon declared the lineup officially and "totally bonkers." Today, CBS has ushered in even more big, marquee-names for the first week of the show. That list includes actors, politicians and even tech CEOs. Below, the official roster for Sept. 8-11: September 8:  George Clooney, Republican presidential candidate Jeb Bush Musical performance: Jon Batiste and Stay Human September 9:  Scarlett Johansson, SpaceX and Tesla Motors CEO Elon Musk Musical performance: Kendrick Lamar September 10:  Uber CEO Travis Kalanick Musical performance: Toby Keith September 11: Amy Schumer, Stephen King Musical performance: Troubled Waters"The Late Show With Stephen Colbert" lineup is the gift that keeps on giving. Two weeks ago, it was announced that presidential candidate Jeb Bush would be joining George Clooney for the first (Sept. 8) episode -- at which point, Salon declared the lineup officially and "totally bonkers." Today, CBS has ushered in even more big, marquee-names for the first week of the show. That list includes actors, politicians and even tech CEOs. Below, the official roster for Sept. 8-11: September 8:  George Clooney, Republican presidential candidate Jeb Bush Musical performance: Jon Batiste and Stay Human September 9:  Scarlett Johansson, SpaceX and Tesla Motors CEO Elon Musk Musical performance: Kendrick Lamar September 10:  Uber CEO Travis Kalanick Musical performance: Toby Keith September 11: Amy Schumer, Stephen King Musical performance: Troubled Waters


Antibiotic-resistant superbugs found lurking in 1 in 5 conventional ground beef samples

If that raw hamburger meat you bought to cook for dinner hasn't given you a stomach ache yet, this might: according to a Consumer Reports investigation, store-bought ground beef is teeming with dangerous bacteria, including "superbugs" resistant to three or more classes of antibiotics, as well as a whole lot of poop. That's a big problem, the report warns, because of Americans' penchant for under-cooked meat.

But the study, which analyzed 300 packages of meat purchased from grocery, big-box and natural food stores across 26 U.S. cities, found some important differences depending on how the beef was raised: either conventionally -- in grain and soy feedlots where feed is supplemented with antibiotics and other growth-promoting drugs -- or what the report terms "sustainably," meaning, in this case, that no antibiotics were used, a category that can also include organic or grass-fed cattle.

According to the researchers, conventionally raised samples turned out to have more bacteria in general. And 18 percent contained at least one strain of bacteria resistant to the drugs most commonly used in human medicine, compared to just 9 percent of more sustainably raised samples and 6 percent of grass-fed.

[Consumer Reports chart: Where superbugs lurk]

The use of low-level antibiotics on feedlots -- including some that are important to human medicine -- may be responsible for the discrepancy. The overall higher amount of bacteria in conventional beef, Consumer Reports suggests, may be a function of the conditions in which the animals are raised, in cramped, feces-ridden, stress-producing spaces; fed, with diets that can include "candy, chicken coop waste and the slaughterhouse remains of pigs and chickens"; and slaughtered, in a rapid manner that can increase the odds of contamination.

It's all reason, per Consumer Reports, to look at labels and strive to purchase sustainably raised beef "whenever possible." Doing so won't ensure a safe meal, however. While a higher proportion of conventional beef tested positive for superbugs, each and every sample tested by Consumer Reports was found to contain enterococcus and/or nontoxin-producing E. coli, which signify fecal contamination. And 10 percent of all samples contained a strain of S. aureus that can make you sick even if you fully cook your meat.

[Consumer Reports chart: How much bacteria]

Anyone else craving a veggie burger?


August 23, 2015

The porn women actually want to watch

AlterNet

It’s no secret that most of the porn out there is filmed through a male lens. That’s not to say women aren’t watching porn; some are. But there’s a lot of debate surrounding what porn for women actually looks like. The average person may envision a scene spotlighting intimacy, passion and connection, and they wouldn’t be wrong. But they wouldn’t be right, either.

About a year ago, the Pornhub analytics team decided to examine the viewing habits of their female viewers. The 2014 results confirmed what most of us already knew: plenty of women like porn. But one little detail may surprise some people. It turns out that women like lesbian porn, specifically. Pornhub’s lesbian category was ranked number one among female viewers. Gay male porn came in second place. For men, the gay category falls into seventh place, a noteworthy finding since, as Pornhub writes, “this category ranks higher with the sex opposite to that which this type of content is intended for." Next came the teen, for-women and ebony categories.

This year’s results provide us with even more insight. While the lesbian category still holds the number-one spot, and gay (male) number two, the number-three spot now goes to the big dick category. “Squirting” fell to the number 10 spot.

The statisticians also looked into what search terms grew the most in popularity among female users over the course of the past year. Searches for “Real celebrity sex tape” grew by 1028%. The team suspects Kim Kardashian’s Paper Mag cover had something to do with it.

But perhaps the more interesting growth comes from specifically female-focused searches, like “man eating pussy,” “guy eating girl out,” “guy fingering pussy” and “hardcore pussy licking,” all of which saw substantial growth. It seems like women like watching guys (and girls) eat pussy. A lot, and a lot more than men do. The team found that searches for “guy licking pussy” and “pussy licking orgasm” were conducted 722% and 934% more often by women than by men. It might also be worth noting that women who use Pornhub search for the terms “squirting orgasm,” “daddy,” “big dick” and “big black dick” significantly more than men do.

So where in the world are all these porn-watching women? All over, it seems. As of 2015, 24% of the world’s Pornhub visitors are women, a full percentage point higher than last year. When it comes to the United States, the states with the highest proportion of female viewers may surprise you. Mississippi came in at number one, Georgia at number two and South Carolina in the number-three slot. North Dakota, Vermont and Alaska had the lowest proportions of female viewers. Who knows, maybe hotter temperatures make for hornier women. Maybe it’s something else.

The 2015 findings also included a category the 2014 results skipped over: time spent per visit. The Pornhub statisticians found that women tend to spend more time watching porn while on the site than their male counterparts. The average woman spends a “lavish” 10 minutes and 10 seconds per visit, compared to men who spend just 9 minutes and 22 seconds browsing the Pornhub selection. Filipino women, on average, indulge with an impressive 13 minutes and 31 seconds on the site, while Russian women keep it rushed with a modest 8 minutes and 2 seconds. Russia was the only country listed in which the women spent less time watching porn than the men.

While women in Vermont may not watch as much porn as those in the southern states, the ones who do like to take their time. Ladies in Vermont spend an average of 12 minutes watching porn, the longest reported timeframe in the country. Maybe we can chalk it up to the fact that women may take longer to reach orgasm. Maybe it’s because it’s not hard for (some) women to achieve multiple orgasms. Or maybe it’s simply because women like to take their time.

And how should we explain the rest of these results? Of course, “different strokes for different folks” applies, but the universality of some of these trends can’t be ignored. It isn’t hard to explain the fact that women favor scenes involving cunnilingus. As many as 70% of women need clitoral stimulation to achieve an orgasm. Oral sex is a great means for delivering said stimulation. Lesbian scenes often provide a focus on female pleasure that’s notably absent in many of the hetero scenes out there.

But gay male sex? The answer to this is open-ended, but an erect penis is an erotic image for many women. The image of a man becoming aroused while experiencing the sensation of being penetrated may be something some women find worth watching. It's not often heterosexual women get to identify so carnally with the ones they lay with. Other theories hold that gay porn features fewer degrading acts inflicted on women, because there are no women. It's also possible that gay (and lesbian) scenes simply provide a perfect platform for equal-opportunity orgasms to take place. Because the only thing that's sexier than watching someone get off is watching everyone getting off.

The data provides a lot to talk about. But perhaps the most compelling bit is that, just as not all little girls like Barbie dolls, not all women like cute, cuddly sex. Quite the opposite, actually. Female sexuality is complicated and varied, and it can keep up with even the most ambitious of male fantasies.


I’m black, but I’m uncomfortable around black people

It happened. I failed the “black” test. My hair stylist and I were chatting while she was taking a break from retightening my locs. I made a funny quip, and she extended her palm so that we could partake in the standard Black American handshake. In what was most likely the longest three seconds in the universe, I stared at her hand in befuddlement, trying to figure out what she was doing. By the time I realized that this was the handshake, it was too late. I tried to recover with some weird amalgamation of a fist bump and a high five, but the damage had been done. I had revealed myself to be the Carlton to her Fresh Prince.

I replayed the scene over and over in my head during my walk to the train. How could I have been so oblivious to an obvious cultural norm? This set off a mini existential crisis where I came to one of my greatest philosophical epiphanies: I’m uncomfortable around black people. This is a peculiar realization being that I am also a black person.

But you see, my stylist embodies a certain Harlem black cool I’ve always been told (by white people) that I lack. Every time I walk into the black barbershop where she does hair, I feel like I’m going to be “found out.” In my mind when other black people see me, they’re thinking: “She may look black, but she’s not black black, if you know what I mean.”

Where does this discomfort come from? And why do I think of Blackness as a test I am doomed to fail?

Like most psychological problems, it all began in my childhood, specifically the eight years I spent living in all-white towns in rural Wisconsin. If there was one phrase I heard more than “nigger,” it was “You’re not black.” Talk about irony.

Sometimes it was phrased as a “compliment,” meaning you’re one of the good black people. But other times it was meant so white people, whose sole interaction with black culture came through the distorted lens of racist media, could assert their own twisted version of blackness over me.

“I’m blacker than you because I know more Tupac songs than you.”

“You’re not black. Your lips aren’t even that big.”

“You’re not even that black. Look, my ass is fatter than yours.”

“I know so many white girls that can gangsta walk better than you.”

“You’re not black, you can’t even dance!”

It didn’t surprise me that Rachel Dolezal truly thought she was black. I’ve long known that, for many white people, being black is simply checking off a list of well-worn stereotypes.

I always brushed off those comments, because I knew I was black enough to be called “nigger.”  I was black enough that white people stared at me everywhere I went in those lily-white towns. And I was black enough to be accused of stealing during shopping trips.

But if you hear something enough, it can seep into your unconscious and start to guide your decisions. Somewhere along the way I started believing that I wasn’t black enough, whatever that meant. This is the clusterfuck of all realizations: Racism made me uncomfortable around my own people. Ain’t that some shit?

And it even affected my college experience. I never applied to any historically black colleges because I thought everyone would make fun of me because my black wasn’t cool enough. I was more comfortable with the thought of being around white people, where my blackness was for sure going to be denigrated in one form or another, than I was with the thought of being around my own people. By that time I had already accepted racism as a staple of life, but the thought of possibly being rejected by people that looked like me was too much to bear.

Recently I was hanging out with a friend who was born and raised in Harlem. For me she represents the epitome of black cool and I envy that she grew up around black people her entire life. She told me that because of her alternative interests, namely metal music, she was accused of “acting white” by her high school peers.

No black person has ever outright accused me of not being black enough, while that’s all she ever experienced as a teenager. Our childhoods couldn’t have been any more different, but we both grappled with having our own blackness invalidated by superficial parameters.

In the foreword for the book “Black Cool: One Thousand Streams of Blackness,” Henry Louis Gates, Jr. writes: “There are 40 million black people in this country, and there are 40 million ways to be black … I do not mean to suggest that we are all of us in our own separate boxes, that one black life bears no relation to another. Of course not. We are not a monolith, but we are a community.”

It’s taken some time, but now I’m aware that there is no “black test” and that, even though I’m more Carlton than Fresh Prince, my blackness is still valid. My hair stylist doesn’t see me as some racial imposter. To her, I’m just some weirdo who doesn’t know how to do a proper handshake. Resisting the temptation to police my own blackness and the blackness of others has been a gradual process, but a necessary one.

And who knows what I’ve missed out on? How many friends I could’ve made, how many organizations I didn’t join out of fear. For years I isolated myself from the community that Henry Louis Gates, Jr. talks about, keeping potential sources of emotional support at arm’s length. And with new hashtags popping up every day, strong emotional support systems are needed more than ever.

White supremacy takes on many forms. It’s most visible as the daily physical assault on black lives. But we shouldn’t underestimate the psychological effects of something as seemingly simple as how we define what it means to be black.



You can’t make “Game of Thrones” on a YouTube budget: Why “it’s the best of times and the worst of times” for prestige TV

One subject that’s developed a kind of consensus around it – something fanboys and sophisticates, couch potatoes and cineastes, optimists and pessimists alike can agree on – is the superiority of today’s television. The business changed after “The Sopranos” in a way that allowed all kinds of good things to happen, especially on cable and streaming services, where shows as different as “The Wire,” “Orange Is the New Black,” “Breaking Bad,” “Mad Men” and “Game of Thrones” have dealt with serious social problems, showcased first-rate acting, looked at history in fresh ways, and accomplished all kinds of other storytelling feats that TV has rarely done as well. Some of these shows have been compared to serious novels and the great work of cinema’s auteurs.

Brett Martin, who wrote the definitive book – “Difficult Men” – on what his subtitle calls TV’s “creative revolution,” says things seem to still be going in the right direction. “It’s a lot of talented people doing a lot of great work,” he told Salon. “It’s giving money to people with talent and vision.”

But can the golden age last forever? Some observers wonder. At the recent television press tour, FX Networks CEO John Landgraf warned that we may be reaching peak TV, with the television business “in the late stages of a bubble. We’re seeing a desperate scrum — everyone is trying to jockey for position. We’re playing a game of musical chairs, and they’re starting to take away chairs.” The stock prices for television and media companies have fallen substantially in recent weeks, and insiders worry about cord-cutting, un-bundling, online viewing and piracy.

All of this sounds abstract, but the ability of networks to fund ambitious shows – most of them with very high production costs – has to do not just with a creative revolution, but an economic one. People pay for television in a way they rarely do for other forms of culture, and when they stop, so will the flow of great programming. We spoke to Robert Levine, author of “Free Ride: How Digital Parasites are Destroying the Culture Business and How the Culture Business Can Fight Back,” about why TV has thrived, what makes HBO different, and the threats to the next generation of “Game of Thrones” or “Mad Men.”

Let’s start out with the fact that television seems like the most successful cultural form these days. How has it done better – made more money, employed creative people, improved at the level of quality – in a way that movies, books, music and other genres haven’t?

The reason TV has done the best is because of the way it’s sold. If you think of the way books or movies or music is sold, the dominant mode was always – and still is – that you buy what you want to see. In TV, you buy all of it. If you [previously] bought four albums, you just then buy two. It was tricky, until recently – and still is – to buy half as much cable because it’s packaged. People complain about it: I pay a lot for ESPN and I’m not sure I’ve ever turned it on. But that created an incentive for [cable companies] to create shows with a lot of appeal for a small audience. You’ve had upward competition instead of down.

Many of the people running entertainment corporations came from the television divisions: It’s a reflection of how dominant TV became. These were once movie studios that picked up television channels: Now they’re entertainment conglomerates fueled by television. And movies are a small subset of that. Television drives revenue and profit.

Let’s break it down for a minute to one show. HBO has had an enormous popular and critical hit with “Game of Thrones.” What needed to go right, during this precarious period for other kinds of culture, for that show to be made? It’s got an enormous production budget by the standards of 20 years ago.

You’re not buying HBO by the episode. You’re not even buying it by the show. Their primary method of doing business is you buy all of HBO. That gives them an incentive – like the larger incentive for cable – to produce something a small number of people love, not something a large number of people like. I would pay for HBO just for “Game of Thrones”; they’re sort of in the business of taking moonshots. Or “True Detective” – a lot of people feel very strongly about it, a lot of people don’t like it… But they don’t have to produce something that a lot of people will watch so they can sit through ads. They have to produce something that a few people might love. That’s the model that drove “Game of Thrones.”

That niche model, then, is very different from the way TV worked in the ’50s, ’60s, ’70s.

Some of these things that seemed like a niche became much bigger: Take “The Walking Dead”… I don’t think people thought it would become so successful. In the ’50s, ’60s and ’70s, it was pretty simple: x was the number of people watching, y was what you charged for an ad, x times y was the money you made. You had to get as many people to watch as possible. Remember, too, the setting was different: When I was growing up, we had one TV in the den. What did we watch? It wasn’t what my dad liked best or what I liked best, but what we could all agree on. It wasn’t the lowest, but the broadest common denominator: What will the whole family tolerate? Now there’s another screen, or you can watch it later. You get more different kinds of choices. There’s more good TV, more bad TV; there’s an explosion of niches. It’s not just that TV got smarter: TV got more everything.

So it was important, for “Game of Thrones” to be possible, that it was on a network that had a lot going on, and that people couldn’t use in an a la carte way.

If you got HBO, you were paying a lot of money for a lot of programming: Some of those shows might work, some might not, but all of them helped fund the others. That’s true to some extent on every channel. But on other channels, there’s more need to justify every show: Show x needs to bring in y to sell z ads… HBO is more of a coherent whole, so there’s more of an urge to do prestige projects.

Same with Showtime and some of the other channels?

And same with Netflix and Amazon. And some of the other cable channels. HBO is now competing with Netflix, so it has to deliver more quality content than Netflix – that’s a great competition. Amazon feels like it’s in the same place, and that’s gonna get a lot more TV.

What’s next, then?

Right now, it’s the best of times and the worst of times. Over the last decade or more, you’ve had a bunch of incredible incentives to develop a lot of interesting TV for specific audiences. That’s resulted in a lot of people rushing into that business. When HBO had “The Sopranos” and “Sex and the City,” there was nothing else like them. Now there are a lot of sophisticated shows. But people only have so much time. If you look at the comments made by the head of FX – I don’t know if we’ve reached peak television, but we’ve reached peak, quote unquote, quality television. There are only so many shows I can watch where I have to keep track of dozens of characters.

And all those shows are considered more and more important; these expensive shows have good secondary markets – online, DVDs, overseas, airplanes, hotels, you name it. But how many of those can people watch? Increasingly, you see quality shows with expensive development budgets get ignored in a way that they might not have been five or 10 years ago. “Masters of Sex” is better than it is popular. “Sense8” is supposed to be really good; I’ve not gotten a chance to watch it yet. I think FX is doing a lot of interesting stuff. But it can be hard to get enough people to watch.

At the same time, you have a phenomenon where more and more people are watching this stuff online -- stuff you’re not paying extra for, like FX and AMC. If you’re watching them online, they may not be making that much money on them. It’s the best of times because the current model is robust. But it’s the worst of times because we’ve probably reached peak TV. People are watching shows in ways that don’t make those shows that much money. You saw a lot of downward movement in media stocks this week. Look at Comedy Central: “Inside Amy Schumer” is a really cool show that a lot of people are watching, but a lot of them are watching it on YouTube – that doesn’t fund the making of more shows. I don’t think anyone has really figured out what to do about that.

Is that the kind of thing that drove the stock decline?

Broadly speaking – that the cable model that seemed so strong and so good [may be weak]. A lot of younger people are not subscribing. And you can watch a lot of stuff online – perfectly legally in a lot of cases. That doesn’t bring in a lot of revenue. There may be a way to produce “Amy Schumer” for an online audience, maybe. Definitely you could not do “Game of Thrones” that way.

So what happens when HBO in five years is getting two-thirds the revenue it gets now? What happens to the expensive, high-end shows that they deliver?

The future may be pretty good for HBO. The question is, will it be as good for Showtime [and others]? I have Netflix, I have cable, I have Amazon. That’s a lot of different TV. And there’s Hulu… How many of these things are people gonna subscribe to? Besides the money, what a pain in the ass. People might want one or two things and see anything else as a hassle.

So what does that mean for the sophisticated shows that people have gotten accustomed to? Does that winnowing pose a threat to them?

Right now it’s a winner-take-all market: You’re gonna see people risking stuff. There’s that show “Vinyl,” produced by Mick Jagger and Martin Scorsese: People are gonna throw a lot of things against the wall to see what sticks. What’s gonna happen is what happened to movies: People will say, the really hard part is the marketing, let’s try to base this on a known factor. That needn’t get you “Ant-Man” -- “Better Call Saul” has a built-in audience. We’re already seeing some of this: The two biggest new AMC shows are “Better Call Saul” and “Fear the Walking Dead,” because they have a built-in audience. That doesn’t have to lead you to mediocrity, but there’s a lot of pressure… People thought “Tyrant” was a good show on FX, or this Denis Leary rock ‘n’ roll show. But it’s a lot easier to do a spinoff of something.

When you have a rush into a profitable business, you tend to get an emphasis on marketing. You can spend a lot of money, or you can ride on things people know. Does Turtle from “Entourage” get his own show? Remember “Joanie Loves Chachi”? It’s not like the characters were so unbelievably fascinating. It was, “Hey, we need to launch 10 new shows this year. The other networks are launching 10 new shows. What can we do to set ourselves apart? We can adapt something… Or find a story people already know and keep telling it.” At some point, people will stop making high-end expensive shows.

So with so much stuff out there competing for people’s time, we might get a “Game of Thrones” – which had a fanbase from the books – but we might not get a “Mad Men” or “The Wire,” which don’t have immediate name recognition.

It’s not that we won’t get it. But it will be harder. And even if you do everything right, you can have business problems.

Why?

People are watching stuff online. If you have to do these shows on a YouTube budget, it’s gonna be really hard.


The science of forgiveness: “When you don’t forgive you release all the chemicals of the stress response”

The Burn Surgeon: How Anger Can Impede Healing In 1978, Dr. Dabney Ewin, a surgeon specializing in burns, was on duty in a New Orleans emergency room when a man was brought in on a gurney. A worker at the Kaiser Aluminum plant, the patient had slipped and fallen into a vat of 950-degree molten aluminum up to his knees. Ewin did something that most would consider strange at best or the work of a charlatan at worst: He hypnotized the burned man. Without a swinging pocket watch or any other theatrical antics, the surgeon did what’s now known in the field of medical hypnosis as an “induction,” instructing the man to relax, breathe deeply, and close his eyes. He told him to imagine that his legs—scorched to the knees and now packed in ice—did not feel hot or painful but “cool and comfortable.” Ewin had found that doing this—in addition to standard treatments—improved his patients’ outcomes. And that’s what happened with the Kaiser Aluminum worker. While such severe burns would normally require months to heal, multiple skin grafts, and maybe even lead to amputation if excessive swelling cut off the blood supply, the man healed in just eighteen days—without a single skin graft. As Ewin continued using hypnosis to expedite his burn patients’ recoveries, he added another unorthodox practice to his regimen: He talked to his patients about anger and forgiveness. He noticed that people coming into the ER with burns were often very angry, and not without reason. They were, as he put it, “all burned up,” both literally and figuratively. Hurt and in severe pain due to their own reckless mistake or someone else’s, as they described the accident that left them burned, their words were tinged with angry guilt or blame. He concluded that their anger may have been interfering with their ability to heal by preventing them from relaxing and focusing on getting better. “I was listening to my patients and feeling what they were feeling,” Ewin told me. “It became obvious that this had to be dealt with. Their attitude affected the healing of their burns, and this was particularly true of skin grafts. With someone who’s real angry, we’d put three or four skin grafts on, but his body would reject them.” Whenever a patient seemed angry, Ewin would help them forgive themselves or the person who hurt them, either through a simple conversation or through hypnosis. Ewin, now eighty-eight and semiretired after practicing surgery and teaching medical hypnosis at the Tulane University School of Medicine for more than thirty years, became interested in hypnosis while he was a young doctor training under the legendary Dr. Champ Lyons, who pioneered the use of penicillin and treated survivors of the famous Cocoanut Grove nightclub fire in Boston in 1942. As Ewin learned to stabilize patients and conduct skin grafts, he wondered about an intriguing practice that he’d learned of from his great uncle. As an independently wealthy “man of leisure” in Nashville, this uncle had dabbled in hypnosis. He even held séances, which had become so popular in the late 1800s that First Lady Mary Todd Lincoln held them in the White House to attempt to reach the spirit of her dead son. (President Abraham Lincoln reportedly attended.) Many of the most popular séance leaders were eventually exposed as frauds exploiting the grief-stricken, but Ewin’s uncle found another forum for hypnosis that was less controversial than hypnotizing an audience into believing that dead friends were speaking to them. 
He hypnotized the patients of surgeon friends before they went under the knife in order to minimize their pain. (This was before anesthesia was widely used.) Ewin took a few hypnosis courses to find out more. “I figured it couldn’t hurt,” he told me in his friendly New Orleans drawl when I reached him at home by phone. Once he started trying hypnosis on his burn patients, he noticed a difference immediately. If he could reach them within half an hour of the injury, the hypnotic suggestions of “coolness and calm” seemed to halt the continued burning response of the skin that usually occurs for twelve to twenty-four hours, leading to speedier recoveries. (While there are no empirical studies of hypnosis on burn patients and Ewin’s data is anecdotal, multiple studies do show that hypnosis can alleviate symptoms and improve medical outcomes in various scenarios, from asthma and warts to childbirth and post-traumatic stress disorder.)

Once Ewin began helping his patients forgive, he noticed even more improvement. “What you’re thinking and feeling affects your body,” he would explain to his patients, using the analogy of something embarrassing causing someone to blush. “What you’re feeling will affect the healing of your skin, and we want you to put all your energy into healing.” At this point, he would learn how the victim had unthinkingly opened a blast furnace without turning it off, or how the workmen at a construction site had repeatedly told the boss about a dangerously placed can of gasoline, to no avail. “I’d do hypnosis with them and help them forgive themselves or the other person,” Ewin said. “I’d say, ‘You can still pursue damages through an attorney. You’re entitled to be angry, but for now I’m asking you to abandon your entitlement and let it go, to direct your energy toward healing, and turn this over to God or nature or whoever you worship. It’s not up to you to get revenge on yourself or someone else. When you know at a feeling level that you’re letting it go, raise your hand.’ Then I’d shut up, they’d raise their hand, and I’d know that skin graft was gonna take.” Ewin taught other burn doctors what he discovered, and has received letters from colleagues in burn units around the world thanking him for helping them achieve faster recovery times for their patients.

The Investor Turned Research Patron: How Forgiveness Hit Mainstream Science

Like Dabney Ewin, John Templeton was a son of the South, a man of letters who came of age during the Depression and combined his success with less mainstream pursuits. Born to a middle-class family in Winchester, Tennessee, in 1912, Templeton managed to put himself through Yale after the 1929 stock market crash and became a Rhodes Scholar at Oxford. He launched his career on Wall Street by taking the “buy low, sell high” mantra to the extreme, borrowing money at the onset of World War II to buy one hundred shares each in 104 companies selling at one dollar per share or less, including 34 companies that were in bankruptcy. He reaped a healthy profit on all but four. Templeton entered the mutual funds business in the fifties, eventually selling his Templeton Funds to the Franklin Group in 1992. 
Money magazine called him “arguably the greatest global stock picker of the century.” Yet Templeton was equally passionate about spirituality, morality, and science, and how the scientific method could increase our understanding of life’s “Big Questions"—questions about the nature of consciousness and the role that love and creativity, compassion and forgiveness, play in all areas of human life. In 1987, Templeton founded the John Templeton Foundation, dedicated to funding scientific research “on subjects ranging from complexity, evolution, and infinity, to creativity, forgiveness, love, and free will.” With the motto “How little we know, how eager to learn,” Templeton sought research grantees who were “innovative, creative, and open to competition and new ideas.” Templeton announced the Campaign for Forgiveness Research in 1997, a funding initiative for scientists in multiple disciplines who were interested in taking forgiveness out of the purview of religion and using rigorous scientific protocol to determine its effects on the body and mind. Spearheading the campaign was Dr. Everett Worthington, a psychology professor at Virginia Commonwealth University. One of the first psychologists to create therapeutic tools using forgiveness, he came to the topic through personal tragedy: His elderly mother was bludgeoned to death by an intruder, and, in part because of her death, his brother committed suicide. Struggling with rage and grief, Worthington switched his focus from marriage counseling to forgiveness. He designed a research framework for the Campaign for Forgiveness Research, Archbishop Desmond Tutu became a cochair for the campaign, and the Templeton Foundation provided a $5 million grant. Between 1998 and 2005, the foundation, along with thirteen partners including the Fetzer Institute, a Michigan-based nonprofit that funds research and educational projects focused on love and forgiveness, dedicated $9.4 million to 43 scientific studies on the health impacts of forgiveness. Whereas before, Worthington and a few other researchers were alone in their pursuits (and most of their research was aimed at affirming their own therapeutic models), the Campaign for Forgiveness Research took a traditionally religious concept and placed it firmly on the scientific landscape. In addition to funding researchers directly, the campaign sparked dialogue and interest in the broader scientific community. While in 1998 there were 58 empirical studies on forgiveness in the research literature, by 2005, when the campaign concluded, there were 950. Throughout the process, Templeton was highly engaged. Even into his eighties, he was known to walk waist-deep in the surf for an hour near his Bahamas home each morning before sitting down to read grant proposals. When he died at ninety-five, he was lauded by both the business and scientific communities. The Wall Street Journal called him the “maximum optimist,” whose confidence in rising stocks paid off and whose philanthropy left an enduring legacy. 
The leading scientific journal Nature wrote, “His love of science and his God led him to form his foundation in 1987 on the basis that mutual dialogue might enrich the understanding of both.” While it’s up for debate whether the research Templeton funded has enriched our understanding of God, it certainly has enriched our understanding of forgiveness, demonstrating that what was traditionally seen as a religious ideal is actually an important skill for anyone, whether atheist, agnostic, or believer, who seeks to live a healthy, happy life.

The Science of Forgiveness

One of the researchers who participated in the Campaign for Forgiveness Research was Dr. Robert Enright, a developmental psychologist at the University of Wisconsin–Madison. Enright began contemplating forgiveness back in the mid-eighties. As a Christian, he’d been raised on Jesus’ teachings about tolerance and forgiveness. He asked himself: Could forgiveness help patients in a clinical setting? In spite of skeptical colleagues who ridiculed him for applying science to something so “mushy” and “religious,” he designed forgiveness interventions for therapy and studied their psychological and physiological impacts. He began by developing therapies aimed at helping elderly women to forgive those who had wronged them in the past, and at helping victims of abuse and incest to understand their tormentors without justifying the abusers’ actions. His initial findings were encouraging. His first study, which compared women undergoing forgiveness therapy with a control group who underwent therapy for emotional wounds without a forgiveness focus, found that the experimental group improved more in emotional and psychological health measures than the control group. It was published in the journal Psychotherapy in 1993. Afterward, Enright honed his therapeutic forgiveness tools, from helping people develop empathy—the ability to understand and share the feelings of another—toward aggressors, to learning to forgive and accept themselves, and tested them on a range of groups. Among battered women and “parental love–deprived college students,” for instance, those subject to forgiveness therapy showed more improvement in emotional and psychological health than control groups who received therapy without a forgiveness focus.

Enright’s forgiveness model has four parts: uncovering your anger, deciding to forgive, working on forgiveness, and discovery and release from emotional prison. All take place through therapist-patient dialogue. Uncovering anger means examining how you’ve both avoided and dealt with it, and exploring how the offense and resulting anger have changed your health, worldview, and life in general. The second phase, deciding to forgive, involves learning about what forgiveness is and what it’s not, acknowledging that the ways you’ve dealt with your anger up until now haven’t worked, and setting the intention to forgive. Next, working on forgiveness entails confronting the pain the offense has caused and allowing yourself to experience it fully, then working toward developing some level of understanding and compassion for the offender. 
The final phase includes acknowledging that others have suffered as you have and that you’re not alone (for some, this means connecting with a support group of people who have endured a similar experience), examining what possible meaning your suffering could have for your life (learning a particular life lesson, perhaps contributing to one’s strength or character, or prompting one to help others), and taking action on whatever you determine to be your life purpose. Since developing that therapy model and pioneering the first studies, Enright and his colleagues have found positive results in drug rehabilitation participants (less anger, depression, and need for drugs compared to the control group receiving standard therapy), victims of domestic violence (decreased anxiety, depression, and post-traumatic stress disorder relative to the control group), and terminally ill cancer patients (more hope for the future and less anger than the control group). When it comes to determining the existence of a causal relationship between forgiveness and physical health, Enright says the most definitive study he has done was conducted with a team of researchers on cardiac patients. Published in 2009 in the journal Psychology & Health, their analysis found that when cardiac patients with coronary heart disease underwent forgiveness therapy, the rate of blood flow to their hearts improved more than that of the control group, which received only standard medical treatment and counseling about diet and exercise. “It wasn’t that they were cured—these were patients with serious heart problems,” Enright says. “But they were at less risk of pain and sudden death.” Those results echo studies by another Templeton grantee, Charlotte Witvliet, a psychology professor at Hope College; and Sonja Lyubomirsky, a psychology professor at the University of California, Riverside, and author of numerous books on happiness, which found that people who forgive more readily have fewer coronary heart problems than those who hold grudges. Perhaps the most comprehensive body of evidence showing links between forgiveness and health focuses on mood, says Dr. Frederic Luskin, the cofounder of the Stanford Forgiveness Project, an ongoing series of workshops and research studies at Stanford University. Researchers who measure emotional and psychological health outcomes following therapy that includes forgiveness are quantifying patients’ levels of anger, anxiety, and depression, concluding in multiple studies that forgiveness elevates mood and increases optimism, while not forgiving is positively correlated with depression, anxiety, and hostility. Like Enright, Luskin has developed ways to teach forgiveness in various places and with various groups, including war-ravaged populations in countries such as Northern Ireland and Sierra Leone, and he asserts that anyone—from jilted spouses to widows who have lost husbands to terrorism—can heal. Luskin developed a weeklong “forgiveness training” delivered in a group setting. In it, he leads participants through a series of discussions and exercises. The first steps involve teasing apart what he calls “your grievance story,” which is usually formed by taking something personally that wasn’t necessarily personal, and then blaming someone for your feelings. 
His argument is that when you blame someone for how you feel instead of holding them to account for their actions, you keep yourself stuck in victimhood and inaction (resenting your ex for her drinking and destructive behavior, for instance, instead of just seeking a restraining order). Luskin has participants “find the impersonal in the hurt” by realizing how many other people have experienced a similar offense or disappointment and how common it is, as well as acknowledging that most offenses are committed without the intention of hurting anyone personally. (If your mother yelled at you, for example, she likely did so not because her goal was to hurt your feelings and forever damage your self-confidence, but because she was stressed or afraid.) This doesn’t negate that often there is a personal aspect to an offense, Luskin says, but it can lessen the pain and blame. “When you don’t forgive you release all the chemicals of the stress response,” Luskin says. “Each time you react, adrenaline, cortisol, and norepinephrine enter the body. When it’s a chronic grudge, you could think about it twenty times a day, and those chemicals limit creativity, they limit problem-solving. Cortisol and norepinephrine cause your brain to enter what we call ‘the no-thinking zone,’ and over time, they lead you to feel helpless and like a victim. When you forgive, you wipe all of that clean.”

One of the main areas funded by the Templeton grant was the neuroscience of forgiveness. Around the time of the award, functional MRI, or fMRI, scanners were becoming increasingly common and sparking new discoveries in a variety of areas. The machines enable neuroscientists to capture images of people’s brains in action to observe blood flow and see which brain components are activated in which situations. In 2001, Dr. Tom Farrow of the University of Sheffield in the United Kingdom used fMRI scanners to conduct the first scientific study of the “functional anatomy” of forgiveness. Using ten subjects, he had each person climb into his laboratory’s fMRI scanner and asked them to answer a series of questions designed to evoke empathy and forgiveness. The empathy-related questions asked participants to consider potential explanations for someone’s emotional state (if your boss is unusually quiet or withdrawn, for instance, is it more likely that her child was expelled from school or that her child was caught shoplifting?), while the forgiveness-related questions asked people to evaluate which crimes they considered more forgivable (a neighbor who recently lost his job getting arrested for assaulting his girlfriend or for assaulting his boss?). Farrow and his team found that empathy and “forgivability judgments,” basically contemplating whether a certain action deserves forgiveness, activate various parts of the frontal lobe, which is associated with problem-solving and reason. In contrast, a researcher named Dr. Pietro Pietrini at the University of Pisa in Italy showed in a 2000 fMRI study that anger and vengeance inhibited rational thinking and caused high activity in the amygdala, which is involved in the fight-or-flight response. Anger and rage, then, impede reason, but the tasks involved in the complex process of forgiveness activate the more recently evolved parts of our brain, such as the prefrontal cortex and posterior cingulate, which are concerned with problem-solving, morality, understanding the mental states of others, and cognitive control of emotions. 
Having cognitive control means inhibiting impulsive reactions fueled by rage and hatred toward a wrongdoer. This can be done through thought, such as by devising a new, less upsetting interpretation of a painful event. When it comes to being hurt, this can mean viewing an infraction as less personal than you thought, or developing an understanding of someone’s actions by considering his point of view. Psychologists call this “reframing” a painful memory. It’s a key part of both Enright’s forgiveness therapy and Luskin’s forgiveness training. Taking things less personally is something I realized would benefit me and reduce a lot of my suffering. In my new relationship with Anthony, for instance, I would sometimes feel hurt when he teased me about something, whether my penchant for driving under the speed limit or the time I left a steak to thaw on the counter and his hundred-pound dog easily ate it. When I realized that he didn’t mean to hurt my feelings and was just making a good-natured joke, I was less likely to take offense and get upset. Another way to reframe is to consider a range of possible points of view that led someone to act a certain way. This makes it more difficult to blame and demonize that person and continue generating the same level of resentment as you did before. I once spent days feeling resentful about a former editor’s criticism about a story—which I thought was harsh and took personally. When a colleague suggested that what he said likely came from a deep commitment to accuracy and excellence, I let it go and felt a lot better. A third way to reframe is to consider what constructive learning, meaning, or opportunity may have resulted from an offense and the suffering it caused. For Azim, that was the opportunity to work with youth and prevent violence, and for my more mundane example about editorial feedback, it was a lesson about being more diligent in checking my facts and considering my approach to a story. Thanks to fMRI scanners, we can now identify the parts of the brain that make this sort of reframing practice possible. In one study, Farrow focused on two groups of people who struggle with empathy and, by extension, forgiveness: schizophrenics and people suffering from post-traumatic stress disorder. Both showed inhibited activity in the areas of the brain involved in forgiveness processes such as empathy and viewing another person’s perspective. But after ten weeks of therapy that included the discussion and practice of forgiveness (and antipsychotic drugs for the schizophrenics), those brain areas’ functions improved. While Farrow didn’t use a control group to isolate and test the therapeutic forgiveness intervention specifically, the findings confirm the earlier evidence of so-called forgiveness areas in the brain, and show that psychological treatments such as cognitive behavioral therapy can improve this aspect of brain function. In a separate experiment, Pietrini asked ten participants to lie in the scanner and consider a fictional scenario in which they were wronged and then forgave. As with the prior studies, both the dorsal prefrontal cortex (involved in cognitive control) and the posterior cingulate (involved in understanding the mental states of others) lit up on the screen. But a third part was also involved: the anterior cingulate cortex, which mediates the perception and the suppression of moral pain (such as the feeling of being wronged). Pietrini’s interpretation? Forgiveness could be viewed as a sort of painkiller for moral distress. 
When Pietrini presented his findings at a 2009 conference, he described them as evidence that forgiveness likely evolved as a way to overcome pain and alleviate suffering, and that even though it involves parts of the brain responsible for reason, it also requires a counterintuitive, and some would argue, irrational, choice: “You wronged me, but I forgive you, anyway.” “A great deal of evidence converges suggesting that forgiveness is a positive, healthy strategy for the individual to overcome a situation that otherwise would be a major source of stress from a psychological and neurobiological point of view,” he wrote to me in an e-mail. “The fact that forgiving is a healthy resolution of the problems caused by injuries suggests that this process may have evolved as a favorable response that promotes human survival.” From "TRIUMPH OF THE HEART: Forgiveness in an Unforgiving World" by Megan Feldman Bettencourt. Published by arrangement with Avery, an imprint of Penguin Publishing Group, a division of Penguin Random House LLC. Copyright © 2015 by Megan Feldman Bettencourt.











Published on August 23, 2015 13:00

Robots are coming for your job: We must fix income inequality, volatile job markets now — or face sustained turmoil

Global warming in and of itself isn’t a problem. After all, life on earth has survived numerous cycles of cooling and heating. The real problem with global warming is how quickly it happens. If there isn’t enough time for living things (including us) to adapt, rapid changes in climate, not to mention more volatile weather patterns, can sow havoc. The consequences of catastrophic climate change can reverberate for centuries as species suffer horrific losses of their habitat, leading to mass extinctions. The impact of technological change on our labor markets works the same way. As long as change is gradual, the markets can respond. Too fast, and it’s chaos. And as with my particular environmental preferences, it creates winners and losers. The likely accelerating effect of recent advances in artificial intelligence on technological change is going to roil our labor markets in two fundamental ways. The first is the simple truth that most automation replaces workers, so it eliminates jobs. That means fewer places for people to work. This threat is easy to see and measure— employers roll in a robot and walk a worker to the door. But sometimes change is less visible. Each new workstation may eliminate the need for one-fifth of a salesperson, or free Skype calls may allow you to work more productively at home one day a week, deferring the need for that new hire until next quarter. If this happens slowly, the resulting improvements in productivity and reduced cost eventually create wealth, stimulating job growth that compensates for the losses. The growth may be directly in the newly improved enterprise, as lower prices and better quality increase sales, creating a need to hire more workers. Or it may be in distant parts of the economy where the customers who no longer need to pay as much for some product or service decide to spend the money they saved. If new drilling technologies cause natural gas prices to drop, there’s more left over from your paycheck to save for that sailboat you’ve got your eye on. But the second threat is much more subtle and difficult to predict. Many technological advances change the rules of the game by permitting businesses to reorganize and reengineer  the  way they operate. These organizational and process improvements often make obsolete not only jobs but skills. A teller may get laid off when a bank installs ATMs; the improved service creates a need to hire network engineers but not tellers. Even if the bank ultimately expands its total workforce, the tellers remain out of luck. Weavers can eventually learn to operate looms, gardeners to service lawnmowers, and doctors to use computers to select the right antibiotics—once they accept that synthetic intellects are superior to their own professional judgment. But learning the new skills doesn’t happen overnight, and sometimes the redundant workers simply aren’t capable of adapting—that will have to wait for a new generation of workers. For an example of labor market transformation that we have weathered successfully, consider agriculture. As recently as the early 1800s, farms employed a remarkable 80 percent of U.S. workers. Consider what this means. Producing food was by far the dominant thing people did for a living, and no doubt this pattern had been typical since the invention of agriculture about five thousand years ago. But by 1900, that figure had dropped in half, to 40 percent, and today it’s only 1.5 percent, including unpaid family and undocumented workers. 
Basically, we managed to automate nearly everyone out of a job, but instead of causing widespread unemployment, we freed people up for a host of other productive and wealth-producing activities. So over the last two centuries the U.S. economy was able to absorb on average about 1/2 percent loss of agricultural job opportunities each year without any obvious dislocations. Now imagine that this had happened in two decades instead of two centuries. Your father worked on a farm, and his father before him, as far back as anyone could remember. Then a Henry Ford of farming revolutionized the entire industry in what seemed like a flash. The ground shook with the rumble of shiny new plows, threshers, and harvesters; the air was thick with the smell of diesel. Food prices plummeted, and corporations bought up farmland everywhere with the  backing  of deep-pocketed Wall  Street financiers. Within a few years, your family’s farm was lost to foreclosure, along with every possession except the family Bible. You and your five brothers and sisters, with an average third-grade education, found your skills of shoeing horses, plowing straight furrows, and baling hay utterly useless, as did all of your neighbors. But you still had to eat. You knew someone who knew someone who operated one of the new machines twelve hours a day in return for three squares, who supposedly got the job in Topeka, so you moved to one of the vast tent cities ringing the major Midwestern cities in the hope of finding work—any kind of work. Before long, you got word that your parents sold the Bible to buy medicine for your youngest sister, but she died of dysentery anyway. Eventually you lost track of the rest of your other siblings. The 1 percent who still had jobs lived in tiny tract houses and barely got by, but they were nonetheless the envy of the rest—at least they had a solid roof over their heads. Each day, you waited in line outside their gated communities hoping for a chance to wash their clothes or deliver their bag lunches. Rumors spread that the daughters of the storied entrepreneur who changed the world had used his vast fortune to build a fabulous art museum made of crystal in a small town in Arkansas. But all this was before the revolution. After that, things got really bad. I’m going to argue that a similarly tectonic shift looms ahead, though doubtlessly less dramatic and more humane. Forged laborers will displace the need for most skilled labor; synthetic intellects will largely supplant the skilled trades of the educated. When initially deployed, many new technologies will substitute directly for workers, getting the job done pretty much the same way. But other innovations will not only idle the workers; they will eliminate the types of jobs that they perform. For example, consider the way Amazon constantly adapts the stock patterns in its warehouses. If a person were to do the warehouse planning (as in many more traditional companies), products might be organized in a logical and comprehensible way—identical items would be stored next to each other, for example, so when you needed to pick one, you knew where it was. But a synthetic intellect of the sort Amazon has built isn’t subject to this constraint. Like items can be located next to others that are frequently shipped with them, or on any shelf where they fit more compactly. To the human eye, it looks like chaos—products of different sizes and shapes are stacked randomly everywhere—which is why this type of warehouse organization is known as chaotic storage. 
But a synthetic intellect can keep track of everything and direct a worker to exactly the right place to fulfill an order far more efficiently than a human organizer could. A side effect of introducing this innovation is that it reduces the training and knowledge required of warehouse workers, making them more susceptible to replacement by forged laborers. These employees no longer have to be familiar with the location of products on the shelves; indeed, it would be near impossible to do so in such a haphazard and evolving environment. Having first simplified the skills required to get the job done, Amazon can now replace the workers that roam the warehouse floor picking those orders. This is likely why the company bought the robotics company Kiva Systems, reportedly for $775 million, in 2012. This is a single example of a profound shift that synthetic intellects will cause in our world. The need to impose order—not only for warehouses but for just about everything—is driven by the limitations of the human mind. Synthetic intellects suffer no such constraint, and their impact will turn tidiness to turmoil in many aspects of our lives. Our efforts to tame our intellectual and physical domains into manicured gardens will give way to tangled thickets, impenetrable by us. When most people think about automation, they usually have in mind only the simple replacement of labor or improving workers’ speed or productivity, not the more extensive disruption caused by process reengineering. That’s why some jobs that you might least expect to succumb to automation may nonetheless disappear. For instance, studies often cite jobs that require good people skills or powers of persuasion as examples of ones unlikely to be automated in the foreseeable future. But this isn’t necessarily the case. The ability to convince you that you look terrific in a particular outfit is certainly the hallmark of a successful salesperson. But why do you need that person when you can ask hundreds of real people? Imagine a clothing store where you are photographed in several different outfits, and the images are immediately (and anonymously, by obscuring your face) posted to a special website where visitors can offer their opinion as to which one makes you look slimmer. Within seconds, you get objective, statistically reliable feedback from impartial strangers, who earn points if you complete a purchase. (This concept is called “crowdsourcing.”) Why put your faith in a salesperson motivated by commission when you can find out for sure? Reflecting these two different effects of automation on labor (replacing workers and rendering skills obsolete), economists have two different names for the resulting unemployment. The first is “cyclical,” meaning that people are cycling in and out of jobs. In bad times, the pool of people who are between jobs may grow, leading to higher unemployment. But historically, as soon as the economy picks up, the idled workers find new jobs. Fewer people are unemployed and for shorter periods of time. This works just like the housing market: in a slow market, there are more houses available and the ones that are take longer to sell. But when the market turns around this excess inventory is quickly absorbed. I was surprised to learn just how much turnover there is in the U.S. labor market. In 2013, a fairly typical year, 40 percent of workers changed jobs. That’s a very fluid market. By contrast, less than 4 percent of homes are sold each year. 
So when we talk about 8 percent unemployment, it doesn’t take long for small changes in the rates of job creation and destruction to soak that up, or conversely to spill more people out of work. The other type of unemployment is called “structural,” which means that some group of unemployed simply can’t find suitable employment at all. They can send out résumés all day long, but no one wants to hire them, because their skills are a poor match for the available jobs. The equivalent in the housing market would be if the types of houses available weren’t suitable for the available buyers. Suddenly couples start having triplets instead of single kids and so need more bedrooms, or people start commuting to work in flying cars that can take off only from flat rooftops, while most houses have pitched roofs. As you can see from my fanciful examples, the factors that change the desirability of housing don’t usually change very fast, so builders and remodelers have plenty of time to adapt. But this isn’t true for automation because the pace of invention and the rate of adoption can change quickly and unpredictably, shifting the character of whole labor market segments far more rapidly than people can learn new skills—if they can be retrained at all. We’re buffeted about by these fickle winds precisely because they are hard to anticipate and virtually impossible to measure. Economists and academics who study labor markets  have a natural bias toward the quantifiable. This is understandable, because to credibly sound the alarm, they must have the hard data to back it up. Their work must stand up to objective, independent peer review, which basically means it must be reduced to numbers. But as I learned in business, spreadsheets and financial statements can capture only certain things, while trends that resist reduction to measurement often dominate the outcome. (Indeed, there’s an argument to be made that the troublesome and unpredictable business cycles plaguing our economy are largely driven by the fact that returns are easily quantified, but risks are not.) I can’t count the number of meticulously detailed yet bogus sales projections I’ve seen bamboozle management teams. At work I sometimes felt my most important contribution as a manager was anticipating that which had yet to manifest itself in quantifiable form. But talking about the overall labor market, unemployment statistics, or the aggregate rate of change obscures the reality of the situation because the landscape of useful skills shifts erratically. The complexity of this web of disappearing labor habitats and evolving job ecosystems resists analysis by traditional mathematical tools, which is why attempts to quantify this whole process tend to bog down in reams of charts and tables or devolve into hand-waving. Luckily I’m not bound by these same professional constraints, so fasten your seat belt for a quick tour of the future. My approach will be to look at some specific examples, then attempt to reason by analogy to get a broader picture. Let’s start with retail—the largest commercial job market, as determined by the U.S. Bureau of Labor Statistics (BLS). The BLS reports that about 10 percent of all U.S. workers are employed in retailing, or approximately 4.5 million people. To analyze trends, let’s use salespersons as a proxy for the whole group. The BLS projects that this labor force, which stood at 4.4 million in 2012, will grow by 10 percent to 4.9 million over the next ten years. 
But this is based on current demographic trends, not a qualitative analysis of what’s actually going on in the industry. To get a sense of what’s really going to happen, consider the effect on employment of the transition from bricks-and-mortar stores to online retailers. A useful way to analyze this is to use a statistic called revenue per employee. You take the total annual revenue of a company and divide it by the number of employees. It’s a standard measure of how efficient a company is, or at least how labor-efficient. Average revenue per employee for Amazon (the largest online retailer) over the past five years is around $855,000. Compare that to Walmart (the largest bricks-and-mortar retailer), whose revenue per employee is around $213,000—one of the highest of any retailer. This means that for each $1 million in sales, Walmart employs about five people. But for the same amount of sales, Amazon employs slightly more than one person. So for every $1 million in sales that shift from Walmart to Amazon, four jobs are potentially lost. Now, both companies sell pretty much the same stuff. And Walmart does a good portion of its sales online as well, so the job loss implied by the shift to online sales is understated. And neither company is standing still; both are likely to grow more efficient in the future.

Excerpted from "Humans Need Not Apply: A Guide to Wealth and Work in the Age of Artificial Intelligence" by Jerry Kaplan, published by Yale University Press. Copyright © 2015 by Jerry Kaplan. Reprinted by permission of the publisher. All rights reserved.
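To make the back-of-the-envelope arithmetic in the excerpt concrete, here is a minimal Python sketch that reproduces two of Kaplan's calculations: the average yearly decline in agriculture's share of U.S. employment, and the jobs supported per $1 million in sales at Walmart versus Amazon. Every number is one quoted above; the variable names and print formatting are illustrative, not from the book.

    # Back-of-the-envelope checks of figures quoted in the excerpt above.
    # All inputs come from the text; names are illustrative only.

    # Agriculture's share of U.S. employment: ~80% in the early 1800s, ~1.5% today.
    start_share, end_share = 0.80, 0.015
    years = 200  # roughly two centuries
    avg_annual_decline = (start_share - end_share) / years
    print(f"Average annual decline in farm employment share: {avg_annual_decline:.2%}")

    # Revenue per employee: Amazon ~$855k, Walmart ~$213k (five-year averages cited above).
    amazon_rev_per_employee = 855_000
    walmart_rev_per_employee = 213_000

    def jobs_per_million(revenue_per_employee):
        """Employees supported by each $1 million in annual sales."""
        return 1_000_000 / revenue_per_employee

    walmart_jobs = jobs_per_million(walmart_rev_per_employee)  # ~4.7
    amazon_jobs = jobs_per_million(amazon_rev_per_employee)    # ~1.2
    print(f"Walmart: {walmart_jobs:.1f} jobs per $1M in sales")
    print(f"Amazon:  {amazon_jobs:.1f} jobs per $1M in sales")
    print(f"Jobs potentially lost per $1M shifting online: {walmart_jobs - amazon_jobs:.1f}")

Run as-is, this prints roughly 0.39 percent per year and about 4.7 versus 1.2 jobs per $1 million (a difference of about 3.5), consistent with the excerpt's "about five people," "slightly more than one person," and "four jobs are potentially lost."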
But this is based on current demographic trends, not a qualitative analysis of what’s actually going on in the industry. To get a sense of what’s really going to happen, consider the effect on employment of the transition from bricks-and-mortar stores to online retailers. A useful way to analyze this is to use a statistic called revenue per employee. You take the total annual revenue of a company and divide it by the number of employees. It’s a standard measure of how efficient a company is, or at least how labor-efficient. Average revenue per employee for Amazon (the largest online retailer) over the past five years is around $855,000. Compare that to Walmart (the largest bricks-and-mortar retailer), whose revenue per employee is around $213,000—one of the highest of any retailer. This means that for each $1 million in sales, Walmart employs about five people. But for the same amount of sales, Amazon employs slightly more than one person. So for every $1 million in sales that shift from Walmart to Amazon, four jobs are potentially lost. Now, both companies sell pretty much the same stuff. And Walmart does a good portion of its sales online as well, so the job loss implied by the shift to online sales is understated. And neither company is standing still; both are likely to grow more efficient in the future. Excerpted from "Humans Need Not Apply: A Guide to Wealth and Work in the Age of Artificial Intelligence" by Jerry Kaplan, published by Yale University Press. Copyright c 2015 by Jerry Kaplan. Reprinted by permission of the publisher. All rights reserved.
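(For readers who want to check the revenue-per-employee arithmetic in the excerpt above, here is a tiny worked example using the approximate figures Kaplan cites -- roughly $855,000 for Amazon and $213,000 for Walmart. The round numbers are his, not freshly sourced data.)

```python
# Back-of-the-envelope check of the revenue-per-employee comparison above,
# using the excerpt's approximate figures (illustrative only).
revenue_per_employee = {"Walmart": 213_000, "Amazon": 855_000}

for company, rpe in revenue_per_employee.items():
    # How many employees does each $1 million in sales support?
    print(f"{company}: about {1_000_000 / rpe:.1f} employees per $1M in sales")

# Jobs implied lost when $1 million in sales shifts from Walmart to Amazon
shift = 1_000_000 / revenue_per_employee["Walmart"] - 1_000_000 / revenue_per_employee["Amazon"]
print(f"Implied job loss per $1M shifted: about {shift:.1f}")
# Prints roughly 4.7, 1.2, and 3.5 -- consistent with the excerpt's rounded
# "about five people," "slightly more than one," and "four jobs."
```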

Published on August 23, 2015 11:00

Meet the Tea Party’s evangelical quack: David Barton is Glenn Beck’s favorite “historian”

The popular dissemination of Reconstructionist ideas is evident in the framing and language used by people in the religious right, if you have an ear for it. I think of this as analogous to the way in which a New Englander can hear the difference between a Maine accent and a Boston one, or how a Southerner can tell if a speaker is from North Carolina or South Carolina; it is subtle, but it is undeniably there. There is perhaps no better example of Christian Reconstructionist influence on the broader culture than the work of Tea Party “historian” David Barton. Barton does not explicitly identify as a Christian Reconstructionist, and Christian Reconstructionists would not claim him as one of their own.2Barton does have ties to several Reconstructionist groups, including the Providence Foundation; he occasionally cites the work of Rousas Rushdoony and promotes views on race and slavery that are rooted in Rushdoony. While Barton doesn’t use the language of theonomy or postmillennialism, as we will see, he speaks of dominion, biblical law, the necessity of bringing every area of life under the lordship of Christ, and sphere sovereignty of biblically ordained institutions. He embraces the whole range of political views advocated by Reconstructionists from the right-to-life and creationism to more narrowly held positions on issues such as the history of slavery and opposition to the Federal Reserve System. As we shall see, the approach to history that has made Barton famous is rooted in Rushdoony’s biblical philosophy of history. Barton was born in 1954, raised in Aledo, Texas, and graduated from public high school in 1972, the same year his parents started a house church with Pentecostal leanings. By 1974 the church had moved into facilities that now also house the Christian school they started in 1981, as well as Barton’s organization, Wallbuilders. After high school, Barton attended Oral Roberts University, where he received a degree in religious education in 1976. Upon returning home, he became principal of Aledo Christian School until, a decade later, as he tells it in an interview, God led him to his first book by showing him the connection between the Supreme Court decisions on prayer and Bible reading and “plummeting” academic achievement scores and “soaring” student crime and immorality.
In July 1987, God impressed me to do two things. First, I was to search the library and find the date that prayer had been prohibited in public schools. Second, I was to obtain a record of national SAT scores . . . spanning several decades. I didn’t know why, but I somehow knew that these two pieces of information would be very important.
The result was his America; to Pray or Not to Pray, which is a statistical analysis of the “volume of prayers being offered” overlaid with data on a number of social problems, to compare the “prayer years with the post prayer years.” According to Barton, the drop in prayer was so dramatic that its impact was felt not just in the schools but in every aspect of our national life. Barton seemed unaware of the notion that correlation is not causation. A self-styled historian with no real academic credentials, Barton went on to build an extensive collection of primary source documents from America’s founding era and write several “Christian American history” books that argue that the founding fathers intended America to be a Christian nation and that argue for a Christian reading of the Constitution they wrote. This work has shaped a generation of Christian school and homeschool students. Despite being roundly rejected by scholars, Barton claims to be a “recognized authority in American history and the role of religion in public life.” For example, an amicus brief filed by Wallbuilders in McCollum v. California Department of Corrections and Rehabilitation claims Barton works as a consultant to national history textbook publishers. He has been appointed by the State Boards of Education in states such as California and Texas to help write the American history and government standards for students in those states. Mr. Barton also consults with Governors and State Boards of Education in several states, and he has testified in numerous state legislatures on American history. Examples include a 1998 appointment as an advisor to the California Academic Standards Commission and a 2009 appointment as a reviewer in the Texas Board of Education’s effort to revise the state’s social science curriculum. In each case, Barton was one of three conservative “outside experts” appointed to review the curriculum to ensure that children learn that America was founded on biblical principles. As “experts” they sought changes to the curriculum to ensure that Christianity was presented “as an overall force for good—and a key reason for American Exceptionalism, the notion that the country stands above and apart.” Indeed, when Barton invoked his position as a curriculum consultant on Jon Stewart’s Daily Show, Stewart asked for whom he had done this work, and Barton refused to name anyone, saying “if they don’t name names then I don’t.” In 2005 Barton was included in Time magazine’s list of the twenty-five most influential evangelicals, but it was his association with Fox News’ Glenn Beck, who called him the most important man in America, that catapulted him into another level of influence. By 2011 Barton could boast that Republican primary presidential candidates Newt Gingrich and Michele Bachmann consulted him. Bachmann even invited him to speak to her congressional Tea Party Caucus on the history of the Constitution. Mike Huckabee infamously said that “every American should be forced to listen to Barton at gunpoint.” Barton’s presentation style makes on-the-spot critical engagement difficult. He jumps, at lightning speed, from one piece of data to another, interpreted through his “biblical” framework; he creates a barrage of information, tied to small pieces of familiar truth and rooted in an apparently vast collection of primary documents. Barton is one of the very best examples of the way in which the Tea Party is about much more than taxes, and he’s been at the center of its rise. 
In addition to being promoted by Glenn Beck, he travels the country presenting his Constitutional Seminars and selling materials promoting his views to churches, civic organizations, Christian schools, and Christian homeschoolers. Barton’s work has been the subject of extensive critique by bloggers, reporters, and other critics, some of whom are scholars publishing peer-reviewed critiques, but, for the most part, scholars have not devoted a lot of attention to debunking his claims. Beginning in about 2011, two conservative Christian professors from Grove City College, Warren Throckmorton, professor of psychology, and Michael Coulter, professor of humanities and political science, published a critique of Barton’s The Jefferson Lies entitled Getting Jefferson Right: Fact Checking Claims about Our Third President. The book was received well by scholars, and the authors’ credentials as conservative Christians undermined Barton’s defense that criticism of his work was ideological rather than factual. The Jefferson Lies was withdrawn by its publisher. One might expect under the weight of such resounding rejection, Barton would disappear into obscurity. Yet Barton’s supporters remain as devoted as before. Criticism from scholars (whether Christian or not) is dismissed as liberal, socialist, and even pagan. Discredited in the larger culture, Barton remains influential in the conservative Christian subculture. Barton and the Constitution In 2011, Barton’s radio program Wallbuilders Live carried a three-part series on the Constitution and “the principles of limited government” that illustrated well how he draws the conclusions he does regarding what the Constitution meant to the founders. The spectrum of activists calling themselves “constitutionalists”—including Barton but ranging from avowed Reconstructionists to Tea Partiers who claim their movement is solely about taxes and limited government—read the Constitution in the context of the Declaration of Independence to invoke the authority of the Creator in an otherwise godless document. The first of Barton’s three-part series lays out exactly how this works. Many look at the US Constitution and see little mention of religion and wonder how conservative Christians can insist that it is a template for a Christian nation. But Barton is careful to speak, instead, of our “original national founding documents.” For Barton and his followers, the Declaration of Independence, though never ratified and carrying no legal authority, has the same status as the Constitution. Indeed, in their view, the Constitution can only be read in the context of the Declaration:
Go back to our original national document, our original founding document, the Declaration of Independence. In the first forty-six words . . . they tell us the philosophy of government that has produced America’s exceptionalism . . . two points immediately become clear in that opening statement of our first national government document. Number one, they say there is a divine creator, and number two, the divine creator gives guaranteed rights to men . . . there’s a God and God gives specific rights to men.
Barton asserts that the founders believed there were a handful of unalienable rights, the most important of which are life, liberty, and property. He occasionally acknowledges that the language in the Declaration is slightly different (life, liberty, and the pursuit of happiness), but he argues that the pursuit of happiness is grounded in property, making the two terms interchangeable. He more often uses the term “property.” These rights are understood to come directly from God, and the purpose of government (and therefore the Constitution the founders wrote) is limited to securing those rights. According to Barton, in language that became common Tea Party rhetoric, an inalienable right is “a right to which we are entitled by our all-wise and all-beneficent creator; a right that God gave you, not government.” Any other perceived rights, not understood as coming from God, cannot be legitimately protected by the civil government. This is the very point of criticism made of Supreme Court nominee Elena Kagan by Herb Titus, described earlier. Rooted in the three-part division of authority popularized by Rushdoony and the Reconstructionists, Barton argues that the Bible (which he believes the founders read in the same way he does and upon which he believes they based the Constitution) limits the jurisdiction of civil government. That life, liberty, and property are “among” the God-given rights that Barton finds in the Declaration left room for the articulation of more rights derived from God to be “incorporated” into the Constitution, most clearly in the Bill of Rights, which he calls “the capstone” to the Constitution. “They said, we’re going to name some other inalienable rights just to make sure that government does not get into these rights . . . When you look at the ten amendments that are the Bill of Rights, those are God-granted rights that government is not to intrude into.” He then offered some unique interpretations of the rights protected in the first ten amendments. The First Amendment religion clauses, for Barton, become “the right of public religious expression.” The Second Amendment right to keep and bear arms is, according to Barton, “what they called the biblical right of self-defense.” The Third Amendment prohibiting the coerced quartering of soldiers is the biblical and constitutional protection of the “the sanctity of the home.” Finally, all the protections against unjust prosecution in the Fifth Amendment are reduced to the protection of “the right to private property.” While the “limited government” enshrined in the Constitution protects basic rights given by God and precludes government from doing anything not within the purview of its biblical mandate, it also, according to Barton, prohibits abortion. Barton says that, according to the founders, the first example of “God-given inalienable rights is the right to life.” Barton claims that when the founders invoked the God-given right to life they intended to prohibit abortion. He claims that “abortion was a discussed item in the founding era.” As evidence he says, “as a matter of fact we have books in our library of original documents—observations on abortion back in 1808,” and that “early legislatures in the 1790s were dealing with legislation on the right to life on the abortion issue.” But Barton gives no examples and provides no references to any evidence. After this slippery claim, he goes on at length with quotes from founders on the right to life, none of which mention abortion. 
“They understood back then that abortion was a bad deal and that your first guaranteed inalienable right is a right to life. Consider how many other founding fathers talked about the right to life.” In another example of slipperiness, he quotes founder James Wilson: “Human life, from its commencement to its close, is protected by the common law. In the contemplations of law, life begins when the infant is first able to stir in the womb.” Realizing that this won’t do the work of banning abortion from conception, Barton redefines the question, moving the focus from the development of the fetus to what the mother “knows.” Very simply, he [Wilson] says as soon as you know you’re pregnant, as soon as you know there’s life in the womb, that life is protected by law. That’s where the technology difference is, we can know that there’s life in the womb much earlier today than what they knew back then. But the point is the same there: as soon as you know there’s a life there, it’s protected. But this is not what Wilson said, and Barton’s argument gets worse. In his view this understanding of the right to life is a bellwether for a number of other issues that are at the top of the religious right’s agenda: “Our philosophy of American exceptionalism is very simple: there is a God, he gives specific rights, [and] the purpose of government is to protect the rights he’s given.” If someone is “wrong” on “life issues,” they’re likely to be wrong on the right to self-defense (the right to own guns), the sanctity of the home (his interpretation of what it means to not have soldiers in your house), private property (his reading of the rights of the accused culminating in the protection against eminent domain), and “the traditional marriage issue” (for which he makes no connection to the founders or the Constitution). Barton’s interpretation doesn’t even resemble a close reading of the text with an eye toward the founders’ intentions—or any coherent application of the value of limited government—yet he successfully frames it as such in populist discourse. In 2011, the Ninth Circuit Court rejected an appeal challenging the policy of the California Department of Corrections and Rehabilitation that allows only leaders of “five faiths” (Protestant, Catholic, Muslim, Jewish, and Native American) to serve as paid chaplains (McCollum v. CDCR). The ruling had nothing to do with the legitimacy of the claim that the policy unconstitutionally favors some religions over others but rather whether McCollum (a Pagan minister) had standing to bring the case. An amicus brief filed by Wallbuilders in support of the CDCR to privilege the “five faiths” provides a glimpse into how Barton reads the Constitution. For him the Constitution represents a consensus—as though there is a singular view that can be attributed to “the founders.” Barton’s style of reading the Constitution is modeled on his style of reading the Bible, which he also treats as a coherent document that can be read from start to finish to yield a clear, undisputed, objective meaning, instead of a collection of fragmented texts written over a very long period of time in different cultures, assembled into larger texts, then chosen from an even larger collection of texts in a political process, translated from ancient languages, and finally interpreted in different ways by different communities. 
Every stage of that process continues to be profoundly disputed by scholars, and there is always an interpretative framework (albeit all too often an unrecognized one) underlying any reading of it. While the US Constitution is a newer document, and it is therefore somewhat less difficult to discern its meaning(s), the fact remains that it is the product of hard-fought compromise among leaders, bound in time and culture, who profoundly disagreed with each other. There is no reason to believe they thought they were writing a sacred text to which all subsequent generations of Americans were bound by a process that amounts to divining a singular “intent.” The argument Barton made in the brief, moreover, illustrates a second important point. He is being disingenuous when he insists he just wants everyone to have the opportunity to practice his religion freely. In his appearance on the Daily Show, he defended the practice of Christian religious observance in otherwise secular contexts when the majority wants it by saying that a Muslim-majority community should be able to make “Sharia law” the law of the land. There was a significant outcry from his anti-Muslim supporters, and he backtracked on the point in a subsequent episode of Wallbuilders Live. In this brief, however, he argued that only those religions that fit with what he thinks the founders meant by “religion” should be protected. Protected religion is either Christianity alone or perhaps the larger category of monotheism—Barton asserts that rights of conscience don’t extend to atheists either (and by implication also not to Buddhists and Hindus): “whether this Court agrees that ‘religion’ meant monotheism or believes that it meant Christianity . . . it is clear that atheism, heathenism, and paganism were not part of the definition of ‘religion.’” Barton has argued against the free exercise of rights of Muslims, as have other religious right promoters of Islamophobia, claiming Islam is “not a religion.” Indeed, the term “religion” does have a complicated history, and it has often been used (or denied) to legitimize dominance of one group over another. Initially Africans were said to be “without religion,” legitimizing their enslavement, and, in another example, Native Americans were considered “without religion” to justify taking their land. Barton’s brief is important because it made explicit that which he often tries to deny: that only Christianity (and maybe Judaism) is protected under his reading of the Constitution. Barton on the Free Market and Socialism On another segment of Wallbuilders Live, Barton and co-host Rick Green discussed the effort by the Obama administration to prohibit Internet service providers from charging for service based on usage (known as Net Neutrality) because it violates biblical economics and is “socialist.” It’s easy to dismiss that charge as nothing more than demagoguery, but, in fact, the discussion illustrates what they mean by socialism and, ultimately, how they understand freedom. Both points trace directly back to Rushdoony. Most of us understand socialism as a system that limits private ownership of property and in which power (political and economic) is centralized in the state; Tea Party accusations that any policy they oppose is “socialist” seem, at best, like hyperbole. 
But in Barton’s view, any move away from what he sees as an unfettered free market, any regulation or involvement on the part of government, is a move toward socialism—and of course he thinks that private ownership and free markets are biblically sanctioned. Net Neutrality prohibits ISPs from charging for Internet service based on usage. This seems straightforward to Barton and Green: “what they mean is we’re not going to let you choose who you need to charge more to.” Maybe more interesting, though, is the subsequent exchange between Rick Green and his “good friend” Texas congressman Joe Barton, who was sponsoring legislation to overturn the Obama administration’s Net Neutrality regulation. Joe Barton tried to explain Net Neutrality and, in the process, revealed important aspects of how such people understand freedom in entirely economic terms. Joe Barton says that we cannot regulate the Internet, it should be open and free. Democrats’ definition of Net Neutrality is we want to give FCC the authority to tell people who actually provide the Internet what they can and can’t do with it. Now, what people like yourself and myself mean [by freedom] is no government interference; it’s pretty straightforward. Republicans and conservatives have always tried to keep the Internet totally free. But of course they have not tried to keep it totally free, except in one very narrow economic sense. They certainly do not mean “free” in a way that includes broadly available access, because that’s socialism; “redistribution of wealth through the Internet . . . this is socialism on the Internet.” Nor do they mean free regarding content, as David Barton made explicit when he returned to the conversation at the end of the show saying, “We’re not suggesting moral license, we don’t want to have obscenity, pornography, child pornography . . . You still have moral laws to follow.” Economic freedom is nearly absolute, but it is still subordinate to moral law. At the height of the debate over the federal budget and the Tea Party demands that Congress not raise the debt ceiling during the summer of 2011, David Barton and company tackled the question posed by the “religious left” in the budget debate: What would Jesus cut? They devoted an entire episode of Wallbuilders Live to the question: “Why Do People Think Government’s Role Is to Take Care of the Poor?” The episode is promoted with the assertion that “The role of the government is not to exercise mercy, but to exercise justice. It is improper for government to take care of the poor. That is up to us, as individuals.” With guest Michael Youseff, who had recently written on his blog about the application of the Bible to government spending and the poor, David Barton and Rick Green invoked the framework for limited biblical jurisdiction developed and promoted by Rushdoony. They claimed that the Bible has “205 verses about taking care of the poor” and asserted that “only one is directed to government,” which simply requires no more than the poor be “treated fairly in court.” Barton and Green employ Rushdoony’s framework of three God-ordained spheres of authority and the view that any action on issues outside those responsibilities is tyrannical and socialist. The responsibility to take care of the poor is limited to families and churches. As we have seen, Rushdoony, Gary North, David Chilton, George Grant, and others have written on this topic. One of the more accessible places to find their view is in George Grant’s volume in Gary North’s Biblical Blueprints Series. 
Barton and Green borrow from them to assert that taking care of the poor is not the job of the government. Charity is up to individuals, families, and churches. Moreover, it should not be extended to everyone. The architects of the framework on which Barton bases his view are quite clear: biblical charity may extend to the four corners of the earth, but only to those who are in submission to biblical law as it is articulated by the Reconstructionists. Barton on Race David Barton is also the popularizer of a revisionist history of race in America that has become part of the Tea Party narrative. Drawn in part from the writings of Christian Reconstructionists, that narrative recasts modern-day Republicans as the racially inclusive party, and modern-day Democrats as the racists supportive of slavery and postemancipation racist policies. Barton’s website has included a “Black History” section for some time. Like Barton’s larger revisionist effort to develop and perpetuate the narrative that America is a Christian nation, the “Republicans-are-really-the-party-of-racial-equality” narrative is not entirely fictive. Some historical points Barton makes are true; but he and his biggest promoter, Glenn Beck, manipulate those points, remove all historical context, and add patently false historical claims in order to promote their political agenda. Barton appeared regularly on Beck’s show to disseminate his alternative reading of African American history, carrying with him, as he does, what he claims are original documents and artifacts that he flashes around for credibility. In June of 2010 I traveled to central Florida to attend a Tea Party event sponsored by the Florida chapter of Beck’s “9–12 Project” held at a Baptist church (with a Christian school that was established in the late 1970s). The church sanctuary was decked in patriotic trimmings, including eight big flags on the wall, bunting all over the altar area, and a collection of small flags on the altar itself. As I waited for the event to begin, I overheard people talking about homeschooling and David Barton’s work on “America’s Christian Heritage,” all while Aaron Copland’s “Fanfare for the Common Man” played over the sound system. For those unconvinced of the religious dimensions of at the Tea Party movement, the strain of it exhibited here was indistinguishable from the church-based political organizing efforts of the religious right dating back at least to the 1980s. As each local candidate spoke, it was clear how profoundly conservative, Republican, and Christian (in the religious right sense of Christian) this gathering was. The event was promoted as a response to charges of racism in the Tea Party movement. The banner at the entrance to the event read: “9–12 Project: not racist, not violent, just not silent anymore.” The pastor of the church introduced the meeting, the Tea Party–supported candidates for local office spoke, and all invoked “Christian American history” and the “religion of the founders.” The “9–12 Project” refers both to post-9/11 America (when “divisions didn’t matter”) and to the “nine principles and twelve values” of the group, initiated and promoted by Beck. The “principles” are a distillation of those in The Five Thousand Year Leap, a 1981 book by Cleon Skousen, which was referenced repeatedly by speakers at the event. 
The book has long been a favorite for Christian schools and homeschoolers and among Reconstructionists despite the fact that Skousen is a Mormon (perhaps because he is also a strong advocate of the free-market Austrian School of economics). I was surprised to learn that Skousen’s book was enjoying a resurgence in popularity as a result of Beck’s promotion and is available in a new edition with a preface by Beck. The fight over the degree to which America was “founded as a Christian nation” is important in that it is a fight over our mythic understanding of ourselves. That is, it is a fight over the narratives through which Americans construct a sense of what it means to be American and perpetuate that sense through the culture and in successive generations. Intended to counter the charges of racism made against the Tea Party movement, the main speaker was an African American, Franz Kebreau, from the National Association for the Advancement of Conservative People of all Colors (NAACPC). The event was in a more rural part of Florida than where I live, and I passed a number of Confederate flags on my way there. I expected an all-white crowd making arguments about “reverse discrimination,” libertarian arguments against violations of state sovereignty (especially with the Civil Rights Act), and maybe even some of the “slavery wasn’t as bad as people say” arguments. What I found surprised me. Kebreau gave a detailed lecture on the history of slavery and racism in America: a profoundly revisionist history. In Kebreau’s narrative, racism is a legacy of slavery, but it was a socially constructed mechanism by which people in power divided, threatened, and manipulated both blacks and whites. Many of the pieces of historical data he marshals in favor of this thesis are not unfamiliar to those who have studied this aspect of American history, but they are probably not as well known among Americans in general: some slave owners were black, not all slaves were black, black Africans played a huge role in the slave trade, and very few Southerners actually owned slaves. While at least most of these points are undeniably true, they were presented with a specific subtext: with the goal of lending credence to the view that contemporary critics of racism make too much of America’s history of slavery. In this view, it is Democrats who are primarily responsible for fostering racism to solidify power. Southern Democrats opposed civil rights laws, voting rights, integration, and so on. Northern Democrats fanned racial tensions by promoting social programs that made African Americans dependent on government. Race-baiting demagogues like Jesse Jackson and the Reverend Al Sharpton perpetuate the divisions today. In August of 2010, Beck held his Restoring Honor Rally, bringing many Tea Party groups—Tea Party Patriots, Freedom Works, 9–12 Project, Special Operations Warrior Foundation, and others—together at the Lincoln Memorial. While Beck initially promoted the event as a nonpolitical effort to return to the values of the founders, he claims he only realized later that he scheduled it on the anniversary of Martin Luther King Jr.’s “I Have a Dream” speech. He suggested that while he did not realize the significance of the date, “God might have had a hand” in the coincidence. Beck was criticized for both his timing and his crediting the Almighty. 
Beck fancies himself a contemporary King, “reclaiming the civil rights movement,” and while he was widely mocked for drawing this parallel, it was less recognized that he did it on a foundation laid by David Barton and his revisionist history, which relies in no small part on the work of Rushdoony. In his essay “The Founding Fathers and Slavery,” Barton quotes extensively from the writings of the founders and claims that many of them were abolitionists. He maintains that the overwhelming majority of the founders were “sincere Christians” who thought American slavery was “unbiblical,” blamed England for imposing the institution on the colonies, and set in motion processes to end it. Scholars dispense with these claims. According to Diana Butler Bass, “It was nearly universally accepted by white Christian men that the Bible taught, supported, or promoted slavery and it was rare to find a leading American intellectual, Christian or otherwise, who questioned the practice on the basis that it was ‘unbiblical.’ Some intellectuals thought it was counter to the Enlightenment.” Historian Mark Noll argues that the reverse of Barton’s view with regard to the British is correct: evangelicals in the Church of England, not in America, argued that slavery violated the Bible. Again, according to Bass, “the American biblical argument against slavery did not develop in any substantial way until the 1830s and 1840s. Even then, the anti-slavery argument was considered liberal and not quite in line with either scripture or tradition.” Another essay on Barton’s website, “Democrats and Republicans in Their Own Words: National Party Platforms on Specific Biblical Issues,” compares party platforms from 1840 to 1964—the period before Southern Democrats who blocked civil rights legislation began switching to the Republican Party. In Barton’s narrative, the modern Republican Party is the party more favorable to African Americans because the Republicans led the fight against slavery and for civil rights from the formation of the Republican Party as the “anti-slavery party” and the “election of Abraham Lincoln as the first Republican President,” to the Emancipation Proclamation, the Thirteenth and Fourteenth Amendments, the passage of civil rights laws during Reconstruction, and the election of blacks to office. Barton writes that while the Democratic Party platform was defending slavery, “the original Republican platform in 1856 had only nine planks— six of which were dedicated to ending slavery and securing equal rights for African-Americans.” Democrats, on the other hand, supported slavery, and they then sought to ban blacks from holding public office and to limit their right to vote via poll taxes, literacy tests, grandfather clauses, and general harassment and intimidation, and they established legal segregation under Jim Crow laws. Barton takes issue with the claim that “Southerners” fought for racist policies, because “just one type of Southern whites were the cause of the problem: Southern racist whites.” Rather, he argues (missing the logical inconsistency), we should lay the responsibility for racism at the feet of Democrats: Current writers and texts addressing the post-Civil War period often present an incomplete portrayal of that era . . . To make an accurate portrayal of black history, a distinction must be made between types of whites. 
Therefore, [it would be] much more historically correct— although more “politically incorrect”—were it to read: “Democratic legislatures in the South established whites-only voting in party primaries.” Because he says very little about contemporary Democrats, it’s clear that Barton’s purpose is to connect them with racist Southern Democrats, while completely ignoring the relationship of modern-day Republicans with racism. Most glaringly, the Republican “Southern strategy” is entirely missing from Barton’s account of the parties’ political strategies with regard to race. From the Johnson administration through the Nixon and Reagan campaigns, Republican strategists effectively used race as a “wedge issue.” Southern Democrats would not support efforts by the national party to secure civil rights for African Americans. By focusing on specific racial issues (like segregation), Republicans split off voters who had traditionally voted for Democrats. The contemporary “states’ rights” battle cry at the core of the conservative movement and Tea Party rhetoric is rooted in this very tactic. Barton and Beck want to rewrite American history on race and slavery in order to cleanse the founding fathers of responsibility for slavery and, more importantly, blame it and subsequent racism on Democrats. But Barton’s rewriting of the history of the founding era and the civil rights movement alone doesn’t quite accomplish that. He has to lower the bar even more and make slavery itself seem like it wasn’t quite as bad as we might think. And for that, he turns to Stephen McDowell of the Reconstructionist-oriented Providence Foundation. Wallbuilders’ website promotes a collection of “resources on African American History.” Much of the material is written by Barton himself, but one of the essays is McDowell’s, drawn almost entirely from Rushdoony’s work in the early 1960s. McDowell’s discussion of slavery, written in 2003, comes directly from Rushdoony’s The Institutes of Biblical Law. McDowell attributes his views to Rushdoony and uses precisely the language that Rushdoony used as early as the 1960s. Rushdoony’s writings on slavery are often cited by his critics. Rushdoony did argue that slavery is biblically permitted. While criticizing American slavery as violating a number of biblical requirements, he also defended it in his writings. By promoting McDowell, and by extension Rushdoony, Barton promotes a biblical worldview in which slavery is in some circumstances acceptable. This worldview downplays the dehumanization of slavery by explicitly arguing that God condones it in certain circumstances. McDowell writes that, while it was not part of “God’s plan” from the beginning, “slavery, in one form or another (including spiritual, mental, and physical), is always the fruit of disobedience to God and His law/ word,” meaning that the slave is justifiably being punished for his or her disobedience. McDowell argues that slavery is tightly regulated, though not forbidden, in the Bible, and that American Southern slavery was not “biblical” slavery because it was race-based. Following Rushdoony, he argues that there are two forms of biblically permissible slavery: indentured servitude, in which “servants were well treated and when released, given generous pay,” and slavery, in which, in exchange for being taken care of, one might choose to remain a slave. 
Moreover, he maintains that the Bible permits two forms of involuntary slavery: criminals who could not make restitution for their crimes could be sold into slavery and “pagans, [who] could be made permanent slaves.” Of course, Rushdoony defines “pagans” as simply non-Christians. This means that slavery was/is voluntary only for Christians; non-Christians can be held in nonvoluntary perpetual slavery. Barton shares this understanding of the legal status of “pagans,” at least in terms of their rights under the First Amendment. McDowell is explicit that race-based kidnapping and enforced slavery are unbiblical. In fact, they are punishable by death. All this comes directly from The Institutes of Biblical Law. McDowell argues, as did Rushdoony in the early 1960s, that while American slavery was not biblical slavery, neither was it the cause of the Civil War. The major point of dispute between North and South, they argue, was, not slavery but “centralism,”—that is, the increasing centralization of power in the federal government, an argument frequently echoed today by the states’ rights agitators and Tenth Amendment Tea Partiers. Although in one essay Barton parts company with Rushdoony and McDowell over the significance of slavery as a cause of the Civil War (Barton argues instead that slavery was a cause, in service of his argument that the present-day Republican Party is more racially inclusive than the Democrats), he nonetheless continues to promote, on his website, their view that slavery is biblical. The historical revisionism with regard to race in America that gained a hearing in the Tea Party (thanks to Glenn Beck and activists such as Franz Kebreau) is rooted in Barton’s and Wallbuilders’ writings, which have been deeply influenced by Rushdoony. Excerpted from "Building God's Kingdom: Inside the World of Christian Reconstruction" by Julie Ingersoll. Published by Oxford University Press. Copyright 2015 by Julie Ingersoll. Reprinted with permission of the publisher. All rights reserved.

Published on August 23, 2015 09:00

The 12 most ludicrous ideas about women’s health from the GOP field

AlterNet
There are 17 Republican candidates for president who get the New York Times stamp of legitimacy. In a field like that, standing out is hard. The easiest way to catch media attention---and attract voters in the notoriously conservative Republican primary voting base---is to get competitively nutters. Which most of the candidates are doing, hard, when it comes to bashing reproductive health care. It’s impossible to really hand out lifetime achievement awards when it comes to the ugliest slams against reproductive health care. But here are the worst things they’ve said recently. 1) Mike Huckabee. Huckabee is a notorious spewer of sexist garbage, but his latest---defending the Paraguayan government forcing a 10-year-old rape victim to have her rapist’s baby---is low even for him. “When an abortion happens, there are two victims,” he argued. “One is the child, the other is that birth mother, who often will go through extraordinary guilt years later when she begins to think through what happened, with the baby, with her.” Yes, he tried to argue he wants a 10-year-old to endure childbirth for her own good, lest she feel “guilt” over reneging on her Huckabee-prescribed duty to have babies for rapists. Not very convincing, that. 2) Scott Walker. During the Fox News GOP debate, Walker affirmed his support for forcing pregnant women to give birth, even if their doctors tell them it will kill them. He doubled down later in an interview with Sean Hannity, saying, “I’ve said for years, medically there’s always a better choice than choosing between the life of an unborn baby and the life of the mother.” It is true that you don’t have to choose, since Walker’s preference, doing nothing, tends to kill both a woman and her fetus. How that’s “pro-life”, however, remains a mystery. 3) Ben Carson. “It brings up a very important issue and that is do those black lives matter,” he told Fox News host Eric Bolling recently when discussing Planned Parenthood. “The number one cause of death for black people is abortion.” Undermining the Black Lives Matter movement while implying that black women are somehow race traitors because they control their own bodies? It’s a two-fer---maybe a three-fer---of the kind of viciousness that motivates the modern American right. 4) Rick Santorum. “It is not any more than the Dred Scott decision was settled law to Abraham Lincoln,” Santorum said, during the Republican debate, about a recent court decision legalizing same-sex marriage. “This a rogue Supreme Court decision.” “We passed a bill and we said, ‘Supreme Court, you’re wrong!’” he continued, citing a 2003 law he wrote that undermined Roe v. Wade. Dred Scott v. Sandford was a notorious 1857 case in which the Supreme Court ruled that black people could not be U.S. citizens. That’s right. Santorum was suggesting that denying black people their basic humanity is somehow the equivalent of letting women control their bodies or letting gay people marry for love. 5) Bobby Jindal. “Today's video of a Planned Parenthood official discussing the systematic harvesting and trafficking of human body parts is shocking and gruesome,” Jindal said in announcing an investigation of Planned Parenthood inspired by videos that have been repeatedly shown to be anti-choice hoaxes. Investigations into Planned Parenthood have found, no surprise, that there is no “trafficking of human body parts” going on. 
Jindal has yet to weigh in on what other surgeries should be banned because they are “gruesome”, a word that can be used to characterize all of them. 6) Marco Rubio. Rubio’s argument on CNN for why women should not be allowed to remove unwanted embryos from their uteruses: “It cannot turn into an animal. It can’t turn into a donkey.” “Well, if they can’t say it will be human life, what does it become, then?” he added. “Could it become a cat?” All surgery, as well as tooth removal and hair brushing, removes living human cells, aka human life. It’s not donkey. It’s not cat. Human. We look forward to Rubio’s upcoming ban on dentistry on the grounds that human life is not cat life. 7) Carly Fiorina. Fiorina considered denying her daughter the HPV vaccine, even though nearly all sexually active people will contract HPV at some point in their lives. "And she got bullied. She got bullied by a school nurse saying: 'Do you know what your daughter is doing?'" Fiorina complained at a campaign event. Sorry, Fiorina, but assuming that your kid will likely grow up and have sex one day is not bullying. Signaling to your kid that you expect her to be a lifelong virgin or risk cervical cancer? Now that’s what I’d call bullying. 8) Jeb Bush. Bush got a lot of negative attention for a campaign event where he said, “I'm not sure we need a half a billion dollars for women's health issues.” His attempt to “clarify” this, however, showed that he really does mean it. He proposes taking the money away from family planning clinics like Planned Parenthood and redirecting it to general service community health centers. Which is to say, to take away money earmarked for women’s health, forcing women to give up their gynecologists and go to general clinics instead, where they can expect longer wait times, less direct access to contraception and less access to specialized services. 9) Ted Cruz. When the hoax Planned Parenthood videos came out, Cruz floated a conspiracy theory accusing the media of censorship. “The mainstream media wants to do everything they can to hide these videos from the American people,” he argued. “And the reason is virtually every reporter, virtually every editor, virtually every person who makes decisions in the mainstream media is passionately pro-abortion.” In the real world, every major newspaper, cable news network, and many nightly news shows covered the videos. They also debunked the lies in the videos, though telling the truth is probably not what Cruz was hoping the “mainstream media” would do with these deceitful videos. 10) Donald Trump. Trump says a lot of foul things about women and about reproductive health care, including calling Planned Parenthood an “abortion factory”. But he’s probably the candidate in the race who hates reproductive health care access the least, which is a sad statement about the state of the modern GOP. 11) Rand Paul. Paul has been pushing the idea of banning Medicaid patients from Planned Parenthood and redirecting them to already overcrowded general service clinics instead. “We’ve doubled the amount of money we put into women’s health care through government, and so it’s just an absurd argument to say we need Planned Parenthood,” he argued on Fox News last week. “It’s only about abortion.” In reality, 97 percent of Planned Parenthood’s services are not abortion and 0 percent of federal money goes to Planned Parenthood’s abortion services. Nor can women just go to a community health center. 
AlterNet

There are 17 Republican candidates for president who get the New York Times stamp of legitimacy. In a field like that, standing out is hard. The easiest way to catch media attention, and to attract voters in the notoriously conservative Republican primary voting base, is to get competitively nutty. Which most of the candidates are doing, hard, when it comes to bashing reproductive health care. It's impossible to hand out lifetime achievement awards for the ugliest slams against reproductive health care, but here are the worst things they've said recently.

1) Mike Huckabee. Huckabee is a notorious spewer of sexist garbage, but his latest, defending the Paraguayan government for forcing a 10-year-old rape victim to have her rapist's baby, is low even for him. "When an abortion happens, there are two victims," he argued. "One is the child, the other is that birth mother, who often will go through extraordinary guilt years later when she begins to think through what happened, with the baby, with her." Yes, he tried to argue that he wants a 10-year-old to endure childbirth for her own good, lest she feel "guilt" over reneging on her Huckabee-prescribed duty to have babies for rapists. Not very convincing, that.

2) Scott Walker. During the Fox News GOP debate, Walker affirmed his support for forcing pregnant women to give birth even if their doctors tell them it will kill them. He doubled down later in an interview with Sean Hannity, saying, "I've said for years, medically there's always a better choice than choosing between the life of an unborn baby and the life of the mother." It is true that you don't have to choose, since Walker's preference, doing nothing, tends to kill both the woman and her fetus. How that's "pro-life," however, remains a mystery.

3) Ben Carson. "It brings up a very important issue and that is do those black lives matter," he told Fox News host Eric Bolling recently when discussing Planned Parenthood. "The number one cause of death for black people is abortion." Undermining the Black Lives Matter movement while implying that black women are somehow race traitors because they control their own bodies? It's a two-fer, maybe a three-fer, of the kind of viciousness that motivates the modern American right.

4) Rick Santorum. "It is not any more than the Dred Scott decision was settled law to Abraham Lincoln," Santorum said during the Republican debate, about the recent court decision legalizing same-sex marriage. "This is a rogue Supreme Court decision." "We passed a bill and we said, 'Supreme Court, you're wrong!'" he continued, citing a 2003 law he wrote that undermined Roe v. Wade. Dred Scott v. Sandford was a notorious 1857 case in which the Supreme Court ruled that black people could not be U.S. citizens. That's right: Santorum was suggesting that denying black people their basic humanity is somehow the equivalent of letting women control their bodies or letting gay people marry for love.

5) Bobby Jindal. "Today's video of a Planned Parenthood official discussing the systematic harvesting and trafficking of human body parts is shocking and gruesome," Jindal said in announcing an investigation of Planned Parenthood inspired by videos that have been repeatedly shown to be anti-choice hoaxes. Investigations into Planned Parenthood have found, no surprise, that there is no "trafficking of human body parts" going on. Jindal has yet to weigh in on what other surgeries should be banned because they are "gruesome," a word that can be used to characterize all of them.

6) Marco Rubio. Rubio's argument on CNN for why women should not be allowed to remove unwanted embryos from their uteruses: "It cannot turn into an animal. It can't turn into a donkey." "Well, if they can't say it will be human life, what does it become, then?" he added. "Could it become a cat?" All surgery, as well as tooth removal and hair brushing, removes living human cells, a.k.a. human life. It's not donkey. It's not cat. Human. We look forward to Rubio's upcoming ban on dentistry on the grounds that human life is not cat life.

7) Carly Fiorina. Fiorina considered denying her daughter the HPV vaccine, even though nearly all sexually active people will contract HPV at some point in their lives. "And she got bullied. She got bullied by a school nurse saying: 'Do you know what your daughter is doing?'" Fiorina complained at a campaign event. Sorry, Fiorina, but assuming that your kid will likely grow up and have sex one day is not bullying. Signaling to your kid that you expect her to be a lifelong virgin or risk cervical cancer? Now that's what I'd call bullying.

8) Jeb Bush. Bush got a lot of negative attention for a campaign event where he said, "I'm not sure we need a half a billion dollars for women's health issues." His attempt to "clarify" this, however, showed that he really does mean it. He proposes taking the money away from family planning clinics like Planned Parenthood and redirecting it to general-service community health centers. Which is to say, taking away money earmarked for women's health and forcing women to give up their gynecologists for general clinics, where they can expect longer wait times, less direct access to contraception and less access to specialized services.

9) Ted Cruz. When the hoax Planned Parenthood videos came out, Cruz floated a conspiracy theory accusing the media of censorship. "The mainstream media wants to do everything they can to hide these videos from the American people," he argued. "And the reason is virtually every reporter, virtually every editor, virtually every person who makes decisions in the mainstream media is passionately pro-abortion." In the real world, every major newspaper, cable news network and many nightly news shows covered the videos. They also debunked the lies in them, though telling the truth is probably not what Cruz was hoping the "mainstream media" would do with these deceitful videos.

10) Donald Trump. Trump says a lot of foul things about women generally and reproductive health care specifically, including calling Planned Parenthood an "abortion factory." But he's probably the candidate in the race who hates reproductive health care access the least, which is a sad statement about the state of the modern GOP.

11) Rand Paul. Paul has been pushing the idea of banning Medicaid patients from Planned Parenthood and redirecting them to already overcrowded general-service clinics instead. "We've doubled the amount of money we put into women's health care through government, and so it's just an absurd argument to say we need Planned Parenthood," he argued on Fox News last week. "It's only about abortion." In reality, 97 percent of Planned Parenthood's services are not abortion, and no federal money goes to Planned Parenthood's abortion services. Nor can women simply go to a community health center instead: when Texas defunded Planned Parenthood, there were over 63,000 fewer claims for birth control services. Community health centers try to pick up the slack, but it's more than they can handle.

12) Chris Christie. Christie's attempts to ingratiate himself with the religious right led him to start defunding Planned Parenthood in New Jersey years ago. But his enthusiasm for preventing women from using contraception stops at his bedroom door. "I'm a Catholic, but I've used birth control, and not just the rhythm method," Christie recently told a New Hampshire crowd. Birth control for me but not for thee? It's probably what all these candidates, none of whom have Duggar-size families, actually practice. But Christie doesn't get bonus points for honesty. After all, he didn't admit that this was hypocrisy, and he continues to bash Planned Parenthood every chance he gets.

There are five other white guys in the race, all eager to dump on affordable contraception services and legal abortion. But, as of now, few have shown the vim to really stand out from the crowd in their tedious denunciations of reproductive health care technologies that, in the real world, are a normal part of everyday life. But give them time. It's a long campaign season and the anti-woman competition is only just getting started.

Published on August 23, 2015 08:00

David Simon: Hyper-segregation is our national dynamic

ProPublica

David Simon's new HBO miniseries "Show Me a Hero," which premiered last Sunday, is the harrowing tale of a hopeless battle. Based on a nonfiction book of the same title, written by former New York Times reporter Lisa Belkin, the show dramatizes the real fight that took place 25 years ago in Yonkers, New York, after a federal judge ordered public housing projects to be built in the wealthier (and whiter) parts of the city. In an interview with ProPublica, David Simon discussed the legacy of the Yonkers crisis and what desegregation is all about. The transcript has been edited for clarity and length.

You said that you thought about this show many years ago. How has the project changed over time?

Very little, sadly. We optioned this book shortly after it came out [in 1999], and we were fairly certain that the dynamic of hyper-segregation was a national dynamic, that we were not just writing a story about Yonkers.

What do you mean by hyper-segregation?

White people, by and large, are not very good at sharing physical space or power or many other kinds of social dynamics with significant numbers of people of color. It's been documented time and time again. There is a great book by Andrew Hacker called "Two Nations." My God, it's almost a quarter century old, but it is an incredible primer on just how specific the desire of white America is to remain in a hyper-majority. The reason we wanted [Lisa Belkin's] book was that Yonkers was a place where the housing department actually got the housing right. They didn't overwhelm the neighborhood with a massive project or hundreds of walk-up units. They were trying to do scattered-site housing for the first time, which has been this quiet revolution in public housing. It works, it doesn't destabilize neighborhoods. But you were dealing with people who were entrenched behind the same fears as previous generations.... This project kept getting bumped a little bit to the back burner, but every time we bumped it, in talking about it with the HBO executives, we'd say, "You know what, look, it just happened to Baltimore." They tried to do the same thing with scattered housing in eastern Baltimore County, and the white folks went batshit, batshit crazy. At every point, there was a new fresh example that the dynamic was still there, that the racial pathology was still intact. And I think it has only become more pronounced.

The show was greenlit before Ferguson, before Baltimore, before Charleston. If you had written the screenplay after these events, would you have changed anything?

No, no. First of all, "Show Me a Hero" is not about police violence. It's certainly not about a white racist backlash against changing demographics, which is how I would characterize the Charleston or Lafayette shootings. Part of the implied power of the piece is we are taking you back 25 years and nothing has changed!

Lisa Belkin wrote an op-ed in The New York Times a few days ago saying she viewed Yonkers, at the time when she was doing her reporting, as a place of hope. She expected desegregation to happen around the United States as a result. That didn't happen. The NAACP didn't pursue the same cases anywhere else, nor did the Justice Department, because of horrible resources. Why do you think that happened?

Because of how blistering Yonkers was, how insanely volatile and irrational Yonkers was. You have to remember that this case was brought at the end of the Carter administration. There wasn't a single civil rights action filed by the Justice Department from 1980 to 1988 that mattered. Reagan effectively shut down the civil rights division of the DOJ. Then you had Clinton, who was doing everything he could during the Gingrich years to maneuver to the center. The reason you didn't have aggressive use of this legal precedent under Clinton is the same reason you have those omnibus crime bills that filled up prisons as fast as we can construct them. Bill Clinton's triangulation with the political center made things like fair housing prohibitive for his political priorities. We haven't seen any movement on this in any presidential administration until the last two years of Obama. They sort of opened the books on all their data to basically encourage the use of the Fair Housing Act to do precisely what they did in Yonkers. But notice that this is coming in the last two years of the administration, and it's coming as an administrative act.

In the show, no one really wants the housing either. The NAACP is already tired of the whole ordeal before any units were built.

You have to remember, they filed the case in ['80]. It was litigated. They are now in 1987, and they can't get the goddamn city to name a geographic site to build house No. 1... But the truth is, the 200 units were built. They are still there, and there has been no increase in crime in those neighborhoods as a result of it. There has been no substantive decrease in the housing values in Yonkers. There was a brief dip as there was some fundamental white flight, mostly surrounding the school desegregation portion of the civil rights suit, which is a whole other can of worms. The population in Yonkers is now probably about 56 percent white, 44 percent people of color, heavily Latino. A lot of people, with a certain amount of unknowing racial malevolence, say, "Oh, look, it was 80–20 white, now it's almost 50 percent people of color. See what happens? Look at all that white flight." But in 25 years, the population of the New York metropolitan area has been transformed. What desegregation is about is not keeping Yonkers 80–20 or 75–25. This is about the browning of America. We are becoming a less white country. The trick is, can we become more brown without destabilizing ourselves and without having gated white communities and ghettos?

Did you hear from current residents as you were filming?

We talked to all the people in the book, some of whom are still living in those townhouses. I mean, did I go take a poll of random people in Yonkers? No. But every time we set up and started filming, people would come over, and we talked to people who were unrelenting in their belief that what was done was illegal and that the judge had no right to do this and that it was an affront to their freedom and their liberty. They would come and tell us that, and we'd say, "Well, okay..."

Did anyone object to you filming at, for example, the actual city hall of Yonkers? Or that this could reopen wounds?

No, the mayor appeared with us in Yonkers. I'm sure there are people who didn't want to see the story made at all. But, you know, I'm not used to making shows that everyone agrees with, so I wouldn't know what that would feel like anyway.

With two episodes down, is there anything that viewers should remember, or have in mind, when they watch the next two on Sunday?

I certainly don't want to tell people what to watch until they watch it. Just that we were very true to the history. This is all predicated upon a 40-year history of American government at the federal, state and local level using public money to purposefully hyper-segregate our society. Poor people didn't end up all packed into housing projects in one square mile of Yonkers by accident. It was a plan. It was a plan in Chicago, in Baltimore, in Dallas and everywhere that took federal housing money since the 1930s. The records, the history of it, are in plain sight. I have nothing but contempt for anybody who says that [the racial integration of Yonkers] was social engineering by this judge. Really? You want to parse it that way? What bullshit. The social engineering begins in the 1930s, with FHA mortgages and with the first public housing monies in the New Deal. Republicans and Democrats are both complicit. The idea that the social engineering starts at the moment that somebody might want to restore somebody to their full civil rights, 40 years into the rigged game. And that's when you object? Sorry, that's racist to begin your argument there.

Published on August 23, 2015 07:30