Helen H. Moore's Blog, page 809
April 10, 2016
Erotic and empowered: “Outlander” and “The Girlfriend Experience” are turning women on with their “prestige porn”
You’d be forgiven for thinking Starz is where the soft-core pornography lives. Cinemax is really where it lives, but as with all premium cable television, the selective subscription model comes with the promise of nudity and sex. Starz is often bundled in an even more exclusive tier than HBO and Showtime. Until quite recently, the network focused on theatrical releases to draw viewers, and even now only offers a handful of carefully pruned originals, compared to, for example, Netflix’s dozens of new shows or Showtime’s long-running slate. Showcased in almost all of Starz’s programming—and also in those unedited feature films—are the kind of sex scenes that separate erotica from porn (at least, the way Sally in “Coupling” might define it). It’s high-concept and expensive, and the characters have motivations and feelings, but also, there’s a lot of sex. At first, that’s all it seems to be; just more skin, from a network that can get away with skin. HBO has “Ballers,” Showtime has “Masters of Sex,” and Starz has its own possibly unnecessary nudity, from the strangely salacious “Boss” to the masked orgies of “Spartacus.” Starz’s current flagship show, “Outlander,” is a television adaptation of a historical fiction novel that does double duty as a steamy bodice-ripping romance. And its new show “The Girlfriend Experience,” debuting Sunday, follows the life and times of a high-priced call girl. At the risk of being too vulgar, lead Riley Keough’s breasts should probably get second billing on the call sheet. But on the best Starz shows, there is something different about the nudity, something special about the sex. In 2014, Mo Ryan wrote the definitive piece on how “Outlander's” sex scenes appeared to be doing something revolutionary: “A feast for scavengers,” she called it.

Eventually, they had sex for the first time, and the whole act was amusingly devoid of cliché: no “sexy” camera angles, no golden light, no instant nirvana. Jamie wondered if they should do it like the horses in the fields; later, Claire stopped things at one point to tell him he was accidentally crushing her. It didn’t last long. It was awkward first-date sex, and it was achingly, amusingly real. And that was as refreshing as anything else.

The term “female gaze” is a bit slippery and academic; we don’t really know what it is, we just know what it is not. But at the very least, this is a sex scene built on a very feminine fantasy, and it’s very feminine wish fulfillment. It is not that far off from the heroine falling in love with and marrying the handsome prince. “Outlander's” wedding episode caters, lovingly, to at least one kind of female desire, and the result is a sex scene quite unlike any other on television. As the show has gone on, it has doubled down on reflecting the feminine perspective in a patriarchal world, ranging from sexual assault to, in this second season, bikini waxing. To quote the incomparable Jenny Slate: “'Outlander’ is insane bc its like ‘do u luv 2 b horny, everyone?’ But then its like ‘HERES A SCENE WHERE THEY WASH WOOL W HOT PISS. ITS FINE.’” A lot of entertainment is wish fulfillment; a lot of prestige television, in turn, has been about deconstructing those fantasies. With “Outlander” and now “The Girlfriend Experience,” Starz is exploring a few very different female fantasies in a few very different ways—with the intent, I’m guessing, of reaching a female audience that is underserved by other prestige cable networks. Ryan jokes in her piece about the “boob quota” that even Starz’s other shows appear to need to meet, and certainly, both “Outlander” and “The Girlfriend Experience” have an awful lot of boobs. But they are not always there for men to look at, or at least not just for that. Some other purpose is at play.
“Outlander,” whose second season started Saturday, began as one very special kind of fantasy—where a happily married woman falls through time and has no choice but to marry and then bang a hot young hunk who is desperately in love with her. In season two, following author Diana Gabaldon’s cheeky sense of humor in the source material, Claire and Jamie leave misty Scotland behind for an even more romantic locale: Versailles under Louis XV, and his jewel, Paris, the greatest European city of the era. Kilts and sporrans are exchanged for silks and carriages; “Outlander's” already gorgeous costuming and set design are practically reveling in the sumptuous trappings of Bourbon France. Claire and Jamie, of course, come upon a reliable source of wealth that keeps them in gowns and coaches, while plotting to change the course of history (literally, that is what is happening). “Outlander's” first season, in the highlands of Scotland, was rather gritty; in France, any pretense of being more than luxurious wish-fulfillment has been abandoned. Claire and Jamie take to French intrigue with surprising dexterity, with some retconned character traits and a whole lot of new characters. It’s increasingly feeling a bit like everyone they’ve ever met is related to everyone else they’ve ever met, too—Claire runs into her mother-in-law, several generations removed, in a parlor. But the fantasy is delicious—beautifully rendered and infinitely appealing. And along the way, a side of history that doesn’t often make the spotlight comes with it—the history of wives and mothers, of girlfriends and mistresses. One French dame experiences the aforementioned waxing, while her niece and friends look on; later in the season, a woman tries to end her own pregnancy with what is available to her, and finds her options lacking. Chilly, sleek, and meditative, “The Girlfriend Experience”—“suggested by,” the credits read, Steven Soderbergh’s 2008 film—could not be more different.
Where “Outlander” is almost baroque in its production and appeal, “The Girlfriend Experience” is a spare, indie affair of a modern, almost featureless Chicago—essentially trading one much-romanticized setting for another. Christine Reade (Keough) is a second-year law student drawn to the money and excitement of expensive sex work, but even before she starts working, she’s drawn to being watched. “I want you to watch me,” she tells a partner in the first episode, before fondling her own breasts and getting herself off. The tension of watching and being watched is central to the show; Christine consciously models herself into a figure that men will look at—and pay thousands of dollars a night to spend time with. At the same time, she watches them, with interest, revulsion, or detached amusement. She also watches herself. In one semiotically charged scene early in the season, she scrolls through the photo gallery from the erotic shoot she just did, looking for the right images for her website. She’s become the object of fantasy; she swipes through the images without seeming to even recognize that the girl on the screen is her. The show’s most climactic moments, throughout the 13-episode season, are when Christine is regarding herself: her own body, seen through the lens of other people’s desires and stigmas. Christine is both the object of fantasy and living out her own fantasy; it’s possible to read her story as both empowering and objectifying, with Christine as both subject and object. The title is a snarky little dig at the fantasies of these wealthy, powerful men; Christine is technically doing what they want, but she’s also walking away with stacks of cash.
The figure of the nameless escort or anonymous whore is frustratingly central to prestige drama—witness the dead bodies in “True Detective” or the hanger-on goomahs in “The Sopranos.” In “The Girlfriend Experience,” the tables are turned; it’s the clients whose pale, flabby, middle-aged bodies blend together, while Christine’s eyes are always looking, searching, examining. Keough is Elvis Presley’s granddaughter, and it is particularly noticeable in the shape and set of her brow—thrust slightly forward, alertly. “The Girlfriend Experience” isn’t perfect. Christine's motivations are sometimes opaque, and sometimes not; the plot is sometimes thrilling, and sometimes not. Where "Outlander" circumvents the narrative hurdles of fantasy by giving more and more character traits to its leads, "The Girlfriend Experience" circumvents the problems of deconstruction by making Christine more and more opaque, so that her motivations do not even seem obvious to herself, let alone the viewer. But it is riveting—and sexy—to watch Christine watch the rest of the world. What becomes apparent is that she is acutely aware of how the world works. Her life as a law intern—revealing a real legal curiosity, not just a dangled erotic fetish for her clients—exposes her to any number of power imbalances and political struggles. Her decision to leave behind the law is partly driven by her realization that she obtains more of the power she wants—attention, money, access—through her life as a sex worker. Her own exquisitely rendered interest in power dynamics mirrors, to some degree, the network’s. Starz is not so subtly going after marginalized audiences, whether that is female audiences, African-American audiences (a hugely underreported phenomenon), or that much-coveted pirate demographic. (Okay, just kidding on that last one, though “Black Sails” does have its own loyal fanbase.) Vulture’s Josef Adalian wrote last year:
[Starz CEO Chris] Albrecht decided to shift Starz’s series development to focus on projects that would speak not only to specific segments of the audience but viewer groups that, despite having plenty of options on broadcast and basic cable, hadn’t always been targeted by premium cable networks. “I looked around and … it seemed as if there were audiences that were being underserved — that were still paying money but that were probably not getting the value that they would hope to get off of a premium subscription,” he says. “And we said, ‘Let’s target those audiences, and let’s back shows that we think can drive a real fervent fan base that then becomes the kind of advocacy group for the shows themselves.'”

It’s paying off. I’m not qualified to speak for either pirates or African-Americans, but certainly, as a woman, Starz’s programming meets the apparently necessary components for a prestige show—sex, violence, expensive sets, pedigreed talent—while also delivering fantasy and deconstruction that can appeal to women viewers. (Some women viewers, anyway. This isn't a triumph for queer experiences, or for non-rich, non-white female experiences.) But it does seem significant, particularly with sexy shows like “Outlander” and “The Girlfriend Experience,” that Starz showcases its eroticism in a way that feels empathic and empowered, not exploitative. With female talent behind the camera—co-executive producer Anne Kenney and director Anna Foerster were behind that “Outlander” wedding sex scene, which of course was adapted from Diana Gabaldon’s books; Amy Seimetz writes, directs, and acts in “The Girlfriend Experience”—and the aforementioned feminine sensibilities in front, Starz’s prestige-y porn takes on a different, more democratic tone. A world where boobs are not just for all, but by all, too.






Published on April 10, 2016 14:00
Clouds won’t slow climate change: NASA survey finds their cooling effects have been overstated
To some, clouds resemble bunnies. To others, they can look like squished flowers. When scientists used NASA data to peer into clouds, what they saw resembled a hazard sign warning of a fast-deteriorating climate ahead. Analysis of the first seven years of data from a NASA cloud-monitoring mission suggests clouds are doing less to slow the warming of the planet than previously thought, and that temperatures may rise faster than expected as greenhouse gas pollution worsens—perhaps 25 percent faster. Clouds can play an important role in slowing global warming by reflecting energy back into space. As temperatures rise, clouds contain more liquid water and fewer ice crystals, making them brighter, meaning they reflect more sunlight. The new research, however, suggests climate models have overestimated how much ice is in clouds, meaning less is available to be converted to liquid as temperatures rise. “When carbon dioxide concentrations and temperatures rise, then mixed-phase clouds will increase their liquid water content,” said Ivy Tan, a PhD candidate at Yale University who led the research, which investigated common clouds that contain both ice and water. “Many models are overestimating how much ice is in the mixed-phase clouds.” The repercussions of the findings, which were published Thursday in Science, could make it harder to hold warming to limits set during recent United Nations climate negotiations—but they’re being received cautiously by other climate scientists, with questions raised over the results of the analysis. The coldest clouds are full of ice; the warmest are full of water. Modeling experiments by Tan and two other scientists focused on inbetweeners—mixed-phase clouds, such as undulating stratiform and fluffy stratocumulus clouds, which are abundant over the vast Southern Ocean and across the Northern Hemisphere at latitudes north of New York. For their study, the researchers used the NASA data to guide the modification of a popular earth model.
They added more liquid and less ice to the clouds in their model simulations, striving to create more realistic conditions. Because there was less ice, cloud brightness increased more slowly than it did in the unmodified model, since fewer ice crystals were replaced with reflective liquid as temperatures warmed. One of climate science’s great quests is to project how much earth warms when carbon dioxide concentrations double—something known as climate sensitivity. When carbon dioxide levels were doubled in the modified model, temperatures rose by at least a quarter more than they did when the unmodified model was used—to at least 5°C (9°F). What the findings might actually mean for earth will depend heavily on how much more carbon dioxide, methane and other greenhouse gases get billowed into the atmosphere, and how quickly. But the discovery suggests impacts from climate change will be worse, and that they will get worse more quickly than earth models had previously indicated. Isaac Held, a National Oceanic and Atmospheric Administration climate scientist, said he agreed with the researchers about “the importance of getting the ice-liquid ratio in mixed-phase clouds right,” but he doesn’t agree that global climate models generally underestimate climate sensitivity. Based on past observations, Held, who was not involved with the study, said the climate sensitivity of 5°C or more shown by the new research may be implausible. “Admittedly, it is a rather high estimate, which may reflect the fact that the model used is already on the sensitive side,” said Mark Zelinka, a cloud modeling expert at Lawrence Livermore National Laboratory who worked with Tan on the research. But, based on Zelinka’s interpretation of historical data, he said it “seems premature” to dismiss it as implausible. Tan, meanwhile, said it would be a mistake to focus too closely on the exact number.
The sensitivity result from the modeling experiments should be taken “with a grain of salt,” she said. That’s because the study was based on a single model. A main point in conducting the experiments was to show that climate models contain a bias that could be corrected. The group hopes other scientists will conduct similar experiments using different models to help home in on a more reliable measure of climate sensitivity. Michael Mann, a meteorology professor at Penn State who was not involved with the study, said it’s “speculative” but “plausible” that global climate models have been underestimating climate sensitivity by assuming too much cloud glaciation. “This is one of several recent studies that provide sobering evidence that earth's climate sensitivity may lie in the upper end of the current uncertainty range,” Mann said in an email. “That means that avoiding dangerous 2°C warming might be an even greater challenge.” The new findings underscore the urgency of taking steps to slash rates of greenhouse gas pollution, Mann said. Carbon dioxide levels have risen more than 40 percent to 400 parts per million since before the Industrial Revolution, and they continue to rise at a hastening pace. The increase in carbon dioxide levels recorded so far has played the most important role in pushing average global temperatures up by 1°C (1.8°F) during the last 200 years. That has worsened heat waves, floods and droughts, leading to record-breaking temperatures in 2014 and then again in 2015. A U.N. pact negotiated in Paris in December set a goal for limiting warming to well below 2°C. Plans by the world’s biggest polluters to protect the climate, including China, the U.S. and Europe, however, so far fall well short of the measures needed to achieve that goal.







Published on April 10, 2016 13:00
The bizarro “Inherit the Wind”: Understanding evangelical Christianity’s persecution complex through “God’s Not Dead 2”
In a pivotal scene from the famous 1960 film "Inherit the Wind," a biblical scholar, prosecuting a defendant on trial for teaching evolution in a town whose laws forbid it, is called to the stand as an expert witness. Slowly but surely, he begins to unravel on the stand. The defense attorney, Henry Drummond (rendered vividly by Spencer Tracy), pulls apart his literal reading of the Bible. If Joshua had really made the Sun stand still, wouldn't the Earth have been destroyed? Where did Cain's wife come from if “in the beginning” there were only Cain, Abel, Adam and Eve? How can we be sure the Earth was created in 4004 B.C. if the Sun, the metric by which we measure time, was not created until the fourth day? "God's Not Dead 2," the sequel to the commercially successful movie of the same name, is an inversion of this theme. In the film, Grace, a history teacher played by Melissa Joan Hart, is asked whether the nonviolent philosophy preached by Mohandas Gandhi and Martin Luther King, Jr. has parallels to that preached by Jesus in the Bible. In response, she quotes scripture, and endorses the analogy. A scoffing student ridicules her by sneering, I kid you not, that Jesus could not have been great because he died. Grace responds that Jesus, like King, died out of dedication to causes larger than himself, and that this does not detract from the greatness of either man. Teachers, administrators, and the ACLU alike are outraged by this lesson, and Grace winds up in court, where her lawyer finds himself proving, as one of the satanic ACLU attorneys puts it, “the existence of Jesus Christ.” It's difficult to overstate how deeply unrealistic the film's premise is, and important to stress that this case was not “based on a true story,” itself a loose designation. Nor was it a dramatized version of real events as "Inherit the Wind," based on the 1925 Scopes Monkey Trial, was.
This shouldn't be surprising to anyone who saw the film with a vaguely critical eye, but should be surprising to anyone who took its message to heart. The movie suggests the persecution of Christians in our society is readily apparent in the real world, and not just as artistic license. (“Join the movement,” the closing credits implore). Then why on earth would its writers and producers have to invent such a case out of thin air, rather than portraying one of the multitudes of victimless crimes for which Christians throughout the country are presumably being prosecuted? Perhaps because employees demanding contraceptive coverage or gay couples service might be more sympathetic than fiendish ACLU lawyers? Or perhaps because no such case exists? In my personal experience at a diverse public school in the notoriously blue state of New Jersey, my teachers frequently discussed Christianity from a historical and political perspective as well as a literary one. (In a heavily South Asian school district, Hinduism was discussed only during a middle school unit on world religions, during which, for the first time in my 13 years as a practicing Hindu, I had to memorize the names and descriptions of the various castes.) The only complaint I ever heard came from a Catholic student who disapproved of our English teacher's reference to Christian “mythology.” But my dominant education in various religious traditions occurred outside the classroom. I enjoyed hiding the afikomen during a friend's Passover seder, much as I enjoyed breaking fast with Muslim friends during Ramadan. My friends, similarly, have relished our discussions about the finer points of Dawkins, visits to Hindu temples, or thumbing through the Ramayana. "God's Not Dead 2" offers no similar model for plurality. The subtler points of the movie, few and far between, were well taken. 
It is clear, though never articulated, that the Christian teacher and her Christian grandfather are of more modest means than the parents who lead the charge against her. The very fact that religion is often used as a proxy for class, race, or nationality is what distinguishes it from people's ideas and beliefs in other areas of life. It's what makes religious prejudice uniquely problematic. But the movie isn't a denunciation of that prejudice. We're instead led to believe that people are either Christian or unabashedly hateful atheists who boast of their ignorance of both history and the scripture they denounce. When Grace's student Brooke approaches her at a local café, she is struggling with the death of her brother. Grace suggests she might find solace in Jesus, which inspires her to raise his teachings in class. Conveniently, though Grace has no way of knowing this, Brooke's brother was also a faithful Christian who hid his beliefs from his family for fear of judgment. Even more conveniently, though Grace also has no way of knowing this, Brooke finds her parents' atheism deeply unfulfilling, and her brother's old Bible far more so. But what if Brooke's brother hadn't died at all? What if she were struggling with whether to come out to her parents or how to cope with her out-of-wedlock pregnancy or how to leave a church rather than join one? What if Brooke had dared instead to find comfort or solace in her agnosticism? What if she were a devout Jew or Muslim or Hindu? Wouldn't Grace's suggestion that conversion would help her through the unexpected loss of a family member be insensitive, tone-deaf and horrifically unhelpful? The only mention the movie gives us of other faiths is when that fiendish ACLU attorney draws a comparison to Islam when addressing the jury. Had Grace instead quoted the Quran, he absurdly suggests, her familiarity with religious texts would have amounted to an endorsement thereof.
I am genuinely curious as to how the filmmakers would feel about a public school teacher offering a lesson in Islam, a religion that also recognizes Jesus as a prophet, rather than the Bible. What if, during a discussion on nonviolence in which she explicitly mentions Gandhi's influence on King, she had instead delved into the intricacies of Hinduism or Jainism? Much like King was guided by his faith, a fact the movie takes special pains to note, Gandhi spoke and wrote publicly about the huge impact both traditions had on the nonviolent philosophy later adopted by King, a philosophy that was, supposedly, the subject of Grace's lesson. What if, when referring to these traditions, Grace presented their teachings as historical truths rather than alternate philosophies? What if Grace were teaching Darwin in Louisiana? The movie sidesteps all of these questions by validating the teacher's subtle assertion that what is presented in the Bible is incontrovertible fact. Experts are called to the stand to speak to the veracity of the text. One, whose cameo conveniently doubles as book promotion, cites the scholar Gerd Lüdemann. He mentions that Lüdemann is an atheist, but neglects to mention that his declaration of atheism was made in response to his own studies of the historicity of the New Testament, which he found wanting, and was met with calls for his dismissal that cost him his title at the University of Göttingen. Another expert who's analyzed murder cases argues that the testimonies present in the Bible are realistically rendered, proving nothing other than that it was well-written. But the jury and, it turns out, one of the prosecuting attorneys, are convinced. In the midst of a presidential election that has, at various moments, seen leading candidates calling for surveillance, databases, and bans on the entry of a particular, non-Christian religious group, what are we to make of this film?
Given its uninspired box office performance and abysmal critical ratings, it promises to do its most damage to its target audience, to whom it offers nothing other than isolation and paranoia, and the decidedly un-Christian demonization of people who happen have a different vision of their children's taxpayer-funded educations. Had that iconic scene from "Inherit the Wind" been that film's conclusion, it could easily have been construed as presenting an irresolvable conflict between science and religion. Instead, the closing scene reveals that Henry is not a non-believer. His witness's sin, he says, was that he “looked for God too high up and too far away.” Henry is not anti-religion, but anti-fundamentalism. A reading of scripture grounded in facts and figures, rather, is a deeply petty one, unworthy of the transience offered by religious belief. Historical veracity is antithetical to the very premise of faith, powerful precisely because it needn't be true to be real.In a pivotal scene from the famous 1960 film "Inherit the Wind," a biblical scholar, prosecuting a defendant on trial for teaching evolution in a town whose laws forbid it, is called to the stand as an expert witness. Slowly but surely, he begins to unravel on the stand. The defense attorney, Henry Drummond (rendered vividly by Spencer Tracy), pulls apart his literal reading of the Bible. If Joshua had really made the Sun stand still, wouldn't the Earth have been destroyed? Where did Cain's wife come from if “in the beginning” there were only Cain, Abel, Adam and Eve? How can we be sure the Earth was created in 4004 B.C. if the Sun, the metric by which we measure time, was not created until the fourth day? "God's Not Dead 2," the sequel to the commercially successful movie of the same name, is an inversion of this theme. In the film, Grace, a history teacher played by Melissa Joan Hart, is asked whether the nonviolent philosophy preached by Mohandas Gandhi and Martin Luther King, Jr. 
has parallels to that preached by Jesus in the Bible. In response, she quotes scripture, and endorses the analogy. A scoffing student ridicules her by sneering, I kid you not, that Jesus could not have been great because he died. Grace responds that Jesus, like King, died out of dedication to causes larger than himself, and that this does not detract from the greatness of either man. Teachers, administrators, and the ACLU alike are outraged by this lesson, and Grace winds up in court, where her lawyer finds himself proving, as one of the satanic ACLU attorneys puts it, “the existence of Jesus Christ.” It's impossible to stress how deeply unrealistic the film's premise is, and important to stress that this case was not “based on a true story,” itself a loose specification. Nor was it a dramatized version of real events as "Inherit the Wind," based on the 1925 Scopes Monkey Trial, was. This shouldn't be surprising to anyone who saw the film with a vaguely critical eye, but should be surprising to anyone who took its message to heart. The movie suggests the persecution of Christians in our society is readily apparent in the real world, and not just as artistic license. (“Join the movement,” the closing credits implore). Then why on earth would its writers and producers have to invent such a case out of thin air, rather than portraying one of the multitudes of victimless crimes for which Christians throughout the country are presumably being prosecuted? Perhaps because employees demanding contraceptive coverage or gay couples service might be more sympathetic than fiendish ACLU lawyers? Or perhaps because no such case exists? In my personal experience at a diverse public school in the notoriously blue state of New Jersey, my teachers frequently discussed Christianity from a historical and political perspective as well as a literary one. 
(In a heavily South Asian school district, Hinduism was discussed only during a middle school unit on world religions, during which, for the first time in my 13 years as a practicing Hindu, I had to memorize the names and descriptions of the various castes.) The only complaint I ever heard came from a Catholic student who disapproved of our English teacher's reference to Christian “mythology.” But my dominant education in various religious traditions occurred outside the classroom. I enjoyed hiding the afikomen during a friend's Passover seder, much as I enjoyed breaking fast with Muslim friends during Ramadan. My friends, similarly, have relished our discussions about the finer points of Dawkins, visits to Hindu temples, or thumbing through the Ramayana. "God's Not Dead 2" offers no similar model for plurality. The subtler points of the movie, few and far between, were well taken. It is clear, though never articulated, that the Christian teacher and her Christian grandfather are of more modest means than the parents who lead the charge against her. The very fact that religion is often used as a proxy for class, race, or nationality is what distinguishes it from people's ideas and beliefs in other areas of life. It's what makes religious prejudice uniquely problematic. But the movie isn't a denunciation of that prejudice. We're instead led to believe that people are either Christian or unabashedly hateful atheists who boast their ignorance of both history and the scripture they denounce. When Grace's student Brooke approaches her at a local café, she is struggling with the death of her brother. Grace suggests she might find solace in Jesus, which inspires her to raise his teachings in class. Conveniently, though Grace has no way of knowing this, Brooke's brother was also a faithful Christian who hid his beliefs from his family for fear of judgment. 
Even more conveniently, though Grace also has no way of knowing this, she finds her parents' atheism deeply unfulfilling, and her brother's old Bible far more so. But what if Brooke's brother hadn't died at all? What if she were struggling with whether to come out to her parents or how to cope with her out-of-wedlock pregnancy or how to leave a church rather than join one? What if Brooke had dared instead to find comfort or solace in her agnosticism? What if she were a devout Jewish or Muslim or Hindu? Wouldn't Grace's suggestion that conversion would help her through the unexpected loss of a family member be insensitive, tone-deaf and horrifically unhelpful? The only mention the movie gives us of other faiths is when that fiendish ACLU attorney draws a comparison to Islam when addressing the jury. Had Grace instead quoted the Quran, he absurdly suggests, her familiarity with religious texts would have amounted to an endorsement thereof. I am genuinely curious as to how the filmmakers would feel about a public school teacher offering a lesson in Islam, a religion that also recognizes Jesus as a prophet, rather than the Bible. What if, during a discussion on nonviolence in which she explicitly mentions Gandhi's influence on King, she had instead delved into the intricacies of Hinduism or Jainism. Much like King was guided by his faith, a fact the movie takes special pains to note, Gandhi spoke and wrote publicly about the huge impact both traditions had on the nonviolent philosophy later adopted by King, a philosophy that was, supposedly, the subject of Grace's lesson. What if, when referring to these traditions, Grace presented its teachings as historical truths rather than alternate philosophies? What if Grace were teaching Darwin in Louisiana? The movie sidesteps all of these questions by validating the teacher's subtle assertion that what is presented in the Bible is incontrovertible fact. Experts are called to the stand to speak to the veracity of the text. 
One, whose cameo conveniently doubles as book promotion, cites the scholar Gerd Lüdemann. He mentions that Lüdemann is an atheist, but neglects to mention that his declaration of atheism was made in response to his own studies of the historicity of the New Testament, which he found wanting, and was met with calls for his dismissal that cost him his title at the University of Göttingen. Another expert, who has analyzed murder cases, argues that the testimonies present in the Bible are realistically rendered, proving nothing other than that the text is well written. But the jury and, it turns out, one of the prosecuting attorneys, are convinced. In the midst of a presidential election that has, at various moments, seen leading candidates calling for surveillance, databases, and bans on the entry of a particular, non-Christian religious group, what are we to make of this film? Given its uninspired box office performance and abysmal critical ratings, it promises to do its most damage to its target audience, to whom it offers nothing other than isolation and paranoia, and the decidedly un-Christian demonization of people who happen to have a different vision of their children's taxpayer-funded educations. Had that iconic scene from "Inherit the Wind" been that film's conclusion, it could easily have been construed as presenting an irresolvable conflict between science and religion. Instead, the closing scene reveals that Henry is not a non-believer. His witness's sin, he says, was that he “looked for God too high up and too far away.” Henry is not anti-religion, but anti-fundamentalism. A reading of scripture grounded in facts and figures is, rather, a deeply petty one, unworthy of the transcendence offered by religious belief. Historical veracity is antithetical to the very premise of faith, which is powerful precisely because it needn't be true to be real.







Published on April 10, 2016 12:30
Bill Clinton has lost his superpower: Why his confrontation with BLM was such a stunning mistake
Bill Clinton has long had the nickname “Big Dog,” which seems appropriate in the wake of his confrontation with a couple of Black Lives Matter protesters in Philadelphia on Thursday. Watching a brief video of the encounter, I had to suppress the urge to yell, “No, Bill Clinton! No! That’s a bad boy! That’s a very bad Bill Clinton!” The BLM activists had shown up to a rally in an African-American neighborhood in Philadelphia to heckle Clinton over the crime bill he signed while president in 1994. The bill has been blamed for many of the problems facing America’s carceral state today: overcrowded prisons, mass incarceration of a disproportionate number of African-American males, and overly harsh sentencing for relatively minor offenses, among other issues. Partly because these consequences have become so pervasive and destructive, and partly because the Democratic Party has moved to the left in the last 22 years, positions on the crime bill have become a litmus test for 2016’s Democratic candidates. Better-informed people than I can argue from now until the trumpets sound about who between Bernie Sanders and Hillary Clinton was “right” and who was “wrong” about the bill in 1994, but there is no doubt that criminal-justice reform is rightfully a major issue in this primary and should remain one in the general campaign. Which is why it was so sad to watch Bill Clinton’s reaction on Thursday. Talk about someone whose time has passed him by! It was a bit like being a longtime Lakers fan watching Kobe Bryant’s last season. His knees are shot, he spends all his bench time with ice packs and bandages wrapped around him like a partially finished mummy, and yet he’s still going out there and jacking up terrible shots. And fewer of them are finding the net than earlier in his career. In the case of Clinton, he seems to have missed the point of Black Lives Matter so badly it was like seeing Kobe airball a layup. 
One aspect of campaigning where Clinton always excelled, where he was in fact the most talented politician on the national scene for so long, was in his ability to spin a narrative, to frame it and shape it and make his listeners believe in its truth. He was doing this as recently as the 2012 Democratic convention, when he made a meal of Mitt Romney and Paul Ryan with the same enthusiasm and thoroughness with which he used to make a meal out of McDonald’s entire menu. But it makes no difference if his defense of the crime bill, and his role in shaping and signing it, was accurate. Or if he was overstating or understating the support the bill had in the African-American community in 1994. Like the original debate itself, the truth is a lot more complicated and could benefit from some context that our current political dialogue, conducted as it is in soundbites and 140 characters at a time on Twitter, does not allow for. Clinton needed to at least show a little bit of awareness about the contentiousness of this debate in the Democratic Party base. He certainly knows about it. His wife kicked off her campaign last year with a major speech on criminal justice reform and has apologized for her invocation in the '90s of the now-discredited “super-predators” theory (which she seems to have used only once publicly, still one more time than she should have). And her husband went in front of the NAACP a few months back to apologize for the consequences of the crime bill, and promised to work to fix its more egregious effects. Instead, in Philadelphia he was biting and defensive, which had the Washington Post calling the exchange “2016’s Sister Souljah moment.” Which, holy dear God, if you remember the original Sister Souljah moment in 1992, is one of the last things about Bill’s presidency you want to bring up in a conversation that has race as such a huge component. 
It was enough of an insulting and gross pander to white voters 24 years ago, even if you find it understandable in the context of that era’s racial and criminal politics, and this is a different electorate and a very different Democratic primary. Bill Clinton, who used to be so great at reading the moods of the people and telling them exactly what they wanted to hear, should have known better. But, the cheers of the loyalists in the Philadelphia crowd notwithstanding, he apparently didn’t. I’ll give Clinton half-credit for calming down by Friday and speaking a little more reasonably about the issue. (He only gets half-credit because, in typical Clinton fashion, he would only say that he “almost wanted to apologize,” then seemed to turn it into a plea for Democrats to fight Republicans, not each other.) After Thursday, political writers were saying that since he is so apt to give public performances that lead to headaches for his wife, her campaign should sideline him for the duration. This is probably a wise move, but it only hides the problem. It does not solve it. What happens if Hillary wins the presidency, making Bill the First Gentleman, or whatever we wind up calling him? How much time will a Hillary Clinton White House spend cleaning up because the Big Dog keeps piddling on the carpet? Here is what Bill Clinton can do, should he find himself once again residing in the White House next January, with an office in the East Wing this time. First Ladies have their causes, for which they agitate and plan and lobby Capitol Hill. Think Michelle Obama and the obesity epidemic, Laura Bush and literacy, or Hillary Clinton and healthcare reform. What Bill, as the nation’s first First Gentleman, can do is make criminal justice reform his pet issue. He can tour the country talking to reform advocates and BLM activists, he can chair task forces to come up with ways of unwinding some of the damage that the ’94 bill did. 
It’s a huge issue, encompassing elements of our ridiculous drug war and the fight against poverty, among others. The whole project could keep him busy for his wife’s entire term. Will he do that? Will he embark on a project that would require him to renounce part of his own presidency? That would require him to sit and listen, without getting angry and defensive, to people whose lives have been terribly impacted and even destroyed by the reforms of the 1990s? Can he recommit himself to a role he seemed to be playing a bit during the George W. Bush era – that of a beloved and wise elder statesman? Promising such a commitment might be a good way for him to help his wife’s campaign now, rather than muzzling himself and going off to brood in Chappaqua. Confront the problem that is Bill Clinton – with all the contradictions and frustrations of his always-enormous personality – and put it to work. It won’t change the minds of the Clintons’ most fervent opponents, who will see it as a cynical ploy for votes that a Hillary Clinton administration will never follow through on. But who knows? It would not be the first time politicians did the right thing for wrong or self-interested reasons.







Published on April 10, 2016 11:00
Technology f*cked us all: The anxiety driving Donald Trump and Bernie Sanders is really about machines taking our jobs
Peel away the billionaire braggadocio, set aside the insults and bile, and Donald Trump is running for president on a promise to build a protective wall around America’s white working class. Most famously he is promising to build a physical wall on the Mexican border – which he will somehow persuade the Mexicans to pay for – to protect American workers from immigrants bent on stealing their jobs. But he is also promising to build trade walls by renegotiating free trade agreements so workers in China and elsewhere can’t undercut American workers. And he is promising to wall us off from the world diplomatically so American blood and treasure will no longer be spent defending our allies in Asia and Europe. This is the core message Trump hopes to ride to the Oval Office: If we build a big, beautiful wall around Fortress America, we can sustain ourselves without the help of the rest of the world. Let us ignore for a moment all the practical reasons why this makes no sense. Forget that, whoever ends up paying for it, no wall can keep out desperate, hungry people who already tunnel miles underground and cross searing deserts for a better life in America. Forget, too, that with more than 11 million illegal aliens living in this country, some of them for decades, we are never going to be able to send them all home. And forget that free trade deals, while they do send many jobs overseas, also benefit Americans by reducing the cost of goods we all buy. Forget all that. Trump is wrong because, while he, along with Democratic firebrand Bernie Sanders, has tapped into the deep pain being felt by America’s working class, they are misdiagnosing its root cause. Companies have always cut costs by shopping around for the cheapest workers. Bankers and corporate executives have always taken their profits first and absorbed their losses only after everyone else has suffered. Politicians have always sided, often corruptly, with whoever holds the most money and power. 
What is different today, and what is ailing America’s working and lower-middle classes, is the way machines have fundamentally reshaped the work undereducated workers do. Advances in technology are gradually draining the skill and human judgment from every kind of working-class labor, from weaving a bolt of cloth to frying a hamburger, giving companies free rein to transfer once reasonably well-paid skilled and semi-skilled American jobs to unskilled foreign workers, both here and abroad. At the same time, advances in communications and transportation have enabled companies to build truly global supply chains, opening the way for them to shop the world for the cheapest and least powerful workers to do the job. Together these forces, the de-skilling of human labor and the shrinking of the world of commerce, are helping corporations grow ever larger and more powerful and shifting ever more earning power from the assembly line to the executive suite. The more money and power the elite amasses, the more it can game the system to amass still more, until you’re left with a situation like ours, where a typical CEO makes 330 times more than the average worker and the top 1 percent of Americans hold 40 percent of the country’s wealth. The growing gap between rich and poor in America is morally appalling and politically unsustainable, but many of the forces driving the disparities are systemic and irreversible. No wall we can build will stop the seepage of jobs overseas in a fully globalized and technologically mobile world, and a shift to a democratic socialist model that worked for highly homogenous northern European countries in the middle part of the last century is unlikely to work in our own fractious, racially diverse country facing 21st century problems. Whether we like it or not, we are stuck with the world we have made, along with the deep problems it is causing us, and we have no choice but to innovate our way out of it. 
The thing is, we did it once before. In the 1930s, when mechanization finally pushed America’s small farmers and sharecroppers out of the fields and into factories, we saw a profound shift in the role of government in people’s lives. When poor people could no longer feed themselves from their own plot of land, however meager, governments had to step in to find new ways to feed the hungry and care for the sick and elderly. The Soviet Union responded to this crisis with Communism. Central Europe responded with Nazism. America responded with the New Deal. Then, 10 years later, we had a war to see which system worked best. America won that battle, and in the process set off the longest and most prosperous economic boom in our history. But now that boom has run its course, and we must reinvent ourselves once again, this time for a very different world. As was the case in the 1930s, when the developed world at last felt the consequences of a century-long shift from an agrarian society to an industrial one, our present challenges have a long history. As early as the 1950s and ’60s, global markets were pushing good union jobs overseas and technological advances were turning skilled jobs into less lucrative semi-skilled ones. But today, when robots build cars and the digital revolution is tossing entire job categories – office assistant, taxi driver, newspaper reporter – into the trash heap, the sheer rate of change is starting to have political ramifications. So far, though, America’s political class, including the current crop of presidential contenders, hasn’t begun to grapple in a serious way with how machines are changing the nature of work, especially at the economic margins. In part, this is because American political elites are themselves creatures of the knowledge economy and don’t feel the pain of the shift from an industrial economy in the personal and direct ways so many of their less educated constituents do. 
Then, too, there is the problem of mounting a coherent campaign against technological progress. People love their gadgets, which bring them things they want cheaply and instantaneously. Digital communication may be picking clean the lower rungs of clerical workers, from receptionists to mailroom employees to inventory clerks, but try coming up with a political slogan denouncing email. More troublingly, forces like technology and globalization are faceless abstractions, and in the heat of a campaign, efforts to combat their ill effects often devolve into scapegoating and demagoguery, as this year’s presidential contest has shown us all too well. A crowd that might sit on its hands for a lecture on the changing nature of work in a globalized, digitally connected world will cheer lustily at crude denunciations of Mexican immigrants or crafty Chinese businessmen stealing American jobs. This fury, while it turns out voters, can also send us tilting at windmills, chasing solutions that are wildly impractical, morally objectionable, or just plain counterproductive. Still, as a society, we must learn to rage against the machines, not in the destructive sense of the industrial-age saboteurs who threw their wooden shoes into their machines to break them, but thoughtfully and creatively. In every past burst of technological progress, the jobs that were rendered obsolete by the new machines were eventually replaced by new kinds of work made possible by the new machines. But that doesn’t simply happen by natural law. We have to make it happen, the way Franklin Roosevelt did when he started experimenting wildly with the levers of government to stop capitalism from devouring itself. And we have to start making it happen now. The longer we allow machines to shape our work lives unchecked, the more we will see wealth concentrate in the hands of those who design and finance those machines. 
Already, technology and the globalizing effects technology makes possible have hollowed out American factories and sucked the economic life from a few select knowledge industries like journalism, and now even workers in solidly upper-middle-class professions like law and medicine are starting to see their work de-skilled and outsourced to other countries. Man vs. machine: It’s the defining issue of our age.







Published on April 10, 2016 09:00
Neoliberalism vs. New Deal: Bernie, Hillary and what’s really at stake in this primary









Published on April 10, 2016 08:00
Malcolm Gladwell got us wrong: Our research was key to the 10,000-hour rule, but here’s what got oversimplified
In 1993 one of us (Anders Ericsson) published the results of a study on a group of violin students in a music academy in Berlin that found that the most accomplished of those students had put in an average of ten thousand hours of practice by the time they were twenty years old. That paper, written with co-authors Ralf Krampe and Clemens Tesch-Römer, would go on to become a major part of the scientific literature on expert performers, but it was not until 2008, with the publication of Malcolm Gladwell’s "Outliers," that the paper’s results attracted much attention from outside the scientific community. In his discussion of what it takes to become a top performer in a given field, Gladwell offered a catchy phrase: “the ten-thousand-hour rule.” According to this rule, it takes ten thousand hours of practice to become a master in most fields. As evidence, Gladwell pointed to our results on the student violinists, and, in addition, he estimated that the Beatles had put in about ten thousand hours of practice while playing in Hamburg in the early 1960s and that Bill Gates put in roughly ten thousand hours of programming to develop his skills to a degree that allowed him to found and develop Microsoft. In general, Gladwell suggested, the same thing is true in essentially every field of human endeavor — people don’t become expert at something until they’ve put in about ten thousand hours of practice. The rule is irresistibly appealing. It’s easy to remember, for one thing. It would’ve been far less effective if those violinists had put in, say, eleven thousand hours of practice by the time they were twenty. And it satisfies the human desire to discover a simple cause-and-effect relationship: just put in ten thousand hours of practice at anything, and you will become a master. Unfortunately, this rule — which is the only thing that many people today know about the effects of practice — is wrong in several ways. 
(It is also correct in one important way, which we will get to shortly.) First, there is nothing special or magical about ten thousand hours. Gladwell could just as easily have mentioned the average amount of time the best violin students had practiced by the time they were eighteen — approximately seventy-four hundred hours — but he chose to refer to the total practice time they had accumulated by the time they were twenty, because it was a nice round number. And, either way, at eighteen or twenty, these students were nowhere near masters of the violin. They were very good, promising students who were likely headed to the top of their field, but they still had a long way to go at the time of the study. Pianists who win international piano competitions tend to do so when they’re around thirty years old, and thus they’ve probably put in about twenty thousand to twenty-five thousand hours of practice by then; ten thousand hours is only halfway down that path. And the number varies from field to field. Steve Faloon, the subject of an early experiment on improving memory, became better at memorizing strings of digits than any other person in history after only about two hundred hours of practice. Now, thirty years later, with improved training techniques, the world’s best digit memorizers can recall strings of digits that are several times longer than Steve Faloon’s best. We don’t know exactly how many hours of practice these top performers have put in, but it is likely well under ten thousand. Second, the number of ten thousand hours at age twenty for the best violinists was only an average. Half of the ten violinists in that group hadn’t actually accumulated ten thousand hours at that age. Gladwell misunderstood this fact and incorrectly claimed that all the violinists in that group had accumulated over ten thousand hours. 
Third, Gladwell didn’t distinguish between the type of practice that the musicians in our study did — a very specific sort of practice referred to as “deliberate practice” which involves constantly pushing oneself beyond one’s comfort zone, following training activities designed by an expert to develop specific abilities, and using feedback to identify weaknesses and work on them — and any sort of activity that might be labeled “practice.” For example, one of Gladwell’s key examples of the ten-thousand-hour rule was the Beatles’ exhausting schedule of performances in Hamburg between 1960 and 1964. According to Gladwell, they played some twelve hundred times, each performance lasting as much as eight hours, which would have summed up to nearly ten thousand hours. "Tune In," an exhaustive 2013 biography of the Beatles by Mark Lewisohn, calls this estimate into question and, after an extensive analysis, suggests that a more accurate total number is about eleven hundred hours of playing. So the Beatles became worldwide successes with far less than ten thousand hours of practice. More importantly, however, performing isn’t the same thing as practice. Yes, the Beatles almost certainly improved as a band after their many hours of playing in Hamburg, particularly because they tended to play the same songs night after night, which gave them the opportunity to get feedback — both from the crowd and themselves — on their performance and find ways to improve it. But an hour of playing in front of a crowd, where the focus is on delivering the best possible performance at the time, is not the same as an hour of focused, goal-driven practice that is designed to address certain weaknesses and make certain improvements — the sort of practice that was the key factor in explaining the abilities of the Berlin student violinists. 
A closely related issue is that, as Lewisohn argues, the success of the Beatles was not due to how well they performed other people’s music but rather to their songwriting and creation of their own new music. Thus, if we are to explain the Beatles’ success in terms of practice, we need to identify the activities that allowed John Lennon and Paul McCartney—the group’s two primary songwriters—to develop and improve their skill at writing songs. All of the hours that the Beatles spent playing concerts in Hamburg would have done little, if anything, to help Lennon and McCartney become better songwriters, so we need to look elsewhere to explain the Beatles’ success. This distinction between deliberate practice aimed at a particular goal and generic practice is crucial because not every type of practice leads to the improved ability that we saw in the music students or the ballet dancers. Generally speaking, deliberate practice and related types of practice that are designed to achieve a certain goal consist of individualized training activities — usually done alone — that are devised specifically to improve particular aspects of performance. The final problem with the ten-thousand-hour rule is that, although Gladwell himself didn’t say this, many people have interpreted it as a promise that almost anyone can become an expert in a given field by putting in ten thousand hours of practice. But nothing in the study of the Berlin violinists implied this. To show a result like this, it would have been necessary to put a collection of randomly chosen people through ten thousand hours of deliberate practice on the violin and then see how they turned out. 
All that the Berlin study had shown was that among the students who had become good enough to be admitted to the Berlin music academy, the best students had put in, on average, significantly more hours of solitary practice than the better students, and the better and best students had put in more solitary practice than the music-education students. The question of whether anyone can become an expert performer in a given field by taking part in enough designed practice is still open, and we offer some thoughts on that issue elsewhere. But there was nothing in the original study to suggest that it was so. Gladwell did get one thing right, and it is worth repeating because it’s crucial: becoming accomplished in any field in which there is a well-established history of people working to become experts requires a tremendous amount of effort exerted over many years. It may not require exactly ten thousand hours, but it will take a lot. Research has shown this to be true in field after field. It generally takes about ten years of intense study to become a chess grandmaster. Authors and poets have usually been writing for more than a decade before they produce their best work, and it is generally a decade or more between a scientist’s first publication and his or her most important publication — and this is in addition to the years of study before that first published research. A study of musical composers by the psychologist John R. Hayes found that it takes an average of twenty years from the time a person starts studying music until he or she composes a truly excellent piece of music, and it is generally never less than ten years. Gladwell’s ten-thousand-hour rule captures this fundamental truth — that in many areas of human endeavor it takes many, many years of practice to become one of the best in the world — in a forceful, memorable way, and that’s a good thing. 
On the other hand, emphasizing what it takes to become one of the best in the world in such competitive fields as music, chess, or academic research leads us to overlook what we believe to be the more important lesson from the study of the violin students. When someone says that it takes ten thousand — or however many — hours to become really good at something, it puts the focus on the daunting nature of the task. While some may take this as a challenge — as if to say, “All I have to do is spend ten thousand hours working on this, and I’ll be one of the best in the world!”—many will see it as a stop sign: “Why should I even try if it’s going to take me ten thousand hours to get really good?” As Dogbert observed in one "Dilbert" comic strip, “I would think a willingness to practice the same thing for ten thousand hours is a mental disorder.” But we see the core message as something else altogether: In pretty much any area of human endeavor, people have a tremendous capacity to improve their performance, as long as they train in the right way. If you practice something for a few hundred hours, you will almost certainly see great improvement — it took Steve Faloon only a couple of hundred hours of practice to become the best ever at memorizing strings of digits — but you have only scratched the surface. You can keep going and going and going, getting better and better and better. How much you improve is up to you. This puts the ten-thousand-hour rule in a completely different light: The reason that you must put in ten thousand or more hours of practice to become one of the world’s best violinists or chess players or golfers is that the people you are being compared to or competing with have themselves put in ten thousand or more hours of practice. There is no point at which performance maxes out and additional practice does not lead to further improvement. 
So, yes, if you wish to become one of the best in the world in one of these highly competitive fields, you will need to put in thousands and thousands of hours of hard, focused work just to have a chance of equaling all of those others who have chosen to put in the same sort of work. One way to think about this is simply as a reflection of the fact that, to date, we have found no limitations to the improvements that can be made with particular types of practice. As training techniques are improved and new heights of achievement are discovered, people in every area of human endeavor are constantly finding ways to get better, to raise the bar on what was thought to be possible, and there is no sign that this will stop. The horizons of human potential are expanding with each new generation. Adapted from "Peak: Secrets from the New Science of Expertise" by Anders Ericsson and Robert Pool. Published by Houghton Mifflin Harcourt. Copyright © 2016 by K. Anders Ericsson and Robert Pool. Reprinted with permission of the publisher. All rights reserved.







Published on April 10, 2016 07:30
April 9, 2016
Gen X and the big self-help lie: In furious middle age, a “Life Reimagined” feels impossibly out of reach
It is 2016, and all reports about the state of things are rosy. Scientists tell us that the midlife crisis doesn’t really exist. The New York Times runs an unusually optimistic article proclaiming that — brace yourself — the economy is chugging along beautifully. By all accounts, we should all be as happy as Snow White swept up in that whirlwind of chirping birds and spring cleaning. But you don’t have to look far to see that people in middle age are, actually, historically angry, more like “Falling Down”’s William Foster in a rage-filled rampage. Just think about the Sanders-Trump upsurge — one thing that unifies both teams is anger, even fury. It’s not all that rare to hear a particularly angry middle-aged grouser hope that Trump wins in order to see the whole system exposed and watch it crumble. That’s not just anger, that’s anger to the point of extreme nihilism. This isn’t all racially driven — wages are flat, the safety net has frayed, and the one percent is still making out like bandits. Banks that were too big to fail are even bigger. Barbara Bradley Hagerty, a former NPR correspondent and the author of the new book “Life Reimagined: The Science, Art, and Opportunity of Midlife,” floats into this tumultuous period like Pollyanna. Part memoir and part scientific exploration, the book takes a hard look at what keeps people ticking when things look bleak. In the wake of the trials and traumas of middle age, Bradley Hagerty arrives with a hopeful message: You can think and act your way out of the soul-crushing blandness of the middle years, and all you need is a bit of imagination. There’s no question that some of what she writes is true: A positive attitude does help people cope with crisis. Reorienting your sense of self around serving others — which some of her case studies have done — is good for everyone involved. Certainly, she’s not imagining the fact that some Americans have pivoted and moved into a productive period. And Bradley Hagerty writes well and, as anyone who’s heard her NPR reports knows, tells stories engagingly. 
But it’s a message that can feel like a slap in the face for some Generation Xers, many of whom, post-recession, have found themselves in competition with Millennials for low wages and nonexistent job security. Many of us were just hitting what should have been the most productive part of our careers, and the most stable, even boring, time in our lives. Even the cover, chirpy bright yellow dotted with scientific-looking sprinkles, feels like a taunt to many of us still reeling. How can this cheerful book even exist? Well, for one, Bradley Hagerty herself identifies as a Baby Boomer (though she’s at the tail end of the boom). She grew up during the years of plenty and had nearly two decades at one employer — experiences an Xer can only envy. Honestly, after reading David Brooks’ cloying take on it in the New York Times, I was prepared to dislike this book. At the beginning, it was pretty easy. There are the peppy replies to Bradley Hagerty’s queries on NPR’s Facebook page, like the one from a laid-off professional: “I have used up most of my savings. I am not sure where my next job will be. I am currently working a retail job and just wrote a TV pilot. I am having the time of my life.” Much of Bradley Hagerty’s book is a spin through recent studies in midlife and happiness psychology, a field that has now turned its back on moment-by-moment fulfillment and toward meaning and purpose as the way to lead a successful life. Like Gretchen Rubin in “The Happiness Project,” the author sets out to test some of the latest prescriptions — only this time it’s for a meaningful midlife and beyond. Not surprisingly for anyone who’s been paying attention, there is a strong push toward building a strong social network, creating short- and long-term goals, and having the right mindset. 
It’s not until the chapter “When Bad Stuff Happens,” especially the section on “post-traumatic growth,” that the book surprisingly begins to speak more directly to our particularly difficult times. According to Bradley Hagerty, “PTG is a cousin to resilience, but more of a thug: meaner, more brutal, more devastating — and more transformative.” In order to move away from the trauma and toward the growth, according to two psychologists from the University of North Carolina at Charlotte, there needs to be a dramatic change in bedrock beliefs. While Bradley Hagerty’s examples are of individuals rebuilding their lives, much of the country is in a state of post-traumatic shock and needs to do something similar. Perhaps the reason people are so angry is that our bedrock beliefs have not changed even though so many of us are living very different lives than we might have imagined we would. Chucking it all to work a minimum-wage job and write a screenplay on the side, as freeing and fulfilling as that may be, is not exactly the answer for most of us who want to ensure that we’re not in some kind of bargain-basement assisted-care facility in our later years. There is a need for a reframing of our ideals — and that, I would guess, is why we are reaching for extreme answers. Bradley Hagerty delivers some excellent advice about longevity and having a purposeful life. Many of us are still in crisis, however, and only the more comfortable among us have the luxury of thinking ahead. As the author states, the earlier you start, the better, because you never know how long you have. But where we are right now feels like turmoil — and it feels very precarious, the total opposite of what we were raised to expect from middle age. Seven million homes have been foreclosed on since the Great Recession, and there were even more layoffs. Some of these people have pulled their lives together, but these kinds of setbacks are tougher to recover from than “Life Reimagined” lets on. 
And while Bradley Hagerty offers good suggestions for individuals seeking a way out, she avoids stoking anger or urging a larger movement of the dispossessed. This is where her Boomer roots become clear. She’s trying to help people, but she’s part of a generation that went from broad social movements to the gospel of individual fulfillment, and it shows in her book. Some of the people who’ve bounced back are skilled and resourceful; some were just lucky. If you’re still trying to regain your footing, these stories are not as inspirational as they are annoying. A lot of Americans long for a Trump or a Sanders. We are aching for a revolution. We are in the middle of figuring out — post-recession — what we believe. Maybe, someday, we can think more positive thoughts and start planning for that 10K run. Right now, though, we’re still traumatized. Being told we need to make more friends doesn’t really help.







Published on April 09, 2016 16:00
Thomas Frank: Democrats just aren’t that concerned about income inequality
Thomas Frank, founding editor of The Baffler and author of “What’s the Matter with Kansas?” and other books, is among the most trenchant voices on the left today. In his new book, “Listen, Liberal,” Frank sets his sights on the troubles of the modern Democratic Party, which he argues represents the interests of elites while forsaking the working-class and poor Americans it once championed. Here, Frank answers questions from The National’s Jim Swearingen.

1. The Guardian has referred to you as “the great chronicler of American paradox.” How did you get your start as a socio-political analyst?

It began in the usual way, with college newspapers and reading H. L. Mencken and things like that. Then, in 1988, my friends and I decided we needed to launch our own magazine -- we called it "The Baffler" -- and that was really where I started developing my own ideas. Reagan was in the White House, and Rambo was on our crappy little TV set, and the world needed criticism.

2. We are going through what seems to be one of the craziest presidential elections in American history. All of the rules seem to be breaking down, and the pundits cannot backtrack from their predictions quickly enough. Why are things so weird this time around?

It's still "the economy, stupid," nearly 25 years after James Carville coined the phrase. For working people, the ones our politicians used to salute as the salt of the earth, the situation never improves. Their lives are going nowhere, and sometimes are actively ruined by decisions made in distant places. But it's easy to see that for certain other classes, this is a golden age, a heaven on earth. For them, the McMansions are a-building, the artisans are crafting, the Teslas are rolling. What this has meant is a series of hard-times elections, one after another, getting more and more bitter as the situation drags on. However, talking about social class is extremely uncomfortable for American pundits, and so you have the situation you describe. Everything is a surprise. Nothing makes sense.

3. Does Bernie Sanders’ campaign bring you any sense of renewed hope in the ability—or willingness—of the Democratic Party to tackle income inequality?

Very much so. Not because anyone thinks he'd be able to put his proposals into effect right away if he became president, but because he's putting ideas on the table that more conventional Democrats abandoned many years ago. These happen to be very popular ideas, and now we're remembering why. Sanders has also shown us the weak point in the armor of the plutocracy--the way in which a traditional liberal politician can indeed compete in this age of mega-donors.

4. Does the Democratic Party have a vested interest in perpetuating income inequality? Does their welfare—no pun intended—rest on perpetuating an incendiary issue that supplies them with a righteous brand of political power/grievance?

I wouldn't put it that way. I think it's more accurate to say that, while they know inequality is bad and while it makes them sad, they aren't deeply concerned about it. And that's because, as a party, they are committed to the winners in the inequality sweepstakes: the "creative class," the innovative professionals in Silicon Valley and on Wall Street. The people who are doing really well in this new gilded age. That's simply who the Democrats are nowadays. On the other side of the coin, they are not structurally aligned with the organizations of working people any longer, and as a result they aren't terribly concerned with working people's issues.

5. Both party bases seem to be suffering from nostalgia for a better past. Is liberal nostalgia for a pro-labor, pro-poor ideology any less atavistic than conservative nostalgia for a pro-small government, pro-white era?

I grew up in an era of rampant nostalgia--the 1970s, pining for the 1950s--and one of my beliefs is that nostalgia is really no worse or even all that different than a faith in "progress." Sometimes progress doesn't happen. Sometimes nostalgia makes sense. Sometimes the past really is an improvement on the present. At a time when inequality is growing, and most people find that unpleasant, they naturally get sentimental about a time when things were less bad--or when they believe things were less bad. Nostalgia is not history, however. My job as a historian is to draw meaningful stories from that past, stories that are true, that make sense, that are useful, and that aren't simply fairy tales.

6. As the Democrats seem ready to nominate a Wall St.-friendly Hillary Clinton and Republicans to nominate an anti-free-trade Donald Trump, could we be witnessing a seismic party role reversal such as happened with Democrats shifting from a segregationist to a civil rights party? Could the GOP end up, when this upheaval is over, as the Party of the Oppressed?

It seems unlikely, but everything about Donald Trump is unlikely. Craziest of all is the idea of the white working class turning for a savior to a guy who had a TV show in which you got to watch him firing people. I will say this, however: The Republicans are going to have a hard time getting the Trump phenomenon back in its box. Regardless of what happens, four years from now you're going to have another Trump, probably one who doesn't insult and offend so many different groups. If you get a Trump minus the bigotry--a Trump minus Trump--I will be ready to start looking at the GOP again. And just think of what we're acknowledging about the Democrats! I spent the last year working on this book about how they've abandoned the working class--a highly controversial subject!--and now it's not even in question any more.

7. You’ve pretty well debunked the right and the left in your writings.
I used to be really good at building model airplanes; surely there's a future in that.Thomas Frank, founding editor of The Baffler, and author of What’s the Matter with Kansas and other books, is among the most trenchant voices on the left today. In his new book, Listen, Liberal, Frank sets his sights on the troubles with the modern Democratic Party, which he argues is representing the interests of elites, while forsaking the working-class and poor Americans it once championed. Here, Frank answers questions from The National’s Jim Swearingen. 1. The Guardian has referred to you as “the great chronicler of American paradox.” How did you get your start as a socio-political analyst? It began in the usual way, with college newspapers and reading H. L. Mencken and things like that. Then, in 1988, my friends and I decided we needed to launch our own magazine -- we called it "The Baffler" -- and that was really where I started developing my own ideas. Reagan was in the White House, and Rambo was on our crappy little TV set, and the world needed criticism. 2. We are going through what seems to be one of the craziest presidential elections in American history. All of the rules seem to be breaking down, and the pundits cannot backtrack from their predictions quickly enough. Why are things so weird this time around? It's still "the economy, stupid," nearly 25 years after James Carville coined the phrase. For working people, the ones our politicians used to salute as the salt of the earth, the situation never improves. Their lives are going nowhere, and sometimes are actively ruined by decisions made in distant places. But it's easy to see that for certain other classes, this is a golden age, a heaven on earth. For them, the McMansions are a-building, the artisans are crafting, the Teslas are rolling. What this has meant is a series of hard-times elections, one after another, getting more and more bitter as the situation drags on. 
However, talking about social class is extremely uncomfortable for American pundits, and so you have the situation you describe. Everything is a surprise. Nothing makes sense. 3. Does Bernie Sanders’ campaign bring you any sense of renewed hope in the ability—or willingness—of the Democratic Party to tackle income inequality? Very much so. Not because anyone thinks he'd be able to put his proposals into effect right away if he became president, but because he's putting ideas on the table that more conventional Democrats abandoned many years ago. These happen to be very popular ideas, and now we're remembering why. Sanders has also shown us the weak point in the armor of the plutocracy--the way in which a traditional liberal politician can indeed compete in this age of mega-donors. 4. Does the Democratic Party have a vested interest in perpetuating income inequality? Does their welfare—no pun intended—rest on perpetuating an incendiary issue that supplies them with a righteous brand of political power/grievance? I wouldn't put it that way. I think it's more accurate to say that, while they know inequality is bad and while it makes them sad, they aren't deeply concerned about it. And that's because, as a party, they are committed to the winners in the inequality sweepstakes: the "creative class," the innovative professionals in Silicon Valley and on Wall Street. The people who are doing really well in this new gilded age. That's simply who the Democrats are nowadays. On the other side of the coin, they are not structurally aligned with the organizations of working people any longer, and as a result they aren't terribly concerned with working people's issues. 5. Both party bases seem to be suffering from nostalgia for a better past. Is liberal nostalgia for a pro-labor, pro-poor ideology any less atavistic than conservative nostalgia for a pro-small government, pro-white era? 
I grew up in an era of rampant nostalgia--the 1970s, pining for the 1950s--and one of my beliefs is that nostalgia is really no worse or even all that different than a faith in "progress." Sometimes progress doesn't happen. Sometimes nostalgia makes sense. Sometimes the past really is an improvement on the present. At a time when inequality is growing, and most people find that unpleasant, they naturally get sentimental about a time when things were less bad--or when they believe things were less bad. Nostalgia is not history, however. My job as a historian is to draw meaningful stories from that past, stories that are true, that make sense, that are useful, and that aren't simply fairy tales. 6. As the Democrats seem ready to nominate a Wall St.-friendly Hillary Clinton and Republicans to nominate an anti-free-trade Donald Trump, could we be witnessing a seismic party role reversal such as happened with Democrats shifting from a segregationist to a civil rights party? Could the GOP end up, when this upheaval is over, as the Party of the Oppressed? It seems unlikely, but everything about Donald Trump is unlikely. Craziest of all is the idea of the white working class turning for a savior to a guy who had a TV show in which you got to watch him firing people. I will say this, however: The Republicans are going to have a hard time getting the Trump phenomenon back in its box. Regardless of what happens, four years from now you're going to have another Trump, probably one who doesn't insult and offend so many different groups. If you get a Trump minus the bigotry--a Trump minus Trump--I will be ready to start looking at the GOP again. And just think of what we're acknowledging about the Democrats! I spent the last year working on this book about how they've abandoned the working class--a highly controversial subject!--and now it's not even in question any more. 7. You’ve pretty well debunked the right and the left in your writings. 
Where does Thomas Frank, or any other well-meaning progressive, go from here? Maybe back to cultural criticism, which is where I started out. Maybe away from writing altogether. I used to be really good at building model airplanes; surely there's a future in that.Thomas Frank, founding editor of The Baffler, and author of What’s the Matter with Kansas and other books, is among the most trenchant voices on the left today. In his new book, Listen, Liberal, Frank sets his sights on the troubles with the modern Democratic Party, which he argues is representing the interests of elites, while forsaking the working-class and poor Americans it once championed. Here, Frank answers questions from The National’s Jim Swearingen. 1. The Guardian has referred to you as “the great chronicler of American paradox.” How did you get your start as a socio-political analyst? It began in the usual way, with college newspapers and reading H. L. Mencken and things like that. Then, in 1988, my friends and I decided we needed to launch our own magazine -- we called it "The Baffler" -- and that was really where I started developing my own ideas. Reagan was in the White House, and Rambo was on our crappy little TV set, and the world needed criticism. 2. We are going through what seems to be one of the craziest presidential elections in American history. All of the rules seem to be breaking down, and the pundits cannot backtrack from their predictions quickly enough. Why are things so weird this time around? It's still "the economy, stupid," nearly 25 years after James Carville coined the phrase. For working people, the ones our politicians used to salute as the salt of the earth, the situation never improves. Their lives are going nowhere, and sometimes are actively ruined by decisions made in distant places. But it's easy to see that for certain other classes, this is a golden age, a heaven on earth. For them, the McMansions are a-building, the artisans are crafting, the Teslas are rolling. 
What this has meant is a series of hard-times elections, one after another, getting more and more bitter as the situation drags on. However, talking about social class is extremely uncomfortable for American pundits, and so you have the situation you describe. Everything is a surprise. Nothing makes sense. 3. Does Bernie Sanders’ campaign bring you any sense of renewed hope in the ability—or willingness—of the Democratic Party to tackle income inequality? Very much so. Not because anyone thinks he'd be able to put his proposals into effect right away if he became president, but because he's putting ideas on the table that more conventional Democrats abandoned many years ago. These happen to be very popular ideas, and now we're remembering why. Sanders has also shown us the weak point in the armor of the plutocracy--the way in which a traditional liberal politician can indeed compete in this age of mega-donors. 4. Does the Democratic Party have a vested interest in perpetuating income inequality? Does their welfare—no pun intended—rest on perpetuating an incendiary issue that supplies them with a righteous brand of political power/grievance? I wouldn't put it that way. I think it's more accurate to say that, while they know inequality is bad and while it makes them sad, they aren't deeply concerned about it. And that's because, as a party, they are committed to the winners in the inequality sweepstakes: the "creative class," the innovative professionals in Silicon Valley and on Wall Street. The people who are doing really well in this new gilded age. That's simply who the Democrats are nowadays. On the other side of the coin, they are not structurally aligned with the organizations of working people any longer, and as a result they aren't terribly concerned with working people's issues. 5. Both party bases seem to be suffering from nostalgia for a better past. 
Is liberal nostalgia for a pro-labor, pro-poor ideology any less atavistic than conservative nostalgia for a pro-small-government, pro-white era? I grew up in an era of rampant nostalgia--the 1970s, pining for the 1950s--and one of my beliefs is that nostalgia is really no worse than, and not all that different from, a faith in "progress." Sometimes progress doesn't happen. Sometimes nostalgia makes sense. Sometimes the past really is an improvement on the present. At a time when inequality is growing, and most people find that unpleasant, they naturally get sentimental about a time when things were less bad--or when they believe things were less bad. Nostalgia is not history, however. My job as a historian is to draw meaningful stories from that past, stories that are true, that make sense, that are useful, and that aren't simply fairy tales. 6. As the Democrats seem ready to nominate a Wall St.-friendly Hillary Clinton and the Republicans to nominate an anti-free-trade Donald Trump, could we be witnessing a seismic party role reversal such as happened when the Democrats shifted from a segregationist party to a civil rights party? Could the GOP end up, when this upheaval is over, as the Party of the Oppressed? It seems unlikely, but everything about Donald Trump is unlikely. Craziest of all is the idea of the white working class turning for a savior to a guy who had a TV show in which you got to watch him firing people. I will say this, however: The Republicans are going to have a hard time getting the Trump phenomenon back in its box. Regardless of what happens, four years from now you're going to have another Trump, probably one who doesn't insult and offend so many different groups. If you get a Trump minus the bigotry--a Trump minus Trump--I will be ready to start looking at the GOP again. And just think of what we're acknowledging about the Democrats! 
I spent the last year working on this book about how they've abandoned the working class--a highly controversial subject!--and now it's not even in question any more. 7. You’ve pretty well debunked the right and the left in your writings. Where does Thomas Frank, or any other well-meaning progressive, go from here? Maybe back to cultural criticism, which is where I started out. Maybe away from writing altogether. I used to be really good at building model airplanes; surely there's a future in that.







Published on April 09, 2016 15:59
Valerie Plame has one regret: “I wish I had the maturity and courage to have pushed back more”
Valerie Plame Wilson was a lifelong CIA officer until her cover was spectacularly — and illegally — blown in 2003 by the Bush Administration, ending the viability of her career in intelligence. She is now an anti-nuclear-proliferation activist and espionage novelist, as well as the author of the bestseller "Fair Game," which was turned into a major motion picture. Chris Pavone, a longtime New York City book editor, never had a thing to do with the CIA until his wife got a job in Luxembourg, at which point he, too, began writing spy novels. His first two books, "The Expats" and "The Accident," were both New York Times bestsellers, and "The Travelers" has just been published. Here they discuss the challenges of writing espionage fiction and being a real spy versus a fake one. First things first, something I worry about constantly: When you read espionage novels written by amateurs like me, what mistakes do we make that drive you absolutely crazy? Chris, you're doing a hell of a good job for being an "amateur" — lots of bestsellers! But, to your question, there are no "lone wolves" in the CIA. A successful ops officer is part of a highly skilled team: analysts, targeting officers, surveillance, tech geniuses. I know having a Jason Bourne all alone in a field firing at bad guys is much more dramatic, but it's not real. Second, hate to burst your bubble, but the CIA is expressly prohibited from political assassinations per the 1976 Executive Order 11905. That said, the use of drones certainly kills — intended targets as well as unintended civilians. Finally, a pet peeve is overreliance on guns. Again, very visual, but not much intelligence gets collected when there is a gun in the room. Am I correct that there are no guns in "The Travelers"? Yes. 
The only time a gun is mentioned is when the protagonist — just recruited to be an intelligence asset — asks his new handler when he'll be getting a weapon, and she tells him never, there are already too many guns in the hands of people who shouldn't have them. That's also the way I feel about it, both in real life and in fiction: there are too many guns where they don't belong. I'm gratified to know that real-world intelligence gathering is relatively firearms-free. Speaking of the real world, do CIA officers have book clubs? Do you all cycle through the same authors, sit around in Beirut and Prague, passing paperbacks back and forth? Are some authors well known for their verisimilitude? If there is a book club, I wasn't asked to join it ... I feel left out. I think CIA officers particularly like John le Carré for the existential dread he is able to convey in his gray world. Of course, we all watch James Bond with envy — knowing the U.S. government would never pay for the lifestyle he enjoys. A few years ago, I was signing books at Shakespeare and Co. in Paris when an American man handed me his State Department business card. He was in Paris on a special delegation, lived mostly in Washington, and invited me to contact him with any questions. Was this guy with the CIA? Maybe. Or a kook. The world is full of them. Didn't you call him? No, I never did. When I returned home after that long book tour across America and Europe, I was already well past the research stage on a new project, and I really just wanted to sit down and write. I also wanted to stay in New York with my wife and children; I was the parent who was supposed to be at school pickup every day. A few years ago, one of my kids wanted to read my first book, and I allowed it. But I thought my second novel was a bit too racy, so I forbade it, temporarily. Your kids are a bit older than mine. What's your policy? And when you're writing, do you ever consider what your children will think of the book? Yes, of course! 
My kids and my 86-year-old mother. Both "BLOWBACK" and "BURNED" have sex scenes in them — pretty tame — but still. My kids get queasy when my husband and I kiss. But they have read the books and seemed to take it all in stride. Frankly, I'd rather have them read sex scenes than ultra-violent ones. Do you worry what other readers will think? Like your ex-colleagues in the CIA? About sex scenes? Or the larger CIA themes? I did a book event with a former colleague, Bob Baer, and was so relieved that he said I got all the CIA stuff right. Do you worry about what your ex-colleagues in the publishing business will think about your novel that's set in the publishing business? Definitely. I know that "The Accident" is not a completely accurate reflection of the reality of the book publishing world, which, like nearly any other business, consists mostly of people sitting in small offices staring at computer screens, or reading, or trying to stay awake in meetings. None of this is the stuff of compelling fiction. Although I did admire David Foster Wallace's final unfinished novel about boredom, I'm no DFW, and I want my books to be exciting, not boring. So I purposefully stray from what I know to be reality. And I do worry that well-informed readers will think I'm erroneous, or irresponsible. In the same vein, the protagonist of your two novels is a fictional operations officer named Vanessa Pierson, a name that sounds a lot like Valerie Plame. In what ways is she a more compelling version of you? Younger. Smarter. Better with languages. Fiction is a kind of do-over. Besides wanting to make yourself younger and smarter — and who doesn't? — are there other things you wish you could do over? Looking back at my career, I wish I knew then what I know now ... that gender bias is built into the system and it's unconscious in many ways. I wish I had the maturity and courage to have pushed back more. I was always trying to be a "good girl" and play by the rules. 
Maybe this sounds silly, but during the times I worked at CIA HQS, I wish I had left my desk for lunch more. All the men did it, but of course the women were busy beavering away at their workloads, nibbling their salads, because they needed to get out at a certain time for school pick-up... Your career was burned for purely political reasons. Intelligence is supposed to be above politics, but now that you're not in intelligence, are your novels motivated by any political agendas? Not motivated in a partisan sense. I've had plenty of that nonsense in real life. But in both my novels, I focus on nuclear weapons as the ultimate problem — which it, along with climate change, really is. This is because my expertise in the CIA was nuclear counterproliferation: that is, making sure the bad guys, terrorists or rogue actors, do not acquire a nuclear capability. By using those themes in my books, I hope to bring this issue to a wider, younger audience, so they are perhaps intrigued enough to educate themselves and even act — politically.







Published on April 09, 2016 15:00