Helen H. Moore's Blog, page 822

March 28, 2016

Bernie’s Blue Dog backer: Sanders gains the support of an unlikely superdelegate

"I think the super delegates are going to have make a very difficult decision and that is, if a candidate wins in a state by 40 or 50 points, who are you going to give your vote to?" Democratic presidential candidate Bernie Sanders asked over the weekend after winning three Western states by upwards of 40 to 70 points. While his rival Hillary Clinton still holds a substantial lead in the undedicated so-called superdelegates, lawmakers and Party loyalists who have up until voting at the Democratic National Convention (DNC) to decide their vote of support, Sanders has been working to persuade superdelegates to vote in line with the voters of their states. "I think what’s important to remember here is that superdelegates are kind of like football recruits," a Sanders spokesperson said in an interview with CNN's "New Day" on Monday. "You know, they say they are coming but until they have signed on the dotted line and they’re in practice, you don’t know that they’re all the way with you and that they’re on your team. And so we think that we still have time to garner support from these superdelegates, especially when we’re winning." On Saturday, a most unlikely Sanders supporter appeared to agree with that sentiment, pledging his superdelegate vote in favor of Sanders as his home state of Minnesota had recently done. Rep. Collin Peterson told Forum News Service that his superdelegate vote will reflect the will of voters in his state who voted for Sanders 62 percent to 38 percent for Clinton. "I'm voting my district," Peterson said. "I'm going to vote for Bernie." While the founding member of Democrat's Blue Dog Coalition, formed to promote fiscal conservatism, rarely attends the DNC, he said that if his superdelegate vote would make a difference, he would make the trip to Philadelphia to cast a vote for the Democratic Socialist despite their obvious differences on tax policy. "He's got something going," Peterson said of Sanders's support with young voters. "He's tapped into something." As the Huffington Post notes, Peterson's non-endorsement pledge to Sanders as a superdelegate is noteworthy because the Blue Dog Democrat is anti-abortion, opposed to embryonic stem cell research, against same-sex marriage and supports the death penalty -- all key positions out of line with Sanders' platform. Still, Peterson might be more likely to attend the other party's national convention in Cleveland instead this year. That is of course, if Secret Service allows open carry to occur as many attendees of the Republican National Convention have called for. "You've got to be against gun control to play in my band," Peterson told the The Forum Editorial Board, explaining that his band, "The Second Amendment," might play at the Rock and Roll Hall of Fame in Cleveland during the GOP convention. Besides Peterson, there is more evidence that Sanders' campaign to persuade superdelegates to follow the will of voters in their states appears to be paying off, albeit in less dramatic fashion. After Sanders won the Alaska caucuses by commanding margins over the weekend, Democratic Party vice chairman Larry Murakami officially pledged his superdelegate vote to Sanders. "I think it's totally appropriate if we're over 80 percent for one of us to step forward and say, 'yeah, I'm voting for Sanders like everyone in my district, like most of the people in my district and most of the people in Alaska,'" Murakami told Politico. 
Idaho Democratic Party chairman Bert Marley echoed Murakami and the Sanders campaign, pledging his vote to Sanders, who won Idaho’s caucus, and telling Politico, “I felt like superdelegates should reflect, for lack of a better term, the will of the people, so when the results were so overwhelming in Idaho it was the natural thing to do.” Utah Democratic Party chairman Peter Corroon also pledged his superdelegate vote to Sanders after his state picked the Vermont senator over Clinton, 79.3 percent to 20.3 percent.

All of the superdelegates who have committed to Sanders in recent days come from caucus states that overwhelmingly favored him. That still leaves Sanders, who trails Clinton by more than 400 superdelegates, at a clear disadvantage. As the Cook Political Report has calculated, even awarding Sanders all of the superdelegates in the states he has won so far would not be enough to overcome Clinton’s lead, because caucus states tend to have fewer superdelegates:
According to figures provided by the Democratic National Committee, the superdelegate counts in the states Sanders has won so far are: New Hampshire (8), Colorado (12), Minnesota (16), Oklahoma (4), Vermont (10), Kansas (4), Nebraska (5), Maine (5), Michigan (17), Idaho (4), Utah (4), Alaska (4), Hawaii (10), Washington State (17) and Democrats Abroad (4). The total number of superdelegates in all the states Sanders has won thus far is 124. Clinton currently leads Sanders 469 to 29 among superdelegates who have declared support for one candidate or the other, an advantage of 440. Giving Sanders all of the superdelegates in the states he won would not come close to closing that gap.
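
To make the quoted arithmetic easy to check, here is a minimal, illustrative Python sketch; it uses only the figures already cited above (the DNC state-by-state counts and the 469 to 29 declared split) and nothing else:

superdelegates_in_sanders_states = {
    "New Hampshire": 8, "Colorado": 12, "Minnesota": 16, "Oklahoma": 4,
    "Vermont": 10, "Kansas": 4, "Nebraska": 5, "Maine": 5, "Michigan": 17,
    "Idaho": 4, "Utah": 4, "Alaska": 4, "Hawaii": 10, "Washington State": 17,
    "Democrats Abroad": 4,
}

total_available = sum(superdelegates_in_sanders_states.values())
clinton_lead = 469 - 29  # Clinton's advantage among declared superdelegates

print(total_available)                  # 124
print(clinton_lead)                     # 440
print(total_available >= clinton_lead)  # False: even a clean sweep falls short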

Published on March 28, 2016 13:57

Clinton campaign to Bernie: Drop your negative tone and maybe Hillary will debate you in NY

Hillary Clinton’s chief strategist refused to commit to a proposed presidential debate with Bernie Sanders in New York before the state’s April 19 primary. As reported by The Hill, Clinton staffer Joel Benenson accused Sanders of “running a very negative campaign” and suggested that a New York debate might not take place unless the Vermont senator changes his “tone.”

“I think the real question is what kind of campaign is Senator Sanders going to run going forward,” said Benenson, the Clinton campaign’s chief strategist, when asked about a potential debate by CNN’s Kate Bolduan on Monday. “This is a man who said he’d never run a negative ad ever,” continued Benenson. “He’s now running them. They’re now planning to run more. Let’s see the tone of the campaign he wants to run before we get to any other questions.” Benenson also referenced a Washington Post article published Sunday that reported the Sanders campaign would intensify its attacks on Clinton in the weeks leading up to the New York primary.

As NBC News reports, the Sanders campaign’s issues-first, positive tone has given way to increasingly aggressive policy critiques of Clinton as the primary campaign has worn on. In recent weeks, Sanders’ stump speeches have included attacks on Clinton’s record on trade, campaign finance and foreign policy.

Benenson’s remarks come one day after Sanders called for a debate in New York during an appearance on NBC’s “Meet the Press.” The Democratic National Committee lists an April debate as “TBD” on its official website as of Monday afternoon. Asked by Bolduan about the risk Clinton faced by agreeing to a debate, Benenson responded, “There’s no risk. She’s done very well in the debates. The debates have been very good, but Senator Sanders doesn’t get to decide when we debate, particularly when he’s running a very negative campaign against us.”

The Clinton campaign risks appearing disingenuous in its condemnation of Sanders’ tactics — Clinton herself hasn’t been shy about going after her competition during primary season, dismissing Sanders as a “single issue candidate” and aggressively criticizing his record on health care and gun control. But as the Washington Post notes, Clinton’s team has become wary of attacking Sanders in recent weeks, as the former secretary of state’s campaign — which maintains a healthy lead in delegates — seeks to avoid alienating Sanders backers as it consolidates Democratic support and pivots toward the general election.

According to FiveThirtyEight’s weighted polling average, as of March 28, Clinton leads Sanders by over 40 points in New York, the state where she twice won election as a U.S. senator. Despite what appears to be a large deficit, Sanders hopes to close the gap by running an aggressive campaign in the Empire State over the next three weeks. Winning New York’s 247 delegates — the second-most of any state — would build momentum for the Sanders camp and narrow Clinton’s overall delegate lead as the primary enters its home stretch.

Watch the full video of Benenson’s comments below: https://www.youtube.com/watch?v=kiEkL...
"I think the real question is what kind of campaign is Senator Sanders going to run going forward," said Benenson, the Clinton campaign's chief strategist, when asked about a potential debate by CNN's Kate Bolduan on Monday. "This is a man who said he'd never run a negative ad ever," continued Benenson. "He's now running them. They're now planning to run more. Let's see the tone of the campaign he wants to run before we get to any other questions." Benenson also referenced a Washington Post article published Sunday that reported the Sanders campaign would intensify its attacks on Clinton in the weeks leading up to the New York primary. As NBC News reports, the Sanders campaign's issues-first, positive tone has given way to increasingly aggressive policy critiques of Clinton as the primary campaign has worn on. In recent weeks, Sanders stump speeches have included attacks on Clinton's record on trade, campaign finance and foreign policy. Benenson remarks come one day after Sanders called for a debate in New York during an appearance on NBC's Meet the Press. The Democratic National Committee lists an April debate as "TBD" on its official website as of Monday afternoon. Asked by Bolduan about the risk Clinton faced by agreeing to a debate, Benenson responded, “There’s no risk. She’s done very well in the debates. The debates have been very good, but Senator Sanders doesn’t get to decide when we debate, particularly when he’s running a very negative campaign against us." The Clinton campaign risks appearing disingenuous in its condemnation of Sanders' tactics — Clinton herself hasn't been shy about going after her competition during primary season, dismissing Sanders as a "single issue candidate" and aggressively criticizing his record on health care and gun control. But as the Washington Post notes, Clinton's team has become wary of attacking Sanders in recent weeks, as the former secretary of state's campaign — which maintains a healthy lead in delegates—seeks to avoid alienating Sanders backers as it consolidates Democratic support and pivots towards the general election. According to FiveThirtyEight's weighted polling average, as of March 28, Clinton leads Sanders by over 40 points in New York, the state where Hillary Clinton twice won election as a U.S. senator. Despite what appears to be a large deficit, Sanders hopes to close the gap by running an aggressive campaign in the Empire State over the next three weeks. Winning New York's 247 delegates — the second-most of any state — would build momentum for the Sanders camp and narrow Clinton's overall delegate lead as the primary enters its home stretch. Watch the full video of Benenson's comments below: https://www.youtube.com/watch?v=kiEkL...  

Published on March 28, 2016 13:06

March 27, 2016

“Zootopia” in the Year of Trump: The parallels between a Disney cartoon and the 2016 election continue to surprise

“Bad news, in this city gripped by fear…” –“Zootopia,” released March 4, 2016
“Riskiest Political Act of 2016? Protesting at Rallies for Donald Trump” –New York Times, March 10, 2016
Disney’s latest animated opus, “Zootopia,” has not only toppled records with a $75 million opening weekend, but might very well beat “Frozen” as the most profitable cartoon to date. Pegged 99 percent “fresh” by Rotten Tomatoes’ Tomatometer, the film has been roundly praised not only for its quippy script and solid storytelling, but for how aptly it captures early 2016 anxieties about race, police brutality, and the rise of political demagogues.

Chronicling the trials of the tenacious Judy Hopps (Ginnifer Goodwin), who chirpily sets off from Bunny Burrow to the big city to pursue her dream of joining the police force under its new “Mammal Inclusion Initiative,” the film initially seems headed in Disney’s familiar “dream big” direction. But after her first day on the job goes horribly awry, she and caddish fox Nick Wilde (Jason Bateman) form an unlikely gumshoe duo to investigate the case of a missing otter, exposing that certain Zootopians have mysteriously “gone savage,” devolving to a state of wildness the city believed it had long left behind. What follows is a mix of caper, character study and inter-species melodrama, as presumptions about “predator” versus “prey” are tested, confirmed or daringly tossed aside.

“Zootopia isn’t simply another fun Disney animated movie,” says Dirk Libbey of Cinemablend. “It’s one of the greatest Disney animated movies the company has ever produced.” The Washington Post calls it “the best political film so far this year.” Not everybody’s cheering—critics on both the left and the right point out problems in the movie’s sociopolitical ambitions. In one zinger of a critique for Consequence of Sound, Nico Lang claims that “Disney attempts to confront racism but instead delivers the kids’ version of ‘Crash,’” while Jason Johnson of The Root more forgivingly dubs it “‘The Wire’ with webbed feet.” Meanwhile, one conservative cleverly riffs that rather than a “milestone of dynamic storytelling,” the film is “a millstone around the neck of the establishment leftist social justice engineers,” denouncing it as anti-white propaganda. “The Left has infected every facet of life,” says a young conservative in another review, “and they want to make sure we can’t go anywhere in this country without having the ‘white privilege’ hoax shoved in our faces.”

Such splintered reactions suggest the film has hit a nerve—a nerve presumably dulled in those critics beguiled by its feel-good ethos. It is this same mainstream audience that has been bombarded by the images of unrest across America that have monopolized the March news cycle. In the same two weeks of late-winter thaw during which “Zootopia” launched to the top, Donald Trump’s presidential campaign rallies prompted heated pushback in Fayetteville, St. Louis, Chicago and Tucson—with five deputies in North Carolina recently punished for failing to stop an assault on a black protester by a white Trump defender. Days later, an African-American officer in Arizona called Trump protesters “the most hateful, evil people I've ever seen,” fearing “a full-fledged riot” at the event he attended undercover to “see [the protesters’] point of view.” At that same rally last Saturday, a black Trump supporter and U.S. airman sucker-punched and kicked a white protester, a startling volte-face from Fayetteville. If these outbursts come across as convoluted spectacle, it’s because they are—eschewing the expected narratives of left/right, white/black and everyone in between.
Pandemonium rules and logic founders, with the predictable lines between victim and aggressor, predator and prey, haphazardly erased. Suddenly a Disney film feels eerily on point, however muddled its rainbow rhetoric. “Give me back the Zootopia I love,” pleads Gazelle, the pop star voiced by Shakira, leading a herd of picketers shaking signs against the injustice of species-centered profiling. In reply, their mammalian rivals mount protests of their own, commanding all natural “predators” to “go back where you came from.” Sound familiar? “If you’re an African first, go back to Africa!” taunted a livid Trumpist at a Black Lives Matter protester prior to the canceled Chicago rally. “Trump is a racist and so are his supporters!” cried Tucson activists last Saturday, chanting “Shame on you!” in unison at those lining up to attend.

In a recent CBS/NYT poll, 85 percent of Democratic primary voters cast blame on Trump for doing little to keep the peace, while 80 percent of his supporters approve of how he’s responded to rally violence. That said, it’s hard to believe that “go back” bombast and the concomitant scuffles materialize from thin air; Trump has told no small number of people to “go back” to some ostensibly native place—Latinos, Muslim immigrants, African-Americans, women, pretty much anyone invading his sensationalized kingdom (population one), a space no less fantastic, delusional or escapist than anything Disney could dream up.

Uncannily, the studio green-lit “Zootopia” back when Trump was best known for a Tower and a toupee. “We’ve been happy to find that both the humor and the meat of it are resonating,” director Rich Moore told the LA Times in early March—a massive understatement, as the domestic gross passed the $200 million mark less than three weeks after release. Are conservative viewers largely ignoring its heavy-pawed allegories, or could its broad appeal suggest a resistance across party lines to bigotry and hate? The answer would seem to be a bit of both.

In a trenchant piece for Politico, “Donald Trump Needs 7 of 10 White Guys,” David Bernstein calls attention to just how statistically unlikely a Trump presidency would be, no matter his lead in the primaries. “[T]he argument often made by Trump’s followers is that he will win in November because he will bring so many disengaged Americans to the polls. But they’re talking about disengaged white voters, mostly men—and unfortunately for him, the turnout rate for white men is already relatively high.” In essence, if any new voters turn up in November, they are much more likely to be female, Hispanic or black, and “given his flirtations with racism and fascism, he’s likely done too much damage to salvage much crossover appeal.” No matter what disconcerting percentage of current Trump supporters are white men shouting epithets, the fact is that this group simply lacks the numbers to sway things its way, even if white males at large were mostly pro-Trump (and there is nothing to prove that they are). “Between Reagan and Romney, the white male share of the total vote had dropped from 45 percent to 35 percent,” Bernstein explains.

And among right-leaning critiques of “Zootopia”—which are predominantly authored by white men (even more so than movie reviews in general)—no amount of ire has so far triggered any serious boycott. Policeone.com has even lauded the film for its “refreshingly positive” portrayal of cops, unusual for Hollywood.
Like Obama himself insisting that “cops deserve our respect” while tacitly, if belatedly, approving of #blacklivesmatter from afar, “Zootopia” never presents a world so smugly harmonious that police aren’t called on to fix things up. Which is why critiques of the film as overly p.c. feel downright ludicrous; the heroine is an officer, after all, and her caddish partner in crime (and punishment) ultimately joins the “good guys” as an officer himself after a life of petty theft and roguery. Indeed, Chief Bogo (voiced by Idris Elba), a Cape buffalo in no mood for bull, is also the outlet for some of the film’s most sagacious lines, as when he throws the falsity of the Disney premise into vivid relief: “Life isn't some cartoon musical where you sing a little song and all your insipid dreams magically come true. So let it go.”

Topped off by that “Frozen” reference at the end, the film’s self-awareness as both cartoon musical and political film challenges anyone who seeks to see it as simply either/or. Ultimately, the finale sends us off with a silly little song performed by Gazelle (proving her hips don’t lie even when they belong to a lithe antelope), circled by a quartet of voguing tigers in leather chaps. “They have to figure out how to coexist,” says Byron Howard, a director of the film, of the conflict at its crux. Let’s see if we can do the same within our own riven species.

Published on March 27, 2016 16:30

My body doesn’t need a cure: Sizeism, classism and the big-business hustle of the clean-eating industry

The Facebook chat box bears the tiny smiling face of a woman in workout gear: a tank top cut off at the midriff and matching black leggings, baring her taut, tanned belly. I can’t quite place her; she might be some casual acquaintance from high school. She greets me with a “hey there” followed by an infinite number of exclamation points, and, as I’m scouting her profile—all imports from a Fitbit, miles run and steps taken; “inspirational” quotes like “strong is the new skinny”; and photos of protein shakes with various fruits and vegetables artfully arranged around them in a kind of pornography of healthfulness—she asks me if I’d like to join her “weight loss” program (consisting of a certain number of shakes per day, at a certain number of dollars per shake—but, for one time only, I can get a special discount). This is not the first time I’ve gotten one of these offers: I’m a fat woman, which apparently gives everyone license to express opinions about what must be my obvious, inevitable health needs. So I block Miss Beach-Body Busy-Body.

She is, after all, a product of a culture that—through shows like "The Biggest Loser" and (the even more bluntly titled) "My Diet Is Better Than Yours"; the blitzkrieg of news reports extolling the virtues of eating organic or, better yet, raw; and a myriad of sponsored listicles where free-range, hormone-free omelets and quinoa salads are photographed like they’re on the cover of Vogue—promotes the ideal that a virtuous life is one devoted to racking up stats on our fitness apps or spending a Sallie Mae payment at the Whole Foods, and that the evidence of that ideal is in our slim hips and immaculate abs. The Ladies Who Lunch have given up their martinis at noon for CrossFit and kale. Healthy living has become a new mode of conspicuous consumption, with thin, yoga-toned bodies emblematic of one’s social standing: Cheap-and-easy fried foods are for the poor and uneducated, people who couldn’t possibly spell serotonin, let alone realize that 30 to 40 minutes of rigorous cardiovascular exercise will boost their levels of it.

Our culture has always found ways to problematize poor people and fat people, often conflating the two groups in the terrible stereotypes of the Pepsi-swilling welfare queen and the Cheeto-munching, NASCAR-loving boogeyman who willfully inflates healthcare costs for everyone with his abominable laziness. The great unforgivable ugliness of these types is their perceived lack of virtue: They don’t have the hustle or the grind to make something better of themselves, something more productive. Something useful. People who can afford to spend top dollar on personal trainers and “clean eating” must, by contrast, be go-getters, the holders of high-paying jobs with impressive titles. Thin bodies, or “healthy bodies,” are therefore associated with industriousness—which makes them inherently more worthy, more respectable. Clean eating is really about the purity of the soul. And if we are what we eat, then healthiness is close to godliness.

Or, as Sondra Kronberg, MS, RD, CEDRD, a spokesperson for the National Eating Disorders Association (NEDA) and executive director of the Eating Disorder Treatment Collaborative, puts it, “People are no longer going to church, they’re going to the gym every day.” Over the past several years, Kronberg has seen an uptick in disordered eating fixated on the healthiness and wholesomeness of the foods consumed: “People idolize fitness,” she says. Though (often untenable) thinness has been the look du jour (and by du jour I mean for the past several decades), there has been a shift from a standard born of the cigarettes-and-black-coffee diet to an emphasis on a hard body forged through clean eating and hours of treating hot yoga like a blood sport.

Size-based bigotry has always hinged on looks, but it has evolved into “concern trolling”—couching said bigotry in an “I’m just worried about your health.” Recently, former Sports Illustrated cover girl Cheryl Tiegs slammed the magazine for including “plus-size” model Ashley Graham on the cover of its famous swimsuit issue. “I don't like that we're talking about full-figured women because it's glamorizing them because your waist should be smaller than 35 [inches],” she told a reporter from E! News. “That's what Dr. Oz said, and I'm sticking to it … I don't think it's healthy.” Dr. Oz, for his part, has peddled “magic” green coffee beans as a weight loss “cure”; the marketing team behind that particular piece of chicanery earned a $9 million fine from the Federal Trade Commission, and Oz himself has been called out by fellow physicians and on the floor of the United States Senate.

But when body size becomes something that must be cured, at any cost, so that we can, as so much ad copy promises us, “look—but most importantly, feel—our best,” we will buy the magic beans; we will run until we decimate our knees; we will give up carbs, and then saturated fats, and then vegetable oils, and keep on giving up until we’re living on air—because there are no trans fats or pesticides in air (right?). This delusion is the engine purring in the heart of some great machine that sucks in human insecurity and spits out money. As Claire Mysko, president and CEO of the National Eating Disorders Association, puts it, that machine is the diet industry itself: “It's problematic [to] equate the ‘perfect’ body to happiness, success, love and confidence,” she says. “A person's appearance has very little to do with health. Weight loss, ‘clean’ eating and extreme exercise are couched in conversations about health, but when we look at the bigger picture … weight loss is a $61 billion industry … we see that selling the ‘perfect body’ is big business.”

Supermodels like Tiegs, and the fitness personalities who have made Instagram and YouTube the modern-day mid-morning infomercial, have the luxury of such ideals: They are literally paid to be hard bodies, doing crunches the same way most of us crunch expense reports; and even if they don’t have private chefs, they could, perhaps, turn their sojourns to Fresh Market into tax write-offs. The immaculately packaged uber-fit lifestyle they present is a world away from the workaday drudgery that keeps so many of us housed and fed (even if from an office vending machine): There are no gray cubicles under fluorescent lighting better suited to an interrogation room; no hands chapped and raw from washing other people’s dishes; no slow grinds through traffic and no bus rides spent inhaling our neighbors’ armpits. The hot, “healthy” bod with the 35-inch waist is, in its own way, a totem of leisure, just as the soft-bodied beauties of yore relied on their double chins and fair skin to show that they were the pampered elite.

Consider the untenable hours of exercise that the trainers (and, real talk, the producers) of "The Biggest Loser" expect of their alumni, or the eating plans that the nutrition wunderkinds of "My Diet Is Better Than Yours" craft for hapless contestants: There is the “Wellness Smackdown Plan,” an oh-so-doable (especially for anyone who has any kind of job, or children) “anti-inflammatory vegan diet that uses herbs to detoxify the body … only feeding it between the hours of 10AM and 7PM …”; or the fits-in-any-budget “Strong, Safe & Sexy Plan,” which “… allows wild caught seafood but no other animal flesh … clean and lean protein choices including: … eggs, beans, nuts, seeds and small amounts of organic or grass-fed dairy like Greek yogurt.” Sure, I could be “strong and sexy,” I could be lithe and muscular, an approximation of the ever-airbrushed fitness mag cover cutie; I could even achieve bowel movements that rivaled those of GOOP-era Gwyneth in color, consistency and spiritual enlightenment—but I’d be living in my parents’ basement. So let’s face it—kale isn’t the only green stuff driving certain parts of the “clean eating” movement; “strong is the new skinny” is more like “strong is the new wealthy.”

The clean-eating movement, with its veneer of privilege and wealth and its promise of a better life through a “better” body, can instill the same neurotic fixation on thinness that has driven many people to eating disorders—which can cause everything from kidney failure to cardiac arrest, osteoarthritis to acid reflux. When high-school senior Ashley G. was exposed to more and more information about foods and nutrition, she became particularly attentive to eating organic—an attentiveness that became an obsession: “I was always thinking of what would be the healthier option,” she says. This constant worrying about the “healthier option” spurred Ashley to drive 30 miles or so just to buy organic food; she also began to restrict calories with a severity that eventually, she says, manifested in anorexia.

Sojourns to the gym became so long, and so intense, that her parents approached the facility’s managers and asked them to keep Ashley out. Her life was reduced to the four walls of her bedroom, where she devoted hours to researching “good food and bad food” and logging information into online calorie counters. “I was young and vulnerable,” she explains. “I do think I was predisposed because I was [dealing with] depression, and this gave me an identity: ‘Ashley is so healthy. She has so much control.’ I felt such guilt whenever I stepped out of these restrictions.” Eventually, Ashley sought treatment; still, she remembers standing in a hot shower and writing her calorie intake in the steam on her shower door. Here, in this image, we see the true toxicity of “clean eating”: a woman’s worth distilled into numbers, vested in everything she consumes (or doesn’t).

Stories like Ashley’s show the peril of making a particular body type public health enemy No. 1. In doing this, we forget the actual public health issues that plague the people who can’t afford weekend jaunts to the farmers market. Our supposedly health-conscious culture wages “the war against obesity” with everything in its arsenal, but it doesn’t expend a tenth of the effort on ensuring that all people have clean water. Poison coming out of the tap is a far greater public health crisis than the circumference of anyone’s waist. Still, we’re far more likely to hear about how eating hormone-free chicken nuggets will boost our children’s brain power than about contaminated water pipes or lead paint in city housing. The focus on fat bodies as inherently unhealthy is knotted up with elitist, consumer-driven ideals of wellness: After all, the corporate bottom line becomes a fat-bottomed line when people feel impure and ashamed enough to pump more and more money into “get thin quick” schemes. If we truly cared about health, we’d be investing all of the time, energy and, above all, money we spend on smoothies, supplements and premium-plus gym memberships into creating oases in the food deserts (areas where people, especially people without cars, can’t find affordable, nutritious food) across the country.

Driving through broad swaths of my hometown, Baltimore, I see a culinary Sahara of corner stores and fast food joints. And often enough, anything pre-cooked or in a box is not only more available but preferable—“just add water” isn’t just cheap, it’s easy, and easy is all you can manage after being ground down by low-wage, long-hours work. Any authentic campaign to prioritize health would address poverty and push for a living wage one could actually live on; it would push, with a typical CrossFitter’s evangelical zeal, for safer neighborhoods and more public parks. A truly positive, lasting vision of health—one that doesn’t drive people to turn their calorie counts into a calculus of their worth—should be about community, not competition. It isn’t about being “holier than thou,” about making that 6 p.m. yoga class and then dashing off to the My Organic Market to pick up a hormone-free chicken for dinner. Healthiness should be about making each and every body, no matter how big or small, how fit or able, feel protected and cherished.

Published on March 27, 2016 16:29

We had to get beyond irony: How David Foster Wallace, Dave Eggers and a new generation of believers changed fiction

Believing in stuff

The American 1990s saw the rise of a second popular eschatological vision, one primarily socioeconomic but no less millenarian in temper than the vision offered by “Left Behind.” In the fall of 1989, in the National Interest, Francis Fukuyama published “The End of History?” which argues that the end of the Cold War might well have proved Hegel right: history might indeed have ended—and permanently enthroned liberalism—in 1806 with the Battle of Jena. Fukuyama’s conclusion is in one sense optimistic. He regards liberal, consumer-oriented capitalist democracy as the best of all possible political worlds. Serious rivals had systematically proved themselves not only morally bankrupt but also pragmatically unsustainable. Though all nations everywhere had yet to arrive at the ideological “Promised Land,” it was inevitable that everyone would in due course hear the “good news.”

But the news was not all good. The end of history might also be viewed as “a very sad time,” as an era in which “daring, courage, imagination, and idealism, will be replaced by economic calculation, the endless solving of technical problems, environmental concerns, and the satisfaction of sophisticated consumer demands.” When history ends, we face nothing less than “centuries of boredom.” Such is life in the secular millennial kingdom.

However else we might attack the end-of-history thesis—and even though Samuel Cohen makes a convincing case that many major American writers took a historical, “retrospective turn” in the 1990s—Fukuyama’s claim represented something like posthistorical common sense after the end of the Cold War. It was, as David Foster Wallace writes in “Infinite Jest,” “a post-Soviet and -Jihad era when . . . there was no real Foreign Menace of any real unified potency to hate and fear, and the U.S. sort of turned on itself and its own philosophical fatigue and hideous redolent wastes with a spasm of panicked rage.” Life in postindustrial democracies came to seem listless and without flavor; loneliness and a kind of bland sadness were all one could expect of the new world order. At the end of history, irony transformed from an instrument of revolution into a symptom of the impossibility of revolution.

In a 1994 issue of The Modern Review, a London-based cultural studies magazine, Toby Young linked ubiquitous irony to the end of history: “it’s difficult to imagine what a post-ironic sensibility would be like. It’s a bit like finding yourself at the end of history. You’re bored because you’re not participating in any historic events but you can’t very well up sticks and go and fight in a war in a less evolved society. To do so would be untrue to your own historical experience; it would require you to unlearn the lessons history has already taught you. And what would be the point?” One can detect the same weariness of tone (“what would be the point?”) in the writing of Richard Rorty. Finding himself at the end of history, Rorty feels compelled to undermine any philosophical ambition to fuse our private and public commitments, leaving us in a position in which our public political commitments must remain philosophically ungrounded.
Though he would reject the historical necessity of liberalism’s final victory, Rorty nonetheless thought that the left should “concede Francis Fukuyama’s point” and agreed that “no more romantic prospect stretches before the Left than an attempt to create bourgeois democratic welfare states and equalize life chances among the citizens of those states by redistributing the surplus produced by market economies.” At this moment, capitalism’s Cold War victory, individual irony, and philosophical antifoundationalism merged into a single discourse. Irony’s dominance could sometimes seem like the unavoidable cultural and philosophical consequence of our having arrived at history’s end. This background clarifies the ultimate stakes of discovering or inventing a viable post-ironic ethos.

Both David Foster Wallace and Dave Eggers sought to reconnect private and public life, and they pursued this aim by using techniques associated with postmodern metafiction to generate forms of belief that theory held to be no longer possible. For Wallace, post-ironic belief underwrites the possibility of genuine human communication. This is why Wallace distrusts the death-of-the-author thesis and constructs his fictions around a drama of unfulfilled communication. When the wraith of James O. Incandenza appears to the convalescing Don Gately, late in “Infinite Jest,” he explains that he created the irresistibly addictive avant-garde film “Infinite Jest” in order “to contrive a medium via which he [James] and the muted son [Hal] could simply converse,” a form of entertainment that “would reverse thrust on a young self’s fall into the womb of solipsism, anhedonia, death in life.” Wallace himself claims to know in his “gut that writing is an act of communication between one human being and another,” and justifies his conviction with reference to a reading of Wittgenstein as an incipient post-postmodernist, someone who understood the deadly necessity of transcending solipsistic relativism.

Eggers and many of those associated with his various literary enterprises (McSweeney’s Quarterly Concern, McSweeney’s Books, The Believer, Wholphin, his network of 826 tutoring centers), for their part, explode bibliographic form, conflating text and paratext, in order to regenerate a sense of wonder around reading, all of which is part of an effort to undermine what they take to be an overly cynical or snarky literary culture. Eggers’s memoir, “A Heartbreaking Work of Staggering Genius” (2000), and his first novel, “You Shall Know Our Velocity!” (2002), proffer an aesthetic of the “quirky,” an aesthetic also visible (as we will see) in the various exhibits of the Museum of Jurassic Technology (MJT) in Los Angeles.

Wallace and Eggers write against a culture defined by solipsism, anhedonia, cynicism, snark, and toxic irony—a culture whose disenchantment and sadness can be traced back, in one way or another, to the consumerist end of history. Their primary oppositional strategy is to imagine a characterological countertype to the incredulous ironist. I will call this countertype “the believer,” after Eggers’s influential magazine of the same name, though Wallace is arguably the architect of this post-ironic ethos. When he calls for the rise of an “anti-rebel”—a kind of post-countercultural or newly earnest countercultural figure who opposes a now-mainstream irony—Wallace does not give positive content to the figure, but this antirebellious believer is quite different from the one imagined by LaHaye and Jenkins in “Left Behind.”
In “Left Behind,” secular postmodernity and neoliberal globalization demand a counterforce that returns to biblical fundamentals. Wallace, by contrast, cannot accept a religious response to the fallen world, nor can he embrace a simple return to a preironic sensibility. Wallace wants to invent a new form of secular belief, a religious vocabulary (God, prayer) that is emptied of any specific content and is engineered to confront the possibly insuperable condition of postmodernity. This desire to believe is part of the lineage of the avant-garde and simultaneously criticizes that tradition. Eggers, meanwhile, has taken the impulse behind Wallace’s fiction and successfully popularized it, transforming Wallace’s post-irony into a literary brand that promises consumer reenchantment, a kind of (nonprofit) retail avant-gardism.

Lee Siegel has criticized Eggers for propagating a “self-conscious equivalence between decent living and good writing” and has denounced “the McSweeneyite confusion of good intentions with good art, and of its blithe elision . . . of truth with untruth, prevarication with pretense.” Whether or not he is here fairly criticizing “McSweeneyite” art, Siegel correctly identifies a key formal strategy embraced by many post-ironists: the elision of “truth” and “untruth” in pursuit of creating belief and reenchantment. In what follows, I will treat post-ironic belief both as a theoretical project and as a literary intervention.

Understood properly, post-ironists differ from writers of “hysterical realism,” a category the critic James Wood associates with a range of authors including Don DeLillo, Thomas Pynchon, Salman Rushdie, Wallace, and Zadie Smith. By equating canonical postmodernists with those who have sought to succeed postmodernism, Wood misses that these younger writers have developed a critique of metafiction that, in its interrogation of the status of belief, resembles his own attack on hysterical realism. In a review of Toni Morrison’s “Paradise” (1997), Wood argues that fiction constitutes an invitation to belief, or rather—in a secular age—an invitation to read “as if” one believed in fiction: “Fiction demands belief from us, and this request is demanding in part because we can choose not to believe.” Wood distinguishes the ontological commitment required by some religious traditions from the reader’s belief in fiction. Fiction can, at best, only “gently request” that readers act “as if” they believed. Belief in fiction turns out to be a metaphorical sort of belief.

This argument suffers from an obvious contradiction: How can fiction “demand” from us a stance that, by definition, belies choice? Belief is involuntary, which is not the same as saying it is unchanging or that all beliefs are equally fixed. Nonetheless, a believer is someone who cannot help but hold his or her particular ontological convictions. The leap from nonbeliever to born-again in “Left Behind” is thus not directly the product of will but arises from an act of freely chosen submission or supplication, a willingness to be changed. Likewise, for Don Gately, Wallace suggests that practices identified with religion (kneeling, prayer) may precede belief—they may be necessary preconditions for belief—but that the transition from nonbelief to belief happens apart from one’s will. Wood might counter my criticism by suggesting that he is throughout his writing using belief in a consistently metaphoric sense.
The problem with such a counterargument would be that Wood is actually correct that fiction demands belief of us. We do judge novels based on what they can convince us to believe. We believe or disbelieve in fiction based on a range of criteria—aesthetic, cognitive, social, historical—over which we exercise only partial control. This is the only reason that writing can sensibly be described as “plausible” or “implausible.” In its modes of world-building, and no matter the genre, every narrative engages with our capacity to believe. What we believe in (or disbelieve) includes a range of complexly interlocking phenomena. Not only do we read fictions using ontological criteria—judging their social, historical, and scientific plausibility—but fictions can often make ontological demands of us, can try to convert us into believers. If the job of the novel is to make a persuasive request of us to believe in the events depicted, then hysterical realism fails for Wood because its too-rapid accretion of interesting detail breaks the trance of credulity. “[Zadie] Smith does not lack for powers of invention,” Wood writes, with reference to a characteristic passage in White Teeth. “The problem is that there is too much of it . . . on its own, almost any of these details . . . might be persuasive. Together, they vandalize each other . . . As realism, it is incredible; as satire, it is cartoonish; as cartoon, it is too realistic.” What Wood denounces is the sort of novel that, in its particulars, cannot be faulted for lacking realism, but whose overall pattern takes on an implausible shape. Global implausibility disrupts the local pleasure one might take in a work of fiction that doles out its unlikelihoods more judiciously. Novels of hysterical realism simultaneously feel allegorical and do not allegorize; they present characters that almost but do not quite rise to the level of the identifiably human. Hysterical realism’s hysteria short-circuits its realism. So Wood is, in a sense, right. Fiction tries to compel belief from readers, but certain techniques—often associated with metafiction—can erect barriers to belief. Nonetheless, Wood’s attack on hysterical realism fails on two fronts. First, it does not address metafiction’s historical mission. After all, undermining a naïve version of realism—disrupting the process through which belief formation occurs—was precisely the goal of this type of fiction. To fault a mode of writing for successfully achieving its aims, Wood would need to make a case that those aims are not worth pursuing in the first place, which he does not do. Even in How Fiction Works, he promotes realism (or more precisely, works that possess what he calls “lifeness”) by critical fiat. Second, Wood misunderstands Wallace and Smith’s complex relationship to postmodernism. Post-ironists often try to cultivate belief among readers and are reacting against the same picture of metafiction and postmodernism they learned about in university literature departments. For his part, Wallace carefully studied academic criticism on postmodernism. His archive at the Harry Ransom Center includes heavily annotated copies of Tom LeClair’s In the Loop: Don DeLillo and the Systems Novel as well as Frank Lentricchia’s Introducing Don DeLillo. Like Wood, Wallace came to find postmodern literature wanting in “final seriousness.” Patricia Waugh’s Metafiction, published in 1984, expresses what remains the consensus view on the historical mission of metafiction.
Though there are many different techniques associated with metafiction, all draw attention to practices of reading and writing, often by exposing how worlds of fiction are embedded within higher-order fictional worlds. Metafiction “self-consciously and systematically draws attention to its status as an artefact in order to pose questions about the relationship between fiction and reality.” When we read characters reading, we are supposed to become aware of how our own reading habits are homologous to the inscribed reading practice. Metafiction’s power to “pose questions” about reality depends on a homology between fiction and society. Waugh emphasizes “the extent to which the dominant issues of contemporary critical and sociological thought are shared by writers of fiction,” suggesting that transformations in fiction and society are linked. The specific nature of this link remains ambiguous. Are writers of metafiction studying contemporary sociological literature—or arriving at their own amateur sociological insights—and seeking to allegorize these findings? Is the turn to metafiction a coincidental development? Or does some underlying shift in the world—economic or epistemic—account for this homology? Answers differ. Waugh distinguishes between “two poles of metafiction”: one that “finally accepts a substantial real world whose significance is not entirely composed of relationships within language; and one that suggests that there can never be an escape from the prisonhouse of language.” Despite these serious differences, Waugh treats all metafiction as a species of critique, a way of exposing myths and ideological cant. Either metafiction is an allegory for the breakdown of master narratives and coherent frames in the social world (the weak interpretation) or metafiction, because it changes our relation with language, actually breaks down our confidence in norms, values, and conventions, such that we’re thrown into a bottomless well of doubt (the strong interpretation). This latter, strong interpretation has often been compared to a version of critical self-consciousness associated with Romantic irony, and especially Friedrich Schlegel’s fragmentary commentary on irony as a mode of “permanent parabasis.” Either way, society and culture become especially susceptible to the critical power of metafiction. According to the traditional understanding, metafiction is a form of irony because it forces the reader or subject to question all grounds for understanding, including the grounds one uses to justify being an ironist in the first place. Hegel, and Kierkegaard after him, called this form of questioning irony’s “infinite absolute negativity,” its self-negating nature. Metafiction therefore undermines not this or that belief, but belief as such. It operates according to an inversion of the mechanism found in Left Behind. Like LaHaye and Jenkins, metafictionists assume that if a range of vocabulary could be mapped cleanly onto a domain of worldly things, then literary or ontological realism would be possible. We would have to take seriously some version of a correspondence theory of truth. By foregrounding linguistic self-referentiality or the infinite connotative range of words, metafiction tries to show that such a mapping is impossible, undermining our naïve belief in realism. Educated in this consensus view, post-ironists such as Wallace and Eggers attempt to use metafictional techniques differently—to help readers cultivate belief. In an unpublished contribution to James L.
Harmon’s Take My Advice, Wallace makes the connection between irony and belief explicit: “Ridicule, nihilism, sarcasm, cool, and irony worked for the USA’s young when there were big adult hypocrisies for the young to explode and thus transcend . . . But now there are no really interesting hypocrisies left: you can’t be a hypocrite if you don’t even pretend to believe in anything. Irony and cool keep us from believing in stuff.” In his novella “Westward the Course of Empire Takes Its Way,” Wallace takes great pains to deny that “cynicism and naiveté are mutually incompatible.” “Westward” links the belief in the importance and power of irony to the university, the creative writing workshop, and critical theories of postmodernity. If a cynic can be naïve, then someone nonnaïve can also be noncynical. Wallace attempts to help his reader adopt a stance of nonnaïve noncynicism by means of metafiction. What is paradoxical about this project is the emptiness of the proposed post-ironic belief. Post-ironists do not advocate a stance of belief toward any particular aspect of the world, but rather promote a general ethos of belief. By this account, post-ironic belief might easily be mistaken for what Amy Hungerford calls “postmodern belief,” a “belief without meaning” or “belief without content” whose purpose is to “hedge against the inescapable fact of pluralism.” The language we use is similar, but I would resist this equation. Hungerford outlines a tradition of “belief in belief” that aimed, above all, to sustain faith without having to commit to any particular religious community. By contrast, post-ironic believers are interested in belief apart from questions of social pluralism or religious institutions. Moreover, by Hungerford’s account, New Critical arguments against “the heresy of paraphrase” vouchsafed a religious function for literature; but as we saw in my first chapter, it was irony that made poetry resistant to paraphrase in the first place. So, though she doesn’t describe it in these terms, postmodern literature’s “belief in belief” depended on a prior, tacit affirmation of irony. Unlike postmodernists such as Don DeLillo, however, post-ironic believers do not want to keep faith with irony. Irony’s disruptive negativity seems too threatening. This is why it would also be a mistake to describe post-ironic belief as “metaironic.” Metairony is “a gambit . . . to turn irony back on itself,” but such a practice quickly threatens to become merely a higher-order iteration of the ironist’s infinite absolute negativity. Wallace and Eggers’s project more closely resembles what has been called New Sincerity. Adam Kelly focuses on the intersubjective anxiety that drives much contemporary fiction (including that of Wallace, Eggers, Dana Spiotta, Jennifer Egan, and Colson Whitehead), arguing that such fiction “asks what happens when . . . inner states lose their originating causal status and instead become effects of that anticipatory logic.” What is new about New Sincerity is that, though it understands that there is no pure form of communication, it nonetheless seeks to invent new ways of negotiating the problem of coordinating inner and outer states. But Kelly’s account does not address the specific threat these writers see in irony. We might wonder, for instance, whether the “anticipatory logic” that demotes the centrality of the “acting self” is a form of irony. I would argue against this equation.
The battle between inner and outer motivation, which dialectically resolves itself in the form of New Sincerity, can arrive only after a prior struggle, the struggle to achieve post-ironic belief. If they did not believe in the actuality of other persons, Newly Sincere writers would not feel much need to lash together inner intentions and outer performances in the first place, let alone ask readers to trust in them. Post-ironic belief must precede the ethics of New Sincerity. Wallace’s commentary on Wittgenstein’s private language argument is one effort to vouchsafe the grounds for such belief. We also see the importance of belief in Eggers’s fictionalized autobiography of Valentino Achak Deng, What Is the What (2006), which ends with the following lines: “How can I pretend that you do not exist? It would be almost as impossible as you pretending that I do not exist.” Valentino Achak Deng’s prefatory declamation of his “belief in humanity” uses similar language: “Since you and I exist we can make a difference!” While our trust in Eggers’s and Deng’s sincerity matters—we hope neither is being mercenary in proffering Deng’s story of suffering—the ultimate stakes are more profound but easy to overlook: Deng’s very existence. Likewise, it is easy to misread Wallace’s declaration that fiction ought “to dramatize the fact that we still ‘are’ human beings, now. Or can be.” Here too, nothing less than existence—both our being human and the dramatized proof that we are—is at stake. The post-ironic believer therefore not only affirms the existence of other persons but also attempts to reconstruct our capacity to believe, seeking a literary means of dissolving the barriers that block that capacity. Excerpted from "COOL CHARACTERS: IRONY AND AMERICAN FICTION" by Lee Konstantinou, published by Harvard University Press. Copyright © 2016 by the President and Fellows of Harvard College. Used by permission. All rights reserved.

Published on March 27, 2016 15:30

The queering of Pee-wee Herman: How the gay icon redefines queer boundaries beyond sexuality

“I’m a loner, Dottie. A rebel.” These sentiments are voiced three different times in “Pee-wee’s Big Adventure,” the Tim Burton-directed film that popularized Pee-wee Herman, the guileless manchild in the grey suit and red bowtie. Throughout the 1985 film, Pee-wee routinely rejects the advances of Dottie, a button-cute blonde played in a rare on-screen role by voice-over actress Elizabeth Daily (“Rugrats”). The joke isn’t just that Dottie is hopelessly, cluelessly in love with Pee-wee; it’s the sight of the character’s own image juxtaposed with his statement. These are the words you might hear from Johnny Depp’s meticulously coiffed greaser in John Waters’ “Cry-Baby,” not someone whose closest antecedent is Waters himself. Pee-wee’s noted lack of interest in the fairer sex has long led to speculation about his sexuality—with the implication that he’s gay. If you’ve come into “Pee-wee’s Big Holiday” with that idea in mind, the film will do little to dissuade you. The Netflix release, directed by John Lee (“Wonder Showzen”) and produced by Judd Apatow (“Freaks and Geeks”), finds Pee-wee Herman getting his bromance on with Joe Manganiello, who plays a version of himself. Manganiello rides his motorcycle up to the diner where Pee-wee works, wearing a too-tight tee, and Pee-wee nearly faints. True to form, he refers to the “True Blood” actor as “triple cool!” The movie never plays down its potential homoerotic elements: Aside from Pee-wee’s clear overexuberance at serving Joe Manganiello a milkshake, the character develops something of a crush on the hunky actor (and who could blame him?). He expresses his desire for “friendship” with Manganiello in fantasies where the two joust on what appear to be giant piñatas; meanwhile, fireworks explode in the background. It’s about as subtle as the end credits of “Deadpool,” in which the spandexed superhero jerks off a unicorn. The film’s overt gayness led many, like BuzzFeed’s Louis Peitzman, to declare it a “queer romance.” “[C]ategorizing Pee-wee and Joe as ‘just friends’ would be, at best, a euphemistic solution to a relationship that’s deliberately vague but undeniably queer,” he writes. “Because Pee-wee is almost entirely sexless, his age indeterminate but his interests decidedly childlike, he can never consummate anything. Instead, he and Joe share the kind of mutual crush that passes for grade-school intimacy.” Slate’s Paul H. Johnson added that the character has always been gay, even back in the days of “Pee-wee’s Playhouse.” He married a fruit salad in one episode, forgodsakes. These takes are smart, well-written and accurate: There’s something that’s always been defiantly, unmistakably queer about Pee-wee Herman, but it would be a mistake to ascribe that solely to his sexuality. Pee-wee might live in a world of adult longings and eroticism, but Pee-wee does not partake. It’s not that he’s gay or even asexual, but that—by being frozen in a state of stunted adolescence—he exists in a prepubescent universe where sexuality doesn’t quite exist yet. The queerest thing about Pee-wee Herman is that he purposefully breaks with those notions by rethinking the boundaries of what “queerness” really is—an act of societal rebellion. If Pee-wee is a PG character in an R-rated universe, that’s no accident: The persona was developed while Paul Reubens was a member of the Groundlings, the L.A.-based improvisational comedy group that also gave Will Ferrell, Phil Hartman and Melissa McCarthy their starts.
Reubens’ “Pee-wee Herman Show” began as a weekly midnight show with a grown-up slant. When he appeared on “Late Night With David Letterman” in 1983, Pee-wee romped around the set like a kid in a candy store—seemingly unaware of the adult show he was on. “Camping with Pee-wee, that’s like a headline in the Post!” Letterman joked. Pee-wee had no idea what he was talking about. His own lack of awareness is central to how we understand Pee-wee Herman—and how we read his character. In his essay, Peitzman refers to homosexuality as the “subtext” of “Pee-wee’s Big Holiday,” but if we’re being honest, it’s the context. “Pee-wee’s Playhouse” might as well take place in a gay bar. Take the program’s holiday special, for instance: The episode boasted a queer cornucopia of guest stars, including community icons like Cher, Zsa Zsa Gabor, Dinah Shore and kd lang. Pee-wee goes ice-skating with Little Richard. Grace Jones even drops by to sing “The Little Drummer Boy.” If that’s not enough, a group of shirtless construction workers build a tower out of fruitcakes. If subtext is by definition furtive and secret, this episode’s queerness couldn’t be any more overt if it were singing Village People songs in gold booty shorts. The joke is that while the world around him is fabulously gay, Pee-wee hasn’t the slightest clue. The ongoing series of Pee-wee Herman films and television programs delight in placing the character in settings where his naive innocence is at odds with his surroundings, even the film he’s in: “Pee-wee’s Big Adventure” finds him in a coming-of-age tale, even if Herman—by nature—doesn’t grow. “Big Holiday,” however, reimagines his hero’s journey as a romantic quest—except that the character himself finds that concept gross, like eating his peas. This point is repeatedly reinforced throughout the series. In addition to shutting down Dottie, Pee-wee rebuffs the attentions of Emily in “Big Holiday,” a fellow resident of Fairville who hopes he will attend her book club. Pee-wee mentions that he “neglected to R.S.V.P. [his] regrets.” He continues, “I have rehearsal that night. N.A.—not available. ... L.A.T.T.I.H.T.B.G—look at the time, I have to be going.” In “Big Top Pee-wee,” he’s given a fiancée, Winnie, but the most he does is admire her hair. When the two snuggle on a picnic blanket, he repeatedly yanks her blond locks. Winnie later spills her egg salad sandwich all over him. This thwarted romance might seem to break with what we know about his character, but it merely reinforces the point: Even when he does begin to feel something for someone else—in whatever way someone with the capabilities of a 5-year-old can—he ends up with actual egg on his face. In “Big Top,” he meets Gina (Valeria Golino), a bosomy trapeze artist at a carnival. He glances luxuriously at her breasts, as the camera moves in for a generous close-up, and then he faints. It’s as if even processing the very idea of sexuality is too much for his brain to handle. In these situations, it’s clear that Pee-wee Herman represents a different sense of the word “queerness”—the strange or abhorrent societal outsider. As ScreenCrush’s Erin Whitney notes, such is the framing device for “Big Holiday” in its very first scene. “Pee-wee’s otherness has always been a part of his DNA … but is most blatantly implied in ‘Big Holiday’s’ opening dream sequence,” she writes. “In it, Pee-wee and his alien best friend, Yule, cry over having to say goodbye as Yule’s spaceship arrives to take him home.
Beyond echoing the usual absurdity of the Pee-wee Herman universe, this dream reveals how Pee-wee literally feels alien to the culture he lives in.” Pee-wee Herman is such an outsider that he’s often on the outside in his own life. A scene in “Big Holiday” underscores that idea well: Joe Manganiello, Pee-wee’s new best friend, asks him if he’s seen “Magic Mike.” Pee-wee responds, “Ha! You’d think so, but no.” While many have taken this as an admission of the character’s sexuality, the statement serves the opposite purpose: Given what we think we know about Pee-wee Herman, it seems as if he might be gay (aka the type of guy who would be familiar with “Magic Mike”). But if our hero were actually attracted to other men, Pee-wee Herman would be the last person to know about it. Many critics might write this off as “queerbaiting,” but it’s merely expanding our notions of what queerness really is—redefining the concept as being about more than sexuality. As the New York Times’ Jonah Weiner argues, Pee-wee Herman defied all societal norms in “[creating] a place where desires are not policed, otherness is not demonized, gender roles are juggled and erotic energies attach where they will.” Although Whitney writes that his “feminine boyish persona… [oscillated] between effeminate gay man and asexual man-child,” that was as much about masculinity as it was about queerness. He broke with every expectation of maleness—from his ethereally pale skin to his airy prance. There’s something that’s both deeply sad and immensely transgressive about Herman’s contradictions: He’s both an adult and a child. He’s a girly boy. He’s a dreamer who never left his hometown. He’s simultaneously incredibly gay and not gay at all. And in “Big Holiday,” the character does something arguably even greater than come out of the closet: He resolves these identity conflicts to find happiness on his own terms. At the end of the film, he and Manganiello exchange friendship bracelets in a ceremony clearly intended to symbolize their own form of commitment. What’s made Pee-wee Herman such a celebrated character over the years is that he belongs to everyone. Pee-wee stands in for the universal experience of being an outsider—whether you’re gay, straight or an overgrown infant who appears to be nothing and everything in between. That “f-word” he shares with Manganiello might be code for something else, but if you’ve been paying attention, it isn’t really: Pee-wee Herman doesn’t have to have actual sex with other men to be wonderfully, beautifully queer.

Published on March 27, 2016 14:30

SCOTUS’ shameful eugenics history: “We don’t provide the kind of justice that we claim to in our society”

Less than 100 years ago, the practice of sterilizing Americans seen as unfit was widespread. A new book, “Imbeciles,” looks at the issue through one heart-rending 1927 court case that collects so much that was sad, bad and misguided into one specific time and place. Subtitled “The Supreme Court, American Eugenics, and the Sterilization of Carrie Buck,” the book looks at the Buck v. Bell case, which concluded with the court allowing Virginia to sterilize a poor, luckless teenager. Author Adam Cohen tells the story close-up, drawing Buck, a doctor, a prominent eugenic scientist, a lawyer and Justice Oliver Wendell Holmes in detail. Holmes famously wrote in the court’s majority decision that “Three generations of imbeciles are enough.” In the background of the story – and sometimes its foreground – is the American fascination with eugenics, the belief in improving the human race through selective breeding. In its heyday, support for eugenics was mainstream, and its advocates included progressive figures like Teddy Roosevelt and Margaret Sanger. Cohen is a former assistant editorial page editor for the New York Times and the author of the well-regarded “Nothing to Fear: FDR's Inner Circle and the Hundred Days That Created Modern America.” The interview has been lightly edited for clarity.

Let’s start with the idea of sterilization. What was the justification for this in the early 20th century?

In the early part of the 20th century, eugenics was actually a very popular idea. There was a broad feeling in the country that the nation’s gene pool was threatened, and that it was necessary to take action to uplift the nation and humanity. To get better people to reproduce, and worse people not to. So sterilization was one of the tactics they came up with to improve the gene pool.

And what kind of people were in favor of it, and disseminated the idea?

This was an idea very popular among progressives in the country. Universities were very prominent in promoting sterilization, so, for example, president emeritus Eliot of Harvard University, who was a revered figure, wrote an article supporting eugenic sterilization. The medical profession was a big supporter of sterilization. So some of the most educated and progressive parts of the country were in favor of it.

How widespread was this idea that we could improve the race by limiting reproduction? It was not just a fringe-y idea in the ‘20s.

Not at all – it was widely popular. Indiana passed the first eugenic sterilization law, in 1907… And many other states followed. There was broad support. Eugenics as a topic was broadly popular; it was taught at over 300 colleges and universities including Harvard and Berkeley. It was talked about a lot in popular magazines. In my book I quote Cosmopolitan magazine saying what a great thing eugenics was and how it would make the race better. So this was widely popular and winning majoritarian support in places like state legislatures.

Tell us a little about Carrie Buck, the central figure in your book.

She’s a very sad figure – a victim of this movement. She was born into a poor family in Charlottesville, Virginia, and raised by a single mother. Back at that time there was a lot of thought that it was better for poor children to be raised in middle-class homes, so she was taken in by a foster family…. They didn’t treat her very well… And then one summer she was raped by the nephew of her foster mother.
So she was pregnant out of wedlock and her family was eager to get rid of her, both because of the stigma of out-of-wedlock pregnancy but also because their relative had committed a crime. So they got her declared epileptic and feeble-minded; she was neither… She’d been doing perfectly well in school. She’s then sent to the Colony for Epileptics and Feeble-Minded. She’d had a lot of bad luck in her short life, and it continued. She got there just after Virginia had passed its eugenic sterilization law, and they were looking for someone to put in the middle of a test case. The decision had been made in Virginia hospitals that they did not want to sterilize people until the law had been tested in the courts. The Colony president chose Carrie as the test case. She’s put in the middle of the story.

You write about a broad, sweeping topic and a set of moral concerns. Why did it make sense to tell it through a court case and five main characters?

There are so many ways to tell a story, and the eugenics story is such a sprawling one in the United States. One reason I fixed on this is that it shows just how strongly the nation adopted it and how wrong we were as a country. It might not be shocking to hear that in the 1920s there was a doctor in Virginia, at some benighted colony for epileptics and the feeble-minded, who was sterilizing people… What’s shocking is that when this was brought to the U.S. Supreme Court, the chief justice was a former president of the United States – William Howard Taft. The court included Louis Brandeis, one of the greatest progressive heroes ever to serve. And the decision was written by Oliver Wendell Holmes, the great justice. That really shows, at a much higher level, our societal breakdown. This was not a few wrong-headed people. At the point in our society where we expected justice to be meted out – the U.S. Supreme Court – it wasn’t even close. It was just a horrible decision. But the book is also a story about justice, and about how we don’t provide the kind of justice that we claim to in our society. Buck v. Bell is in part a eugenics story, but it’s also part of a larger Supreme Court story, where – as I talk about a little bit in the book – we like to think that justice prevails, that the strong people in society are not allowed to harm the weak. But over and over again, the court has failed to do that.

Your book is mostly concerned with a specific moment in time. But sterilization certainly didn’t go away. How long did it continue? It sounds like it continued almost right up to the present.

It’s shocking. Eugenic sterilization – done under the auspices of a eugenics office in a state – continued until Oregon did the last one in 1981. We know it’s still done, not in [accordance] with these laws but more [underground]. A few years ago it came out that there was a lot of sterilization going on at some California prisons. There was a case in which a Tennessee prosecutor was fired, a few years ago, for making sterilization part of his plea negotiation for female defendants. So there’s probably more of this still going on.

There was a moment where the movement lost steam – with the rise of the Nazis in the 1930s.

As the Nazis talked more and more about the Aryan race, the master race, and talked in eugenic terms, it actually made Americans pretty queasy about it. The major eugenics office in the United States, on Long Island, began to lose its funding in the 1930s and shut down in 1939, because of that.
But at the same time, that did not stop eugenic sterilization from occurring. In Virginia, where the Carrie Buck story unfolded, Nazism tainted the movement but it did not entirely obliterate it.

Published on March 27, 2016 13:30

What most Americans don’t know about yoga

Religion Dispatches

Last month, a yoga class at the University of Ottawa that was cancelled amid accusations of cultural appropriation made a quiet return to the class schedule. The controversy initially erupted in November when a yoga instructor received an email from the school’s student union saying that her services would no longer be needed. Citing problematic legacies of “cultural genocide” and “western supremacy,” the email stated that some students felt uncomfortable with how yoga was being practiced. Now the class is back with a new instructor—a South Asian instructor—who worries she was hired only because she’s Indian. The incident was the latest in a string of cultural flashpoints surrounding the centuries-old Indian practice, and has made many a yogi rethink the lines between cultural exchange and cultural appropriation. In the following conversation, journalist Michelle Goldberg and professor Andrea Jain discuss their latest books on yoga and question whether a practice hinged on the idea of reinvention can really have an “original” tradition.

Andrea Jain: Michelle, you and I recently wrote books about modern yoga. Your The Goddess Pose: The Audacious Life of Indra Devi evaluates modern yoga as contemporary spirituality through the lens of a particularly influential woman, Indra Devi. My Selling Yoga: From Counterculture to Pop Culture analyzes the popularization of yoga and its religious dimensions by selecting from a wide array of yoga figures and types, from Iyengar and his Iyengar Yoga to Bikram and his Bikram Yoga. We have very different methodological approaches and subjects, but what I most recognized in your book, the thread that most echoed a major theme of Selling Yoga, was that of “reinvention.” In fact, you seem to have used reinvention as the pole around which the book revolves. For example, your chapters are named for Indra Devi’s many titles, each of which, as you note, she built and discarded without regret when it no longer suited her. Your book reiterates, in short, that Indra Devi repeatedly reinvented herself. I think this makes her an especially useful figure through which to understand the history of modern yoga, which I suggest is perpetually reinvented and therefore lacks any static core or essence around which all other qualities revolve. What does modernity have to do with all this reinvention? In other words, does the tendency toward perpetual reinvention reflect something characteristically modern about modern yoga and Indra Devi?

Michelle Goldberg: Reinvention seems like a great place to start. It’s certainly a more useful frame for discussing modern yoga than “authenticity.” As you write—and I completely agree—“yoga has been perpetually context-sensitive, so there is no ‘legitimate,’ ‘authentic,’ ‘true,’ or ‘original’ tradition, only contextualized ideas and practices organized around the term yoga.” My book isn’t just about yoga: I was fascinated by Devi because of the sheer crazy scope of her life, which goes from the Russian revolution to the cabarets of Weimar Berlin to Indian independence, wartime Shanghai and beyond. But I also saw her life as a vehicle to answer my own questions about yoga, which I’ve practiced for years even as I’ve harbored skepticism about much of what I’ve heard in yoga classes. As I dove into the research for the book, I came to realize pretty quickly how big a gulf there is between the popular Western conception of yoga and the academic understanding of it.
You put it perfectly: “today’s popularized yoga systems are new, not continuations of some static premodern yoga tradition… Even postures and breathing exercises were marginal to the most widely cited sources on yoga prior to the twentieth century, and the forms of postures and breathing exercises that were present in those sources dramatically differ from those idiosyncratic forms found in postural yoga today.” One thing I love about your book is that you treat this observation as a starting point rather than an end. You still take the ritual, spiritual aspects of modern yoga seriously. When I talk about my book, occasionally people will come up to me and ask—sometimes anxiously—which parts of modern yoga practice are “real.” If there’s no scriptural record of warrior poses, what about headstand or lotus? How about the breathing exercises? I get it—a lot of people turn to yoga, as opposed to, say, Pilates, because they want to partake of a deep tradition, and finding out that that tradition isn’t very long can be disillusioning. But most people also understand that every religious and cultural practice undergoes a huge amount of adaptation and reinvention as it moves through space and time. I was raised in Reform Judaism. The religion of my childhood would be unintelligible to the animal-sacrificing desert Jews of two thousand years ago. That doesn’t mean it isn’t “real,” or meaningful.

So the question becomes: what is the ritual of a modern yoga class doing for people? Why is it so meaningful to them? The answer starts, I think, with the quasi-religious nature of physical fitness culture in general, with its cycle of sin, guilt and expiation. We have great admiration for people, particularly women, whose bodies testify to their asceticism; they seem somehow good and pure in addition to being beautiful. But yoga has something that CrossFit doesn’t—an aura of magic. It’s satisfying to believe that differing arrangements of limbs can cure one problem or another, and that they can ultimately, in some inchoate way, lead us closer to becoming who we’re meant to be, both physically and emotionally. I think yoga is a powerful ritual of reinvention for modern people, especially modern women, who constantly feel like they have to create and recreate themselves. In this sense, Indra Devi, a protean, self-made, stateless woman forever refusing to be bound by her own history, is much more the progenitor of modern yoga culture than, say, Swami Vivekananda.

Here’s a question for you, as a scholar of religion. We’re both interested in all the ways that postural yoga is a distinctly modern phenomenon. But is there any through line at all to older traditions? Obviously there’s a lot of silliness that goes under the name “tantra,” but you also suggest that there’s some sort of real continuity between Indian tantra and contemporary New Age spirituality. If there is, what is it? Indra Devi’s first book was called Forever Young, Forever Healthy. In my understanding, most Eastern religious traditions would teach us to let go of such an impossible, fixated hope. But is tantra different?

AJ: You have honed in on a subtle but important point from Selling Yoga. In several places throughout the book, I remind readers that nothing like modern postural yoga existed prior to the twentieth century.
Yet, I suggest there is continuity between premodern yoga traditions and modern postural yoga. Upon first consideration, those two claims might seem incompatible, but, although a modern postural yogi would not recognize herself if she were to transport herself back in time to witness a premodern yoga tradition from within, she would nonetheless share two connections with at least some of those premodern practitioners.

The first connection, as you point out, has to do with tantra, a category that is particularly useful for thinking about change and continuity in the history of yoga. Although modern postural yogis would generally not recognize themselves in premodern tantric yoga practitioners, they share with some tantric practitioners a non-dualist vision. I’m drawing from Hugh Urban’s work here. He has suggested that tantra and the New Age movement intersect where bodily, sexual, and material enjoyment are integrated into spiritual pursuits and that, simultaneously, this integration reflects an intersection between tantra and contemporary consumer culture. In other words, what the New Age movement shares with consumer culture can be found in tantra, the preeminent South Asian model of nondualism. In my analysis of the intersections of consumer culture and postural yoga, I found that postural yoga also shares a nondualist approach to the world and so reflects the dominant metaphysical mode of consumer culture. Of course, the type of nondualism assumed by premodern tantric yoga practitioners and that assumed by modern postural yogis are different. On the one hand, tantric yoga served to increasingly refine consciousness as a means to achieving divinized embodiment or, in other words, becoming a god while remaining in the body. Divinity here has nothing to do with beauty or health in any modern sense, but with direct awareness of the unity of everything, which allows consciousness to transcend or break through the usual false binaries within which we tend to live. Most notably, the practitioner awakens to the reality that there is no distinction between consciousness and the body, or between god or the cosmos and the individual. Given this awareness, anything is possible, even immortality, since there is no real distinction between life and death. On the other hand, postural yoga assumes the unity of the self with the body, so that the attainment of health and beauty, as defined in modern terms based on biomedical and contemporary cultural ideals, is central to self-development, envisioned as an individualized, not cosmic, process.

The second continuity between premodern tantric yoga and modern postural yoga is a mythical one. Modern postural yoga practitioners often believe themselves to be a part of a transmission that can be traced back to ancient traditions, especially those of hatha yoga. In the tenth to eleventh centuries, hatha yoga, or “yoga of forceful exertion,” emerged and involved the tantric manipulation and channeling of energy within the body. Breath control would serve to purify and balance energy in the body and, in combination with other techniques, including postures, would awaken divine feminine energy, or Shakti, who otherwise lies dormant, coiled up at the bottom of the spine. The techniques of hatha yoga draw her up through the body, and, as she rises, she awakens latent energy. Finally, she reaches the top of the head, and the practitioner achieves embodied enlightenment.
In popular yoga discourse, claims to a linear trajectory of transmission—premodern yoga functions as what Mark Singleton has described as “the touchstone of authenticity” for proponents of modern yoga—are frequently made and assumed to be historically accurate. For example, postural yoga giants B.K.S. Iyengar and K. Pattabhi Jois have claimed direct historical ties between their postural yoga methods and ancient yoga traditions. The claim to authority with regard to ancient yoga knowledge, though historically inaccurate, serves to ground a worldview and set of values and therefore fulfills a mythological function.

Since we’re on the topic of tantra, it seems to me that Indra Devi shares something with tantra that reaches beyond the nondualism and mythological functions that tie tantra to postural yoga generally. I would suggest it’s their subversive qualities. Tantra is known for flipping conventional cultural rules on their head, such as key distinctions between pure and impure foods (for example, vegetarian versus meat) or pure and impure occupations (for example, priesthood versus prostitution). Indra Devi was also subversive. For example, she was an unwed actress in a time and place when both designations were deemed unsavory. In your view, what were the most profound ways in which Indra Devi was subversive or transgressive, and how did they serve to empower her in her pursuit of India and eventually yoga?

MG: I think Devi was transgressive and conservative all at once. Clearly, she was a proto-feminist and a fiercely independent, adventurous woman. She forged her own identity at a time when that was much harder than it is today. She took off for India on a spiritual quest, alone, in 1927. When she was nearing 50, she moved, again on her own, to Hollywood, a place not particularly welcoming to middle-aged women without a lot of money, and reinvented herself there. When she was in her eighties, she decided to start a new life in Buenos Aires. She never had children and never let either of her husbands tie her down, for better or worse. She defies almost every expectation there is for how a woman’s life is supposed to unfold, and rather than coming to ruin, she dies just short of 103 surrounded by adoring friends and acolytes. In that sense, she’s a model for the efficacy of the system she taught.

There is a subtly feminist message in some of her books, particularly her first one, Forever Young, Forever Healthy, which came out in 1953. As anyone who has read The Feminine Mystique knows, the self-help books of the time often told women their suffering came from their refusal to accept their subordinate domestic role. Devi’s message was very different: “In our present age women are going through a critical period of transition,” she wrote. “Being awakened to a new freedom, they will probably have to suffer even more now than they did a generation ago when they lacked it.” Ultimately, she believed, the growing antagonism between the sexes “will continue until man grants women equality as a human being.” She also urged men to attend to their wives’ sexual pleasure: “an emotionally mature and loving husband is the best person to help his wife to overcome her frigidity, provided he himself hasn’t caused it by being a clumsy, crude, and uninspiring lover.” None of this had very much to do with what had, until then, gone under the rubric of yoga, but it was doubtlessly empowering, and it set a precedent for yoga as part of the self-care regimen of independent, cosmopolitan women.
Yet there was also a political quiescence built into her conception of yoga. She once wrote of students in Paris who didn’t understand why she wouldn’t condemn the Soviet Union: “They weren’t interested in talking with someone who simply accepted things as they were and decided to live her life free from historical conflicts.” The desire to float above history is certainly understandable, given the cataclysms she lived through. But as a political person, I can’t help but see the belief that one can escape history by changing one’s own consciousness as a fantasy. Devi’s conception of reincarnation as something that both explains the world’s injustices and promises to redress them in the next life feeds into this quiescence, since it makes justice in this life less urgent. In her schema, cultivating change in oneself was far more important than cultivating change in the world. I hear similar ideas in contemporary yoga classes all the time; they allow people to feel like they’re doing something altruistic when they’re performing their asanas. Much as I love asana practice, I don’t believe this. Do you? I’d like to hear your thoughts about yoga and politics. Modern yoga culture seems vaguely progressive, but a lot of the underlying ideas are about an escape from politics.

AJ: I think Devi’s emphasis on self-cultivation at the expense of attention to social justice is alive and well among many practitioners of postural yoga today. In the popular imagination, we tend to imagine yoga as a luxury activity primarily embraced by white suburbanites. Yoga has become a part of popular culture, and brand-name yoga commodities are easily accessible among privileged communities across the world. And for a large number of yoga advocates, the claim to possess knowledge of yoga is closely related to the quest for power, status, or money.

There is much more to today’s yoga industry, however, than self-cultivation or profit. And Devi’s conscious effort to disengage from history does not represent the approach of all contemporary postural yoga practitioners. Many modern asana practitioners have wed yoga to political or social justice agendas. At first glance, this seems unlikely, since postural yoga has become a part of a consumer culture in which practitioners choose yoga products and services based on individual desires and needs. Consumption of this kind appears rather hedonistic, or perhaps, as Jeremy Carrette and Richard King have put it strongly, is characterized by an “obsession with the individual self and a distinct lack of interest in compassion, the disciplining of desire, self-less service to others and questions of social justice.” The last people we imagine doing yoga are impoverished or otherwise disenfranchised people, such as inner-city at-risk youth or incarcerated people trapped in dilapidated jails and prisons. Yet our vision of what people doing postural yoga look like should change. In locations across the country, modern postural yoga has been introduced to disenfranchised communities as a healing and rehabilitative method and as an initiative to empower those who have been socially, politically and economically marginalized. Here are a few examples: In 2002, James Fox, a postural yoga teacher and the founder and director of the Prison Yoga Project, began teaching yoga to prisoners at San Quentin State Prison, a California prison for men. According to the Prison Yoga Project, most prisoners suffer from “original pain,” pain caused by chronic trauma experienced early in life.
The consequent suffering leads to violence and thus more suffering, in a vicious cycle that can last a lifetime. Yoga, according to the Prison Yoga Project, provides prisoners with a path toward healing and recovery.

I recently made a couple of visits to Chicago, a city that testifies to the fact that yoga is not just the pastime of the rich and privileged. In Chicago, I interviewed Marshawn Feltus and Carol Horton, two yoga instructors and social activists who have made substantial efforts to bring yoga to disenfranchised people by teaching in poverty-stricken neighborhoods and schools as well as the Cook County Jail. Their yoga advocacy aims to foster a critical consciousness regarding racism, poverty, and incarceration. Feltus learned yoga while incarcerated at Illinois River Correctional Center, where he spent nineteen years after being convicted of murder. He is the founder and owner of Awareness, Change, Triumph Yoga (ACT Yoga), a yoga studio in the black-majority inner urban neighborhood of Austin, Chicago, which Feltus describes as a “yoga desert.” He also offers yoga classes to men at the Cook County Jail. His classes consist of postures, breathing exercises, meditation, and talk therapy, which he suggests offer students the much-needed opportunity to purge otherwise painful memories, feelings, and thoughts. A teacher with Yoga for Recovery, a Chicago nonprofit offering yoga classes to women in the Cook County Jail, and co-founder of Chicago’s Socially Engaged Yoga Network (SEYN), Horton is an advocate for yoga as a tool for social justice. She offers workshops, teacher trainings, and public lectures on trauma-sensitive yoga. Beyond the Cook County Jail, Horton has taught yoga in a homeless shelter, a community health center, an inner urban public school program, and a residential foster care facility. She has suggested that yoga offers economically and socially disenfranchised people “a more robust conception of what life can be.”

These are not the only examples of yoga advocates expressing concern about the disparity in access to yoga and its near absence among disenfranchised communities. In the last several years, many non-profit organizations around the country and beyond have engaged in concerted efforts to make yoga more accessible to at-risk youth, veterans, prisoners, and those suffering from HIV/AIDS or substance abuse.

Reading your book, I was struck by the story of Devi giving a “prison class” in a Shanghai hotel where Americans were held captive by Japanese forces. Even if she didn’t set out to teach yoga as a part of a social justice project, she at least had this moment in which she used modern yoga’s meditative and postural components as tools for change in a socially heated and politically complex situation, if only among this small group of prisoners desperate for help. Much has changed in the yoga world since Devi taught those yoga classes in Shanghai. Most notably, yoga has undergone popularization and has become a hot, and oftentimes expensive, commodity among countless consumers around the world. Speaking as a practitioner of postural yoga, what do you think has remained a constant? In other words, do you recognize anything in your own yoga practice, or in pop culture yoga more generally, that echoes Devi’s story? If so, at what moments has she been most present? And in what ways does her story not resonate with what is most commonly found in a pop-culture yoga class today?
MG: I certainly don’t want to suggest that Devi didn’t believe in the altruistic power of yoga. She was passionate, for example, about teaching yoga to prisoners, which she did until nearly the end of her life. Nevertheless, I don’t see modern postural yoga as being fundamentally about social change. It can certainly be used by those working for the greater good, but it doesn’t have any inherent moral value. When I’ve been in class and have heard teachers say that, through our practice, we’re improving the world, I can’t help but roll my eyes. There seems to be this vaguely occult idea that through yoga we can generate positive energy that will then ripple out into our surroundings. I think it comes from the mystification of yoga, the way contemporary teachers try to imbue the poses with religious magic. True, if you take care of yourself, you’re more able to take care of others, but that’s not specific to yoga—you could say the same thing of running, or spinning. (Although I guess with SoulCycle, the vague spirituality that surrounds yoga has expanded into other realms of fitness.)

Where Devi is present to me is in the way that yoga has become so much an adjunct of urban, cosmopolitan female life. She sometimes presented yoga as a way for women to deal with the many competing demands that were placed on them, a way to help them sort out their own identities and, if necessary, reinvent themselves. It’s strange and improbable that these techniques associated with medieval Indian ascetics have become a crucial support system for elite 21st-century women. In addition to recounting Indra Devi’s wild life, my book tries to describe how that happened.

Published on March 27, 2016 12:30

Good riddance, gig economy: Uber, Ayn Rand and the awesome collapse of Silicon Valley’s dream of destroying your job

The New York Times’ Farhad Manjoo recently wrote an oddly lamenting piece about how “the Uber model, it turns out, doesn’t translate.” Manjoo describes how so many of the “Uber-of-X” companies that have sprung up as part of the so-called sharing economy have become just another way to deliver expensively priced conveniences to those with enough money to pay. Ironically, many of these Ayn Rand-inspired startups have been kept alive by subsidies of the venture capital kind, which, for various reasons, are starting to dry up. Without that kind of “VC welfare,” these companies are having to raise their prices, and they are finding it increasingly difficult to retain enough customers at the higher price point. Consequently, some of these startups are faltering; others are outright failing.

Witness the recent collapse of SpoonRocket, an on-demand pre-made meal delivery service. Like Uber wanting to replace your car, SpoonRocket wanted to get you out of your kitchen by trying to be cheaper and faster than cooking. Its chefs mass-produced its limited menu of meals, and cars equipped with warming cases delivered the goods, aiming for “sub-10 minute delivery of sub-$10 meals.” But it didn’t work out as planned. And once the VC welfare started backing away, SpoonRocket could not maintain its low price point. The same has been happening with other on-demand services such as the valet-parking app Luxe, which has degraded to the point where Manjoo notes that “prices are rising, service is declining, business models are shifting, and in some cases, companies are closing down.”

Yet the telltale signs of the many problems with this heavily subsidized startup business model have been prevalent for quite some time, for those who wanted to see. In July 2014, media darling TaskRabbit, which had been hailed as revolutionary for the way it allowed vulnerable workers to auction themselves to the lowest bidders for short-term gigs, underwent a major “pivot.” That’s Silicon Valley-speak for acknowledging that its business model wasn’t working. It was losing too much money, and so it had to shake things up. TaskRabbit revamped how its platform worked, particularly how jobs are priced. CEO Leah Busque defended the changes as necessary to help TaskRabbit keep up with “explosive demand growth,” but published reports said the company was responding to a decline in the number of completed tasks. Too many of the Rabbits, it turns out, were not happy bunnies – they were underpaid and did a poor job, despite company rhetoric to the contrary. An increasing number of them simply failed to show up for their tasks. As a result, customers also failed to return.

A contagion of pivots began among other sharing economy startups. Companies like Cherry (car washes), Prim (laundry), SnapGoods (gear rental), Rewinery (wine) and HomeJoy (home cleaning) all went bust, some of them quietly and others with more headlines. Historical experience shows that three out of four startups fail, and more than nine out of 10 never earn a return. My favorite example is SnapGoods, which is still cited today by many journalists who are pumping up the sharing economy (and haven’t done their homework) as a fitting example of a cool, hip company that allows people to rent out their spare equipment, like that drill you never use, or your backpack or spare bicycle—even though SnapGoods went out of business in August 2012. It just disappeared, poof, without a trace, yet it goes on living in the imagination of sharing economy boosters.
I conducted a Twitter interview with its former CEO, Ron J. Williams, as well as with whatever wizard currently lurks behind the faux curtain of the SnapGoods Twitter account, and the only comment they would make is that “we pivoted and communicated to our 50,000 users that we had bigger fish to try.” Getting even more vague, they insisted “we decided to build tech to strengthen social relationships and facilitate trust”—classic sharing-economy speak for producing vaporware instead of substance from a company that had vanished with barely a trace.

Zaarly, in its prime, was another sharing-economy darling of the venture capital set, with notable investors including Steve Jobs, hotshot VC firm Kleiner Perkins and former eBay CEO Meg Whitman on its board. It positioned itself in the marketplace as a competitor to TaskRabbit and similar services, with its brash founder and CEO, Bo Fishback, explaining his company’s mission to a conference audience: “If you’ve ever said, ‘I’d pay X amount for Y,’ then Zaarly is for you.” Fishback once spectacularly illustrated his brand by bringing on stage a cow towed by a man in a baseball cap carrying a jug of milk—“If I’m willing to pay $100 for someone to bring me a glass of fresh milk from an Omaha dairy cow right now, there might very well be a guy who would be super happy to do that,” he said. That kind of bravado is what gave these companies their electric juice, as media outlets like the Economist lionized them as the “on-demand” economy. Like so many of the sharing-economy evangelicals, Fishback brandished a libertarian Ayn Randianism which saw Zaarly as creating “the ultimate opt-in employment market, where there is no excuse for people who say, ‘I don’t know how to get a job, I don’t know how to get started.’” But alas, those were the heady, early years, when Zaarly was flush with VC cash. Flash forward to today, and Fishback is more humble, as is his company, having gone through several “pivots.” The “request anything” model is gone, as are Fishback’s lofty sermons to American workers. Instead, Zaarly has become more narrowly focused on four comparatively mundane markets: house cleaning, handyman services, lawn care and maid service.

And then there’s Exec. Like Zaarly and TaskRabbit, Exec also started with great fanfare as a broader errand-running business, this one focused on hiring out personal assistants for busy Masters of the Universe. Like other sharing startups, it initially had grand ambitions about the on-demand economy and fomenting a revolution in how we work: connecting those with more money than time with those 1099 indies who desperately needed the money. But eventually this company too was forced by its market failures to narrow its focus, in this case to housekeeping exclusively. Finally the company flamed out and was sold to another housekeeping startup, Handybook.

Exec’s former CEO, Justin Kan, wrote a self-reflective farewell blog post about what he thought went wrong with his company. His observations are illuminating. His company had charged customers $25 per hour (which later rose to $30) to hire one of its personal assistants, and the worker received 80 percent, or about $20 per hour. That seemed like a high wage to Kan, but much to his surprise he discovered that, when his errand runners made their own personal calculation, factoring in the unsteadiness of the work, the frequency of downtime, hustling from gig to gig, and the on-call nature of the work as well as their own expenses, it wasn’t such a great deal.
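The errand runners’ arithmetic is worth making concrete. Here is a minimal back-of-the-envelope sketch, in Python, of the kind of calculation a gig worker might run. The $25-per-hour rate and 80 percent worker share come from Kan’s account above; the utilization and expense figures are purely hypothetical assumptions for illustration, not numbers from his post.

```python
# Back-of-the-envelope effective wage for an on-call gig worker.
# hourly_rate and worker_share come from Kan's account of Exec;
# utilization and hourly_expenses are hypothetical illustrations.

def effective_wage(hourly_rate: float, worker_share: float,
                   utilization: float, hourly_expenses: float) -> float:
    """Pay per hour of availability, after downtime and expenses.

    utilization: fraction of on-call hours actually billed to customers.
    hourly_expenses: the worker's own costs (travel, phone, etc.) spread
    across every on-call hour, billable or not.
    """
    gross_per_oncall_hour = hourly_rate * worker_share * utilization
    return gross_per_oncall_hour - hourly_expenses

# On paper, 80 percent of $25/hour looks like $20/hour:
print(effective_wage(25.0, 0.80, utilization=1.0, hourly_expenses=0.0))   # 20.0

# With a hypothetical 60% utilization and $2/hour of expenses, the same
# gig pays only $10 per hour of availability:
print(effective_wage(25.0, 0.80, utilization=0.60, hourly_expenses=2.0))  # 10.0
```

Under these assumed figures, the advertised $20 hourly wage shrinks by half once downtime and costs are counted, which is exactly the gap Kan describes below.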
Wrote Kan, “It turns out that $20 per hour does not provide enough economic incentive to dictate when our errand runners had to be available, leading to large supply gaps at times of spiky demand . . . it was impossible to ensure that we had consistent availability.” Kan says the company also acquired a “false sense that the quality of service for our customers was better than it was” because the quality of the “average recruitable errand runner”—at the low pay and on-call demands that Exec wanted—did not result in hiring the self-motivated personality types like those that start Silicon Valley companies. (Surprise, surprise.) That in turn led to too many negative experiences for too many customers, especially since, as with TaskRabbit, a too-high percentage of its on-demand workers simply failed to show up to their gigs. (Surprise, surprise.) It turns out, he discovered, that “most competent people are not looking for part-time work.” (Surprise, surprise.)

Indeed, the reality that the sharing economy visionaries can’t seem to grasp is that not everyone is cut out to be a gig-preneur, or to “build out their own businesses,” as Leah Busque likes to say. Being an entrepreneur takes a uniquely wired brand of individual with a distinctive skill set, including being “psychotically optimistic,” as one business consultant put it. Simply being jobless is not a sufficient qualification. In addition, apparently nobody in Silicon Valley ever shared with Kan or Busque the old business secret that “you get what you pay for.” That’s a lesson that Uber’s Travis Kalanick seems determined to learn the hard way as well.

Kan, like Leah Busque, Bo Fishback and so many of the wide-eyed visionaries of Silicon Valley, had completely underestimated the human factor. To so many of these hyperactive venture entrepreneurs, workers are just another ore to be fed into their machine. They forget that the quality of the ore is crucial to their success, and that quality is dependent on how well the workers are treated and rewarded. The low pay and uncertain nature of the work keep the employees wondering if there isn’t a better deal somewhere else. Moreover, a degree of tunnel vision has prevented startup entrepreneurs from seeing that their business model often is not scalable or sustainable at the billion-dollar unicorn level without ongoing VC welfare subsidies.

Silicon Valley has an expression, “That works on Sand Hill Road”—referring to the upper-crust boulevard in Menlo Park, California, where much of the world’s venture capital makes its home. Some things that seem like great ideas—like paying low wages to personal assistants to shuffle around at your every whim, or lowballing wages for someone to hustle around parking cars for yuppies—only make sense inside the VC bubble that has lost all contact with the realities of everyday Americans. A pattern has emerged in the “white dwarf” fate of many of these once-luminous sharing startups: after launching with much fanfare and tens of millions in VC capital behind them, vowing to enact a revolution in how people work and how society organizes peer-to-peer economic transactions, in the end many of these companies morphed into the equivalent of old-fashioned temp agencies (and others have simply imploded into black hole nothingness). Market forces have resulted in a convergence of companies on the few services that had been the most used on their platforms.
In a real sense, even the startup king itself, Uber, is merely a temp agency, where workers do only one task: drive cars. Rebecca Smith, deputy director of the National Employment Law Project, compares the businesses of the gig economy to old-fashioned labor brokers. Companies like Instacart, Postmates and Uber, she says, talk as if they are different from old-style employers simply because they operate online. "But in fact,” she says, “they are operating just like farm labor contractors, garment jobbers and day labor centers of old."

Tech enthusiasts like the Times’ Manjoo seem to be waking up and smelling the coffee. “The uneven service and increased prices,” writes Manjoo, “raise larger questions about on-demand apps,” which he says “now often feel like just another luxury for people who have more money than time.” Yet that strikes me as too black-and-white, as overly gloomy as Manjoo once was excessively optimistic. The sharing economy apps have proven to be extremely effective at connecting someone who needs work with someone willing to pay for that work. Some workers have praised the flexibility of the platforms, which give labor market outsiders who have difficulty finding work (young people, immigrants, minorities and seniors especially) access to additional options. It’s better than sitting at home as a couch potato with no income. And by narrowing the scope of their services, these companies stand a better chance of contracting with quality people, and of developing real relationships with them. I suspect that, properly pivoted in the right direction, these app-based services will continue to play a role in the economy. Eventually many traditional economy companies may adopt an app-based labor market in ways that we can’t yet anticipate. But that means we need to figure out a way to launch a universal, portable safety net for all U.S. workers (hint: we can do it at the local and state levels; we don’t need to wait for a dysfunctional Congress).

At the end of the day, the sharing economy startups have been hamstrung by the quality of the workers they hire. If they want good workers, they need to offer decent jobs. Otherwise, this sharing economy is not about sharing at all, and not very revolutionary. The current startup model destroys the social connection between businesses and those they employ, and these companies have failed to thrive because they provide crummy jobs that most people only want to do as a very last resort. These platforms show their workforce no allegiance or loyalty, and they engender none in return.

Published on March 27, 2016 11:00