Helen H. Moore's Blog, page 963

November 1, 2015

When Texas fell to the wingnuts: The secret history of the Southern strategy, modern conservatism and the Lone Star State

From the vantage point of most Dallas Republicans in early 1963, Barry Goldwater represented the brightest hope for national conservative Republicanism since the death of Robert Taft in 1953. Annoyance with the New Deal, particularly the National Industrial Recovery Act’s wage and price controls, which interfered with the management of his family’s department store, led to Goldwater’s first foray into politics as a member of the Phoenix city council. A successful candidate for the United States Senate in 1952, Goldwater assailed President Truman’s New Deal. Campaigning for reelection in 1958, he attacked “labor bosses” and unions with even more ferocity than in 1952. The Arizona senator’s views echoed those of many North Texas businessmen. Enclosing a thousand-dollar check, Fort Worth oilman W. A. Moncrief wrote to Goldwater that Walter Reuther, the president of the United Auto Workers, was “the most powerful and dangerous man in America today.” Seven months later, Goldwater made a similar point. Reuther, he said, was “a more dangerous menace than the sputniks or anything that the Russians might do.” Goldwater’s 1960 book, Conscience of a Conservative, ghostwritten by Brent Bozell, had a powerful impact on many Dallas Republicans, including Catherine Colgan. “Many of us were very impressed with Barry Goldwater,” she recalled. His “line of thinking” and “personal values” “made a lot of sense to us.” Goldwater’s message and worldview also inspired numerous Democrats, many of whom attended “resignation rallies,” where they renounced their old party affiliation and declared their allegiance to the Republican Party. In Texas the “Goldwater phenomenon” originated from Dallas County, where GOP leaders like Peter O’Donnell, John Tower, and Harry and Rita Bass galvanized the drive to elect the Arizona Republican. In 1960 and 1961, Goldwater had stumped in Texas for Tower, whose own book, a Program for Conservatives, while somewhat academic, contained many of the same themes and positions as Goldwater’s book. In July 1963 Harry Bass said, “We’re working now toward the goal of replacing one-term Governor Connally and one-term President Kennedy with well-qualified fiscally responsible men who will be, of course, Republicans.” Bass epitomized the optimism that many Dallas conservatives felt: “With Goldwater heading the ticket, we can expect to elect a Republican senator to Ralph Yarborough’s seat, at least three more congressmen, and thirty-five or forty more representatives in Texas.” Goldwater’s most fervent champion, however, was O’Donnell, who injected his infectious enthusiasm and trademark organizational mastery into the movement. Moreover, he contributed significantly to Goldwater’s use of the Southern Strategy and his consequent victory in five Deep South states in 1964. As the newly elected Republican state chairman in 1962, O’Donnell publicly encouraged Goldwater to run for president and secured the Texas Republican state committee’s passage of a measure praising Conscience of a Conservative as “an affirmative philosophy and program.” As early as the fall of 1962, Texas was firmly ensconced in the Goldwater camp. So sure was O’Donnell that Texas, as one campaign sign read, was “wild about Barry,” that he left Dallas in February 1963, accepted the position of chairman of the Draft Goldwater Committee, and organized its headquarters on Connecticut Avenue in Washington, DC. O’Donnell was “the natural choice” and “the ideal man,” according to F. 
Clifton White, the former chairman of the Young Republicans. “I couldn’t see how Barry Goldwater—or any other leading Republican in his right mind—could possibly thumb his nose at Peter O’Donnell.” Rita Bass accepted O’Donnell’s invitation to join him in Washington and became the national canvassing director. With White serving as national director of the committee, O’Donnell and John Grenier began the task of rounding up potential Southern delegates for Goldwater. By the time the committee held its first press conference, O’Donnell and White had already lined up Draft Goldwater chairmen in thirty-three states and had nationwide Republican support from precinct chairmen to national committeewomen. Part of what made Goldwater so appealing to O’Donnell was his early affinity for the Southern Strategy. In the early 1960s, the national Republican Party stood at a crossroads on racial issues. George Hinman, Governor Nelson Rockefeller’s advisor, called race the “Great Republican issue,” one that divided the party. He noted that segregationist discourse was on the rise among party leaders, and “Barry has been falling increasingly for it.” Hinman outlined the reasoning of these new converts: “Their theory is that by becoming more reactionary than even the Southern Democratic Party, the Republican Party can attract Southern conservatives who have been Democrats, and by consolidating them with the conservative strength in the Middle West and Far West, the Republicans can offset the liberalism of the Northeast and finally prevail.” Rockefeller himself bemoaned as “completely incredible” the Southern strategists’ plan of “writing off” the black vote. Hinman was correct in his perception: Following Nixon’s defeat in 1960, Goldwater told Atlanta Republicans that the GOP, despite receiving 36 percent of the African-American vote that year, was “not going to get the Negro vote . . . so we ought to go hunting where the ducks are.” With that, Goldwater headed South in search of some ducks. Goldwater empathized with the South because his own philosophy drew on the argument that the Constitution protected property rights and restricted democracy in order to preserve privilege. From John C. Calhoun and other slave-owning politicians of the antebellum Old South to their conservative disciples of the New South, the region emphasized the role of the Constitution in curbing federal power. Goldwater subscribed to Calhoun’s understanding of the Constitution as a restrictive document that protected property rights and sanctified the power of the states over the federal government. “Our right of property,” Goldwater said, “is probably our most sacred right.” Goldwater’s narrow definition of liberty as it applied mainly to property owners allowed him to embrace “freedom” even as he ignored the plight of African-Americans at midcentury and railed against the 1964 Civil Rights Act for ineluctably giving rise to “a federal police force of mammoth proportions.” It was therefore no accident that, in historian David Farber’s words, “Goldwater played in the South.” His vision of liberty distinctly paralleled that of slave owners who had regarded slaves as their property. Skeptical that every individual would be able to comport himself responsibly, Goldwater consistently supported rich over poor, employer over employee, and white over black. His vision of a restrictive Constitution caused him to attack the existing Supreme Court and champion a return to the antilabor, pro-business, segregationist courts of the Gilded Age.
He revered freedom yet attacked the Brown v. Board of Education decision, arguing that it was “not based on law” because it represented a direct violation of Southern traditions of white entitlement and black exclusion. Defending his conception of the protections the Constitution offers against this kind of court interference, Goldwater said, “I am firmly convinced—not only that integrated schools are not required— but that the Constitution does not permit any interference whatsoever by the federal government in the field of education.” From assailing stronger labor laws to rejecting federal aid for education to battling colonial independence movements, he vehemently took on any reform that promoted egalitarian causes or what he perceived as the redistribution of wealth and property from privileged whites to the underprivileged and nonwhites. During the first press conference for the Draft Goldwater Committee, O’Donnell addressed the media and declared that the national Republican Party ought to pursue an intentional Southern Strategy. Because Goldwater was the only candidate who could successfully execute such a strategy, the Arizona senator ought to be the party’s nominee. “The key to Republican success,” O’Donnell argued, “lies in converting a weakness into a strength and becoming a truly national party.” The phrase “converting a weakness into a strength” meant securing the once solidly Democratic South for a Republican candidate. In his book about Goldwater’s campaign for the presidency, Suite 3505, F. Clifton White cleared up any doubt over what O’Donnell meant by including after that crucial phrase this parenthetical remark: “(the paucity of Republican votes in the South).” At this revealing moment in political history, O’Donnell had based his argument on a striking admission. The Southern Strategy was an intentional maneuver on the part of the party to win elections, and Goldwater, with his ability to appeal to racist sentiments in the South, was seemingly the only candidate who could deliver enough Southern votes to ensure a Republican victory. Republican gains in the South in the 1962 midterm elections— achieved largely through Republican opposition to the Department of Urban Affairs—only bolstered O’Donnell’s conviction that Goldwater could win the presidency with appeals to race. William Rusher, publisher for the National Review and a former assistant counsel under Robert Morris, agreed that Republicans could beat Kennedy by selecting a candidate opposed to civil rights. Rusher argued against the immorality of racial politics by observing that Southern Democrats had been making appeals to segregationists for decades. While civil rights activists faced off against intractable segregationists, party-builders like O’Donnell were planning a racial strategy for Goldwater and making Republican institutions throughout the South lily-white. The Republican National Committee had dropped all pretense of appealing to minorities when it disbanded its division for minority outreach and established Operation Dixie, which recruited white Southerners to the party. To be sure, Goldwater’s allure to middle-class Southerners in the burgeoning Sunbelt drew on class appeals as well as race. Rather than appealing to the Ku Klux Klan, Goldwater and the GOP tailored their message for moderate Sunbelt suburbanites, who supported “right to work” labor laws, militantly opposed Communism, and assailed welfare policies. 
Attesting to the excitement that greeted Goldwater’s potential candidacy, nine thousand Americans from forty-four states converged on Washington, DC, on July 4, 1963, and filled the Washington Armory for a rally encouraging the senator to jump into the race. It was Peter O’Donnell’s job as the primary organizer and master of ceremonies to pump up the capacity crowd: “We are embarking on a great crusade . . . to put Goldwater in and Kennedy out!” One Washingtonian later declared, “This town’s never seen anything like it.” Conservative strategists’ increasing optimism and commitment to the Southern Strategy were buoyed by the American public’s growing disenchantment with President Kennedy’s unequivocal defense of civil rights. In June 1963, President Kennedy had addressed the nation on civil rights, called it a “moral issue,” and introduced a substantial civil rights bill to Congress. That summer, his approval rating dove from 70 percent to 55 percent. “Our people are tingling with excitement. I have been receiving long distance calls from all over the nation,” O’Donnell declared. “The South will take the lead in making Kennedy a 1-term president.” “A year ago it was said that Kennedy was unbeatable. But people are not thinking that way now.” With glee, O’Donnell predicted, “if Goldwater can carry the same states that Nixon carried in 1960, and then carry the balance of the Southern States, he will have 320 electoral votes—more than enough to win.” Goldwater himself was less than cooperative. He expressed little enthusiasm for running against Kennedy and throughout 1963 declined to commit himself to the presidential race. Although he never attempted to defuse the grassroots operation by flatly refusing to run, Goldwater remained unaffiliated with the committee that often met furtively in Suite 3505 in New York’s Chanin Building. It was “their time and money,” he said, although he was reportedly “furious” over the efforts of White and O’Donnell to seek out press coverage. “If Goldwater doesn’t want to make up his mind,” O’Donnell said, “we will draft him. And because he might say ‘No,’ we’ll tell him what we’re going to do. Won’t ask his permission to do it!” O’Donnell was well aware that time was of the essence, that the candidate would need to build the campaign’s financial and institutional infrastructure to run competitively against a Kennedy machine that had strong union support. O’Donnell grew increasingly impatient and frustrated with the presumptive candidate’s aloofness, but Goldwater refused to sanction fundraising on his behalf and stonewalled even John Tower, who served as O’Donnell’s primary intermediary with the Arizona senator. “We’re like a wet noodle,” O’Donnell complained. “This thing will surprise people if it ever gets started, but right now it isn’t started.” O’Donnell grew weary of working through Goldwater’s aides, who lionized their boss and, like the senator, showed no sense of urgency about announcing for president. After visiting New Hampshire in December 1963, O’Donnell lamented to a Goldwater staffer that “there are serious weaknesses in organization, finance, public relations and advertising, and in my opinion, we stand a great chance of being clobbered.” In addition to these organizational problems, O’Donnell saw Goldwater’s extemporaneous speaking style as an issue that might imperil a presidential run. 
O’Donnell advised Goldwater in a memo to prepare his remarks and avoid “shooting from the hip.” This unsolicited advice only further alienated O’Donnell from the senator’s inner circle. When Goldwater formally announced his intention to run in January 1964, O’Donnell and White were passed over for all senior positions on the Goldwater for President Committee staff. Rejecting John Grenier’s recommendation that O’Donnell be made director of political operations, the campaign offered the job to Lee Edwards, the editor of Young Americans for Freedom’s magazine, New Guard. Goldwater did, however, refer to O’Donnell as the “efficiency expert” during public remarks in June 1964 and thanked him for his efforts, saying, “I wouldn’t be standing here tonight as a possible nominee of our party for president if it weren’t for you.” Although the Goldwater campaign excluded O’Donnell, it nevertheless followed the strategy that had become his trademark and targeted the South with carefully coded appeals to white supremacy. Like some other conservatives, Goldwater exploited white anxieties in the face of the social change and upheaval fomented by the civil rights movement, which many perceived as a “Second Reconstruction.” In theory, Goldwater lauded liberty, but in reality, he allied himself with agents of racial separation. Martin Luther King accused Goldwater of “[giving] comfort to the most vicious racists and most extreme rightists in America.” Brooklyn Dodger great Jackie Robinson, himself a Republican, remarked that any black person voting for Barry Goldwater “would have a difficult time living among Negroes.” William P. Young of Pennsylvania, a black delegate to the Republican Convention in San Francisco’s Cow Palace, charged that Goldwater’s platform was “attempting to make the party of Lincoln a machine for dispensing discord and racial conflict.” There were three components to Goldwater’s version of the Southern Strategy. First, he demonized the Civil Rights Act, which became law in the summer of 1964. Abandoning an earlier attack that claimed the law was unconstitutional, Goldwater now insisted it “dangerously [tread] in the private affairs of men.” His opposition earned swift congratulations from O’Donnell, who called the law “vicious,” argued that it would create “a federal police state,” and declared that “President Johnson has turned his back on Texas to court the liberal extremists and Negro bloc in the North and East.” The second component, the film Choice, was a much more explicit type of appeal. Although Goldwater prohibited screenings of the film, which he himself called “racist,” its production demonstrated that winning the South remained the campaign’s chief preoccupation. The third component was an effort to conflate civil rights and civil disorder. Goldwater’s subtle argument was that “crime in the streets” resulted from disrespect for authority and waning morals, which in turn derived from liberalism’s welfare state. This disregard for authority and social mores crystallized in the civil rights movement’s strategy of civil disobedience. By invoking the phrase “law and order,” then, Goldwater launched a coded attack on civil rights, playing to the fears of many whites and implicitly promising strong retaliatory measures against those who appeared to threaten white people (particularly women).

The Politics of Law and Order

The politics of law and order had been brewing since at least the summer of 1963.
In a memo that June, one Goldwater advisor wrote, “The hostility to the new Negro militancy has seemingly spread like wildfire from the South to the entire country.” The president had failed to grasp “the political implications of such a change.” “So long as the “tide of rebellion” continued and Goldwater invoked states’ rights and argued that “private property must remain inviolate,” he had a “serious chance” to beat John Kennedy. The memo suggested that any given category of crime be treated “as a prong of a single fork—a fork labeled ‘moral crisis.’” Goldwater, the memo argued, must jab the fork “relentlessly from now until election day.” Goldwater rolled out the discourse of “law and order” in March 1964 in New Hampshire, where he faced a closely contested primary against Nelson Rockefeller and Henry Cabot Lodge. The president, Goldwater declared, ought to “turn on the lights of moral leadership” and the “lights of moral order.” His “light-switch” reference identified morality with lightness, whiteness, and civic order (and, by extension, depravity with darkness and the civil rights struggle), connections he made even more explicit that June in Dallas. There, Goldwater specifically identified as criminal behavior the nonviolent resistance campaigns of the civil rights activists. Before a crowd of eleven thousand at the Dallas Memorial Auditorium, Goldwater declared his allegiance to “the principles that look upon violence in the streets, anywhere in this land, regardless of who does it, as the wrong way to resolve great moral questions—the way that will destroy the liberties of all the people.” Goldwater employed the language of law and order to appeal to fears of crime and black militancy while simultaneously blaming social ills on liberalism. He often spoke in terms calculated to evoke fears of black-on-white crime and sexuality, as in the statement “Our women are no longer safe in their homes.” In describing Washington, DC, with its high crime rate, as “a place of shame and dishonor,” he called into play public awareness of the city’s sizable African-American population. Goldwater placed the blame for threats to order squarely with the civil rights movement and the Great Society, President Johnson’s set of social and economic reforms. Civil rights, he averred, engendered permissiveness and moral laxity. American liberalism, reaching its crescendo with the Great Society, had banished God from schools and rewarded indolence with social programs. “Government seeks to be parent, teacher, doctor, and even minister,” Goldwater lamented. “Rising crime rates” evidenced the “failure” of the liberal strategy of social change. This strident, racist rhetoric, which originated with the Right, influenced moderates as well. Even temperate public figures like Dwight D. Eisenhower adopted the rhetoric of “law and order,” as in this remark at the 1964 Republican Convention in San Francisco: “Let us not be guilty of maudlin sympathy for the criminal . . . 
roaming the streets with switchblade knife.” Roy Wilkins, president of the National Association for the Advancement of Colored People, underlined the racist implications of Eisenhower’s remark in a strongly worded rebuke: “The phrase ‘switchblade knife’ means ‘Negro’ to the average white American.” In his own 1964 convention speech, Goldwater tied together Democratic corruption scandals and “violence in our streets” with his plea that law and order “not become the license of the mob and jungle.” John Tower also courted segregationist voters by appealing to law and order. At the convention, he observed, “We’ve come to the point when people can be mauled and beaten and even killed on the streets of a great city with hundreds of people looking on, and doing nothing about it.” Placing the blame for this lawlessness with liberal policies, he continued, “We have come to the point where, in many cases, the lawbreakers are treated with loving care . . . while those who uphold and champion the rule of law and order are looked upon in some quarters as suspect.”

Rout

In 1964, the country as a whole was not ready for the brand of conservatism that Barry Goldwater embodied and Dallas County voters embraced. Conservative Republicans would have to wait for “future Novembers,” as William F. Buckley Jr. put it. But the South was ready. Five of the six states Goldwater won were in the Deep South: Alabama, Georgia, Louisiana, Mississippi, and South Carolina. He took Mississippi with 87 percent of the vote. Whereas Eisenhower had won 40 percent of the nationwide black vote in 1956 and Richard Nixon had garnered 36 percent in 1960, Goldwater took a meager 6 percent in 1964. As historian Michael Flamm concluded, neither perceptions of black violence and crime nor reactions to the rapidity of desegregation in the cities of the Northeast and Midwest were yet strong enough to produce the white voter backlash Goldwater would have needed to win in 1964. The politics of law and order failed to carry the day in 1964 because the discourse was premature. The assassination of President Kennedy in Dallas in November 1963 had revolutionized the political landscape, and both Goldwater and Congressman Bruce Alger were defeated. The assassination cast a long shadow over both campaigns and over Dallas’s identity, reinforcing the city’s reputation as a haven for extremism. An incident in which Adlai Stevenson, the US ambassador to the United Nations, was physically abused by an angry mob of Dallasites on October 26, 1963, recalled the day in 1960 when Lyndon Johnson and his wife were accosted at the Adolphus Hotel. A study by Peter O’Donnell conducted a month before the assassination concluded that “neither Republicans nor Democrats identify Goldwater as part of the radical right.” That was not the case soon after. By demonstrating that extremism was a problem in the body politic, the assassination, although perpetrated by a Marxist, made the identification of Goldwater as a trigger-happy warmonger much more convincing to the public. Some rank-and-file Republicans grew despondent immediately after the tragedy. As Dallas Republican activist Sally McKenzie said, “We all worked our souls out” for Goldwater. “Every bit of that went down the tubes the day that Jack Kennedy was killed in Dallas. I had just finished a door-to-door canvass in my precinct. I went in that night, not that I was being disrespectful of a deceased president, and just tore up the records.
It was futile after that.” Goldwater’s propensity for “shooting from the hip” provided further fodder for those characterizing him as “trigger-happy.” If his promise to grant jurisdiction over tactical nuclear weapons to American commanders in the field did not scare away voters, his assertion that such weapons could be used to defoliate the jungles of Vietnam did. In the final weeks of the campaign Goldwater attempted to remove what O’Donnell called the “atomic thorn in his heel” with more appeals to law and order. But the “trigger-happy bit,” one Dallas conservative Republican noted, hurt Goldwater among American voters. “We had a public relations image hung on us like a dead cat.” With Kennedy’s assassination, Lyndon Johnson, a master politician, ascended to the presidency, armed with both a singular understanding of Congress and a mandate to secure his fallen predecessor’s legislative program. While Johnson’s legislative record had been the most liberal in the nation’s history, many Dallasites, Texans, and Americans in the fall of 1964 still regarded the tall Texan as more moderate than his slain predecessor. In a shrewd gesture calculated to garner broad bipartisan support, reinforce his image of steady moderation, and avoid backlash, Johnson identified the 1964 Civil Rights Act as more of a legislative priority for the slain president than for himself. Goldwater, now facing a popular president from Texas instead of an incumbent from Massachusetts, was never fully able to execute the Southern Strategy in 1964. Moreover, the assassination had dampened his enthusiasm for the campaign. Goldwater liked Kennedy personally and had relished the opportunity to run against him. The decision to exclude F. Clifton White and Peter O’Donnell from the campaign also proved unwise. Denison Kitchel, Dean Burch, and Richard Kleindienst—the “Arizona Mafia”— lacked their predecessors’ experience, discretion, and organizational wizardry. In the final analysis, Goldwater’s running mate, William Miller, probably summed up the election results best: “The American people were just not in the mood to assassinate two Presidents in one year.” Kennedy’s assassination contributed to the debacle of the Dallas Republican Party in 1964. All eight Dallas Republicans in the Texas legislature were ousted, and Bruce Alger lost his bid for reelection to Congress to Democrat Earle Cabell. To be sure, Cabell was a well-financed candidate, a popular mayor who rode the coattails of a president from Texas. Moreover, Cabell made a strong case against Alger’s effectiveness as a congressman. Ultimately, the revival of Alger’s ultraconservatism—what many regarded as extremism, especially with Goldwater on the ballot—combined with his Dallas constituents’ concern for the city’s shattered image, were the most important factors in his defeat. Alger had modulated his ultraconservative image following the Adolphus incident, but his flirtation with distancing himself from far-right organizations and ideas did not last long. Alger’s speeches in 1962 and 1963 contained secular apocalyptic overtones. In his self-proclaimed “one-man campaign against John F. 
Kennedy,” he attacked the administration’s distribution of federal money to the nation’s cities, calling it a “sure step toward the end of free elections.” Aping Senator Joseph McCarthy’s 1950 speech in Wheeling, West Virginia, Alger addressed the Petroleum Engineers’ Club of Dallas and declared that he held in his hand fifty-five indictments charging the Kennedy administration with coddling Communists. On other occasions, Alger averred that the president was moving the country “closer to dictatorship” and that “the nation cannot survive another four years of the New Frontier policies.” This renewed, more militant ultraconservatism, manifested just as Dallas’s image throughout the world was tarnished by the murder of a president, contributed to Alger’s loss in 1964. Murdered by Jack Ruby, Lee Harvey Oswald never had his day in court, but Dallas, as A. C. Greene observed, promptly went “on trial.” In the days, weeks, and months after the assassination, newspaper and magazine editors descended upon Dallas to dissect its identity and often drew hasty and simplistic conclusions. The outside appraisals, on the whole, concluded that the city of Dallas was culturally bereft, politically autocratic, and socially bankrupt. Although President Kennedy’s killer was a Marxist who had lived in Dallas for only two months, many columnists concluded that the city and its right wing had created an environment that contributed to the assassination. An article in Fortune referred to Dallas as the “hate capital of the nation,” “a place so steeped in violence and political extremism that school children would cheer the president’s death.” One newspaper observed, “The hatred preachers got their man. They did not shoot him. They inspired the man who shot him.” Another noted that “Mr. Kennedy had prepared a speech which . . . reminded the people of Dallas that . . . America’s leadership must be guided by the lights of learning and reason. . . . Dallas’s answer, even before that speech was delivered, was to shoot John F. Kennedy.” Along with resurrecting the Adolphus and Stevenson incidents, some journalists concluded that the centralized structure of Dallas’s Citizens Council inhibited discussion, discouraged dissent, and restricted the intellectual and cultural activity essential to a thriving metropolis. Although many city leaders argued that the assassination “could have happened anywhere,” Congressman Alger was the most doctrinaire and hostile in attacking the news media for suggesting that Dallas itself was to blame. With the city’s image under siege, the business community divided its support between Cabell and Alger. Alger garnered support from the oilman Jake Hamon, Dresser Industries’ H. N. Mallon, Sun Oil’s Tom Hill, and Lone Star Steel’s E. B. Germany, while Cabell had the solid backing of the downtown Dallas establishment, including Robert L. Thornton (who founded the Citizens Council), the retailer Stanley Marcus, and John M. Stemmons. In the final analysis, enough business leaders came to the following conclusion: since the federal government already meddled in the life of the city—from civil rights to defense appropriations— Dallas might as well benefit and secure federal money to connect the Trinity River to the Gulf of Mexico, construct a downtown Federal Center, and undertake other projects that would move the city forward. 
Given concerns over the effect of the city’s image on future growth, it made little sense for Dallas leaders to stick with an intractable libertarian ideologue who had come to personify an extremism that frightened the country and appeared to bring out the worst in people. The assassination also revivified the Democratic Party in Dallas County. Within three months of the tragedy, the North Dallas Democrats, the first local organization working for Democrats on all rungs of the party hierarchy since 1948, was formed. Bill Clark, chairman of the Dallas County Democrats, adopted many of the organizational strategies that Peter O’Donnell had pioneered. Enthusiastic Democratic volunteers went door-to-door and called from numerous phone banks urging Dallasites to vote Democratic, “from the White House to the Court House.” Another important factor in Alger’s defeat was African-American turnout, which reached 85 percent in some precincts. With Lyndon Johnson committed to the cause of civil rights more vigorously than any predecessor (or successor), thirty-two thousand Dallas black voters chose a straight Democratic ticket; the Democratic margin of victory in some black precincts was 119 to 1. Indeed, between 1952 and 1964 the flight of African-Americans from the Republican Party amounted to a seismic shift, and the Dallas Republican Party illustrated that trajectory in microcosm. In 1952, 44 percent of African-American voters nationwide supported Dwight Eisenhower for president, and two years later 67 percent of African-American voters in Dallas County supported Bruce Alger for congressman. But in 1964, only 6 percent of African-American voters nationwide supported Barry Goldwater for president, and locally only 2.4 percent supported Alger. The Dallas Republican Party’s loss of the African-American vote was no fleeting anomaly. Jim Collins, the son of Carr P. Collins and an unsuccessful 1966 GOP congressional candidate from Dallas, performed about as well as Alger had in African-American precincts two years earlier. Despite national Republican chairman Ray Bliss’s optimistic appraisal that Collins’s support among blacks was “sensational,” that it had “exceeded his fondest expectations,” and that it showed that blacks were returning to the Republican Party, the actual results in Dallas were nothing for Republicans to celebrate, rising an infinitesimal 0.8 percentage points to 3.2 percent. Speaking to a Dallas audience in 1968, Richard G. Hatcher, the newly elected black mayor of Gary, Indiana, said that “the Republican Party has in effect turned its back on the black people of this country.” The GOP simply did not want black votes, he concluded. The Reverend Ralph Abernathy echoed Hatcher, adding that the 1968 Republican platform and the ticket of Richard Nixon and Spiro T. Agnew “are not an inspiration to black voters.” Despite Alger’s and Goldwater’s thumping at the polls, they left an important legacy: they had made the case that there was a place for segregationists and states’ rights advocates in the Republican Party. Foreshadowing a bright future for the conservative movement, over a million men and women contributed money to Goldwater’s campaign in 1964, whereas Richard Nixon had received contributions from only forty-four thousand in 1960. After signing the 1964 Civil Rights bill into law, President Lyndon Johnson told an aide, “I think we just gave the South to the Republicans for your lifetime and mine.” Yet Johnson’s prognostication was only partially correct.
Johnson had given the Deep South another reason to vote against the Democratic Party, but Goldwater gave the region a candidate who was on their side. One Republican from South Carolina expressed the view of many in the region when he observed that although Barry Goldwater was a Westerner, he “could pass for a great Southerner any time, any place.” But along with the discovery of a candidate, it took the precedent of Dallas-based, segregationist ultraconservatives like Bruce Alger, John Tower, Jack Cox, Maurice Carlson, and Peter O’Donnell to lay the groundwork for Goldwater’s run in 1964 and to demonstrate that national Republicans could finally “whistle Dixie.”

Excerpted from "Nut Country: Right-wing Dallas and the Birth of the Southern Strategy" by Edward H. Miller. Published by the University of Chicago Press. Copyright 2015 by the University of Chicago. Reprinted with permission of the publisher. All rights reserved.

Published on November 01, 2015 11:00

The curious case of Ben Carson: How a black neurosurgeon soared to the top of the GOP primary

In a primary season that has seen the most unlikely of candidates, Donald Trump, surge to the head of the GOP pack, perhaps the least surprising development is the ascendance of another political outsider: Dr. Ben Carson. But to understand why Carson's recent success in Republican polls makes so much sense, one must first take a closer look at the nature of the modern Republican Party. In the Age of Obama, GOP elites routinely bloviate about their need to expand outreach to people of color, especially Hispanics and Latinos, given the United States’ changing ethnic and racial demographics. Yet, the Party has consistently failed to leverage opportunities to that end. This could be a function of incompetence. Alternatively, such a lack of substantive efforts could simply be a reflection of a political party that is dedicated to white racial resentment and white identity politics—and thus suppressing the votes of non-whites—as its primary electoral strategy. Given these dynamics, how does one make sense of the curious case of Ben Carson? How then does his surging popularity compute? Carson’s popularity points out the tension between what is known as “substantive” and “descriptive” politics. Substantive politics centers on a belief in a person’s values and policy positions as overriding other identity-based concerns about governance and political behavior. Yes, the body that an individual is born into matters. But, substantive politics presumes that almost any person can effectively represent a given constituency and its values. Descriptive politics, on the other hand, is the belief that a person’s life experiences and identity, especially if they are an outsider or Other in a given socio-political system -- in the United States and West this would be women, people of color, gays and lesbians, and members of other marginalized groups -- will lead them to challenge the system or be transformative and somehow resistant. Ultimately, the tension here is between individuals and systems. Do our racialized, ethnic, gendered, and other identities provide gifted insight and leverage for those we represent in government? Or is it best to vote for and support candidates based on their ideas alone, with an understanding that the system exercises constraints on all actors? (Stated differently: A white man may do a better job of representing his black and brown constituents’ interests than a brother or sister who “sells out” to Power. The latter is a “token”; the former can be a true and effective representative.) White conservatives love Ben Carson, the black face in a high place, in a sea of white candidates, because his symbolic presence provides cover for the white supremacist politics endorsed by the post-civil rights era Republican Party. Despite his popularity, Ben Carson is actually an example of the worst case of weak, symbolic, petty, token descriptive politics, where the fact of his presence as a black person is somehow supposed to win over non-white voters to the Republican Party, and demonstrate that the latter is “inclusive” and “not racist.” Yet Ben Carson’s policy proposals are not significantly different from those of his 2016 Republican primary peers. He wants to end the Affordable Care Act, do the bidding of the National Rifle Association against the will of the American people, take away women’s reproductive choices, usher in an American theocracy, and prevent the plutocrats of the 1 percent from paying their fair share in taxes. 
In many ways, Carson is actually worse than the white conservatives he shared the stage with at the debate the other night. He has repeatedly channeled ugly and grotesque anti-black sentiments and beliefs about the agency, freedom, and intelligence of the African-American community. This is his assigned role as a black conservative; his politics are no less noxious for his expertly performing the assigned script. By contrast, Bernie Sanders and Hillary Clinton have done a far better job of responding to the concerns of black and brown Americans -- even though the 2016 Democratic presidential primary field does not include a person of color. The Republican Party props up its black conservative human mascots and flavors of the month during the presidential campaign season because, on a basic level, white conservatives misunderstand non-white voters. People of color have rejected the Republican Party not only because of questions of representation, but also because its policies are anathema to the well-being, safety, security, and prosperity of Black and Brown America. The Republican Party is facing demographic suicide in an America that is increasingly black and brown -- where the GOP’s policies have savaged the poor, working, and middle classes. When a person is lost in the desert, they tend to walk in circles because they instinctively follow their dominant hand. He or she will eventually die from dehydration. The 2016 Republican presidential primary candidates are an example of a political organization in a death spiral. Black conservatives like Ben Carson will not save them. Together with his co-frontrunner Donald Trump, they are mirages that will lead the Republican Party to its doom.

Published on November 01, 2015 09:00

They’re the politically correct: Ben Carson and Bill O’Reilly are the real intolerant speech police

That leftist “social justice warriors” are suppressing speech that makes people “uncomfortable” is the dominant media narrative about free speech on college campuses. There are, of course, examples after examples of the opposite — people with actual administrative authority on campus (not student protesters) both upholding free speech in the face of left-leaning student demands for censorship, and denying free speech to left-leaning activists. The myths persist not because they are true, but because they are pervasive and under-scrutinized. It's also classic conservative concern-trolling. The dominant narrative of students paradoxically coddled and terrified by leftist “SJWs” pretends to be about student wellbeing and protection of free speech, but was always fodder for partisan politics. For example, Fox News frets over the implications of a liberal professoriate because 96 percent of Cornell University faculty donations went to Democrats, but Bill O’Reilly has no plans to send Jesse Watters to the Chamber of Commerce to pose a series of inelegantly leading questions about whether their disproportionate donations to Republicans might be a source of indoctrination. It’s no secret, in other words, that colleges and universities are among the very few influential U.S. institutions that also tend to question received wisdom about the free market and conservative ideology; so colleges and universities (and not workplaces, or the Motion Picture Association, which sponsored CPAC last year) are predictable targets for conservatives who ostensibly care about free speech. The trouble for Republicans — and for adherents of the dominant narrative that the left is singularly or particularly illiberal on matters of speech — is that explicit challenges to free speech on campus continue to come from the right. Last week, viable Republican presidential candidate Ben Carson unveiled a plan to use the Department of Education to monitor and police speech on college campuses, a system by which the federal government collects speech reports from campuses and then decides whether the speech in the reports qualifies as propaganda. If it does, per Carson’s plan, the institution in question would lose federal funding. In short, Carson proposes to encourage people on campus to police each other’s speech, to report to the federal government any speech that might be propaganda, and then to have the federal government determine what is and is not propaganda. Critics of leftist attempts to shut down “insensitive” speech point to the “chilling” effect of an environment that encourages self-policing, watching what we say to avoid causing offense. Carson’s proposal certainly includes that, but goes a step further by inviting one of the most powerful governments in the world to decide for college students and faculty if their speech constitutes propaganda. You could say that Carson’s expressed disregard for the First Amendment is an outlier or a “cherry-picked” example, but then you’d have to contend with the fact that the Republican National Committee, which held the most recent GOP presidential debate on campus at University of Colorado, Boulder, has made arrangements with the University to prevent UC Boulder students from attending the debate. Of the 11,000 seats provided by the venue, the RNC initially made available only 50 tickets for students (facing some pressure, the RNC generously increased student ticket availability to 150 out of 11,000). 
If you support the Republican agenda, it becomes difficult to talk straight-faced about the left-wing, “SJW” assault on free speech while also claiming to care about college students and free speech on campus. Particularly if you’re concerned that colleges and universities suffer from a lack of exposure to conservative ideas, it makes little sense to hold a Republican presidential debate on a liberal college campus but keep liberal college students from attending. Putting aside that the Republican presidential candidates this year are unlikely to produce much of value in the way of ideas or policy proposals, what better opportunity than a Republican presidential debate to expose liberal students to conservative ideas that matter? However, between the RNC’s handling of student access to the debate, and Ben Carson’s plan to police the speech of liberal professors, the mask of concern for free speech is beginning to melt away. Carson can’t help but admit that he’d rather have state-sponsored censorship than tolerate just one of many U.S. institutions — higher education — where free-market and conservative orthodoxies are likely to face scrutiny. The RNC would rather use a university campus (and all the lofty things it symbolizes) as a debate venue, but surround the venue with fences and keep out actual students, a gesture that puts audience control and exclusivity above the supposed goal of bringing conservative ideas to campus and attracting students to “big tent” (not ring-fenced) conservatism. These examples from national-scale conservative politics add to a growing list of high-profile free-speech issues and controversies at Duke University, Wesleyan University, the Community College of Philadelphia, American University and University of Illinois, Chicago, all of which contradict the dominant narrative of leftist “SJWs” suppressing speech and getting away with it. From this expanding list we learn three very important things about freedom of speech in the U.S. One, freedom of speech faces real challenges that we must not dismiss; but those challenges come from both the left and the right. Two, the actual enforcement of value judgments about speech is a matter not of the political content of speech, but of the power differential between the speaker and the censor; and leftist students and faculty are still far less powerful than the institutions that house and employ us all, or the outside lobbies and politicians who take opportunistic, partisan interest in campus affairs. Three, with employers monitoring employees’ social media accounts, political donations and affiliations, enforcing “workplace happiness” protocols and employing people in unpaid internships and precarious contracts, the gravest threats to free speech are happening not on campus, but in the workplace. When will free-speech advocates muster the courage to police and interrogate our most powerful institutions the way they police college campuses? I’m speaking of the corporations, the corporate lobbying firms and the hip-pocketed Congress that so nakedly serves corporate interests above all else.

Published on November 01, 2015 08:59

In “Burnt,” the brilliant white douche wins again: We love to excuse bad behavior from successful creeps — how else do you explain Donald Trump?

If "Burnt" were being honest, it would have been called “Everybody Wants to Fuck Bradley Cooper.” The John Wells-directed film, in which Cooper plays a bad-boy chef, labors under the misapprehension that no matter how horrible, narcissistic or borderline violent a guy cinema’s second favorite Brad plays, everyone is dying to bone him: the talented sous chef (Sienna Miller) who can’t stand the sight of him; the gay, Elmer Fudd-inflected maître d’ (Daniel Brühl) willing to sacrifice his entire career for just one kiss; and even the restaurant critic who is exclusively into women but willing to make an exception for white dudes whose names rhyme with “Madly Boop-her.” This is despite the fact that (to reiterate) Cooper is playing the worst person in the world. While displaying almost no redeeming qualities — outside of those piercing baby blues — Cooper’s Adam Jones manipulates Miller’s character (who may as well be named “Girl Chef”) into working with him and then repeatedly bribes her not to quit because of his abusive behavior; in one scene, he violently grabs her by the shirt for daring to question his authority. Jones forces the Elmer Fudd-ish maître d’ to give him the head chef job at his dying father’s restaurant — because Jones knows Fuddy is in love with him (hey, who isn’t?). And in a telling moment, he facetiously suggests that a shy cook should work for him for free — all to prove a point: You have to be a jerk to get what you want. It might seem like "Burnt" is "Wall Street" with a frying pan, but the film’s “narcissism is good” ethos is less a philosophy than a love letter to insufferable white golden boys and the people who find them dangerously irresistible. "Burnt" validates and redeems Jones’ numerous misdeeds not by having him ever apologize for his behavior or make amends but by repeatedly reminding us a) that he’s so brilliant, b) that he’s so damaged (bad childhood: check!) and c) that he’s not overtly horrible to children. In Adam’s redemptive “he’s not so bad after all!” scene, he deigns to bake Girl Chef’s daughter a cake — and even eats it with her. This is, of course, after he wouldn’t let Girl Chef have the actual day off to throw her daughter a party. Little separates Adam Jones from Patrick Bateman aside from a knife and an adult poncho, yet Jones exemplifies a time-honored type in American cinema: In real life, you would file a restraining order against him, but unfortunately, he’s attractive, Caucasian and the protagonist of the film. The Hot White Douche is as old as cinema itself: In the film "His Girl Friday," you’d throw a pie in Cary Grant’s face if he weren’t, well, Cary Grant. In a memorable scene from "The Philadelphia Story," Grant plants a palm in Katharine Hepburn’s face and pushes her to the ground — and then they end up together. And James Bond has made a 50-year career out of being a debonair dickbag; a recent review of "Spectre" called 007 a “violent misogynist,” and even Daniel Craig agreed. Sexism just looks better in Tom Ford. The current cinema has given us famous examples of the Hot White Douche like Edward Cullen and Christian Grey, both of whom teach horny teenage Mormons and middle-aged hausfraus that there’s nothing sexier than abusive relationships. 
But even less controversial characters have distinct HWD tendencies: In "Reality Bites," Winona Ryder ends up with Ethan Hawke — a pretentious, greasy-haired jerk who spends the whole movie being mean to her — for no other reason than that the script says so (they “belong together” or something). "Bridget Jones’s Diary" and "The Ugly Truth" respectively give us Daniel Cleaver (Hugh Grant) and Mike (Gerard Butler), whose workplace behavior should be used as a cautionary tale at sexual harassment seminars. The problem with a movie like "The Ugly Truth" isn’t just that the screenplay (which was somehow written by three women) lets the misogynist get the girl but that it, like "Burnt," proves his worldview correct. "The Ugly Truth" offers a Cyrano de Bergerac scenario in which Mike helps Abby (Katherine Heigl) get laid by teaching her what men like — which is Cool Girls who wear tight dresses and find all their jokes hilarious. Of course, Abby finds that his methods work — until she and Mike inevitably fall in love. "Jurassic World" offers the same thesis: For women, there’s nothing sexier than a scruffy asshole (this time Chris Pratt) with a six-pack ordering you around. What’s particularly troubling about this type is how many actors have made quite a cottage industry out of playing them: Matthew McConaughey launched his second career as cinema’s most prolific Hot White Douche — from "How to Lose a Guy in 10 Days" to "Ghosts of Girlfriends Past," his inevitable HWD swan song. Han “I Know” Solo is a famous example of the type, but even Harrison Ford’s non-"Star Wars" characters (see: Indiana Jones, Rick Deckard) have HWD elements. And Pratt is becoming the biggest star in America by picking up where Ford left off: In "Guardians of the Galaxy," he’s yet another rakish cad who doesn’t have time for feelings — he has his Walkman. The problem with the Hot White Douche — on top of, well, everything — is that he gets the privilege of being bad without sacrificing our sympathy or his status as the hero. And this extends beyond Hollywood's fictional heroes — isn't Donald Trump just a natural extension of this type, albeit on a less-handsome scale? His financial success and confident swagger insulate him from backlash against his terrible public behavior. "Sure, he's a crude jerk, but don't you love the way he speaks his mind?" In a particularly interesting scene in "Burnt," it’s revealed that Omar Sy’s Michel, a chef whose life Adam Jones ruined, only agreed to work for him as an act of revenge. During a pivotal moment, Michel (aka the only character of color) throws cayenne pepper into a dish to ruin it, thus sabotaging the restaurant’s shot at a three-star Michelin rating. After his vengeance is complete, Michel walks off the screen and is never seen again. The difference between the two men is that because Jones is white, the movie has the privilege of being about him; his shades of grey are the only ones the film considers noteworthy. You might argue that many of these films are aware their male leads aren’t role models or the “Man of the Year,” and that a movie like "Burnt" is more interested in making its protagonist complicated than likable. I understand that in principle, but how often are Omar Sy or Sienna Miller allowed to play characters who are liked not in spite of their flaws but precisely because of them? 
While cinema has taken tentative steps forward in giving us female anti-heroes (see: "The Girl with the Dragon Tattoo"'s Lisbeth Salander, "Gone Girl"'s Amy Dunne), it’s a lot harder to think of a woman whose bad behavior the movies treat as eminently fuckable. In "Young Adult," you don’t want to shack up with Mavis Gary; you want to get her into rehab. Amy Schumer’s character in "Trainwreck" has to clean up her act before she can make her relationship last. For all the praise heaped on "Gone Girl"’s depiction of the female anti-hero, what I found most refreshing was Gillian Flynn’s overt deflation of the Hot White Douche. Nick Dunne’s failings as a husband (his affair, sinking all their money into a failed bar) don’t make him a charming cad, and unlike with Cooper’s libidinous chef, the female characters don’t spend the whole movie polishing his ego. Even the women closest to him, like his twin sister, Margo, come to detest the sight of him when they see Nick for who he really is: just a pathetic jerk. Sure, he didn’t kill his wife — but that doesn’t make him a hero. I wish "Burnt" had the same courage. For all the film’s seeming promises of lessons learned and redemption found, the movie is more interested in giving Adam Jones his great comeback than challenging him to think about why he may have been denied it to begin with. The Hot White Douche might get what he wants, but at what cost?

Published on November 01, 2015 07:30

The GOP can’t escape football: Republicans’ long love affair with America’s most brutal sport

Chris Christie might have thought it was ludicrous for the CNBC debate moderators to ask about the booming yet ethically murky "daily fantasy football" industry, but they had defensible reasons for doing so. Websites such as DraftKings and FanDuel now generate billions of dollars by way of a product that bears striking similarities to online gambling. That daily fantasy football should have become such a booming enterprise — and that it might merit consideration during a presidential debate — might surprise some. It shouldn't, and especially not on the conservative end of the spectrum. While baseball may be America’s official pastime, football has been its most popular sport for more than thirty years. Over that same period, more and more Republican voters, presidential candidates and even presidents have come to be strongly identified with the game. Indeed, although he may not have realized it, when Christie expressed outrage at the debate’s disproportionate attention to football, he was in part attacking the values of his own party’s base. As writer Neal Gabler noted in an editorial for ESPN, “Of people who identified themselves as part of the NFL fan base 83 percent were white, 64 percent were male, 51 percent were 45 years or older, only 32 percent made less than $60,000 a year, and, to finish the point, registered Republicans were 21 percent more likely to be NFL fans than registered Democrats.” In addition, football is particularly popular in the South, one of the Republican Party’s regional strongholds in America today. Richard Nixon was the first president to be widely viewed as a “football freak,” even though his personal experience playing the sport was limited to serving as a tackling dummy for the small football program at the Quaker college he attended. Despite this inglorious beginning, Nixon was openly passionate about the game, impressing even harsh critics like Hunter S. Thompson with his encyclopedic knowledge of college and professional football statistics. This same enthusiasm, though, also got him into minor trouble, such as when he asked Miami Dolphins coach Don Shula to run a play he suggested during Super Bowl VI (the play failed and the Dallas Cowboys ultimately won) or annoyed anti-Vietnam War protesters at the Lincoln Memorial by trying to talk to them about college football instead of their political concerns. Nixon’s penchant for unflattering associations with football apparently set a precedent for his party’s subsequent commanders in chief. Gerald Ford’s opponents often liked to insult his intelligence by referencing the multiple concussions he suffered while playing the game in his youth, quoting Lyndon Johnson’s old observation that Ford “spent too much time playing football without a helmet.” Ronald Reagan allowed his second inauguration to be postponed by a day so as not to conflict with Super Bowl XIX, instead ushering in the first day of his second term by performing the ceremonial coin flip at the Big Game. George W. Bush, meanwhile, allowed football metaphors to become so commonplace among members of his administration that they sometimes bungled their language when navigating delicate foreign policy situations in North Korea or Afghanistan. 
This brings us to the meat of Christie’s criticism — in his own words — that “we have ISIS and al Qaeda attacking us and we’re talking about fantasy football.” Though Christie occupies a very different point on the ideological spectrum, his outrage was in the same spirit as the anger expressed by the Lincoln Memorial protesters, who thought Nixon had come to them to discuss the Vietnam War, only to be regaled with tales about Syracuse University’s football program. For many Americans, football is more than just a form of entertainment or recreation; it is a way of life deeply embedded in their cultural institutions and held in reverence by the bulk of society. As a result, it is easy to let the bread-and-circuses spectacle of football distract them from issues that have serious real-world consequences — or, like Nixon, to fail to understand why others wouldn’t feel the same way. This overzealous passion for football can also cause serious ideological inconsistencies. Take Paul Ryan, the newly minted Speaker of the House, who as the Republican vice presidential candidate in 2012 stood on a staunchly anti-labor platform that he had helped design — but one which he temporarily abandoned by siding with the NFL referees’ union over the league during the 2012 lockout, because the inexperienced replacement refs had cost his beloved Green Bay Packers a victory over the Seattle Seahawks. (Ryan was joined in this by Wisconsin governor Scott Walker, who had built his political career on his anti-union bona fides, but called for the NFL to resolve the dispute so that the quality of the game he loved would not be impaired.) Even Christie’s own rebuke of the notion that Republican presidential candidates should discuss fantasy football regulation touches on this problem; when he argued that the focus should be on how to “get the government to do what they’re supposed to be doing” — and declared, “Enough on fantasy football. Let people play, who cares?” — he reminded the audience of the GOP’s ostensible “small government” principles — ones that should make the state’s stance on something as frivolous as fantasy football self-evident. That said, it’s important that Republicans learn a lesson from the first future president to have a meaningful relationship with football — Herbert Hoover, who managed Stanford University’s budding football team in the 1890s and famously forgot to bring the ball during their first Big Game with the University of California. While Hoover may have been embarrassed for neglecting his football duties, the party he later represented has often been guilty of the opposite sin, ranking football too highly among its priorities. The daily fantasy football controversy is perhaps a partial exception. Nonetheless, one can hope that last night’s debate begins to offset this phenomenon.

Published on November 01, 2015 07:00

6 surprising health benefits of hemp seeds

AlterNet — Among the many benefits of the legalization of marijuana would be eliminating any lingering confusion about the legality of its close relation (read: basically the same plant), hemp. While the possession of industrial hemp, also known by its botanical moniker, Cannabis sativa, has never been illegal, the growing of industrial hemp was illegal until the recent passage of the

Published on November 01, 2015 06:00

October 31, 2015

Halloween is my nightmare, but not for the usual reasons

Each year I dread Halloween. While my neighbors are impersonating Ben Carson, Jeb! Bush and other spine-chilling creatures, Halloween fills me with grief. My 46-year-old brother lay dying on All Hallows’ Eve, 25 years ago. The trick was on us. Ten years older than me, Jay was my surrogate dad. My workaholic father was rarely around, busy supporting three children, his widowed mother and his mother-in-law. A blue-eyed boy with a photographic memory, Jay was my mother’s favorite and I was his. My first childhood memory was Jay climbing into my crib, making me giggle uncontrollably. He protected me from bullies and thrilled me with risks as we pretended to be Olympic slalom riders down the legendary Suicide Hill nearby. Whenever I fell and skinned my knee, my usually gentle brother would hit my arm rather roughly. “Ouch!” I’d protest. “Does your knee hurt anymore?” Jay responded playfully. He even invited me on dates with his high school sweetheart. She was not always enthusiastic about his baby sister munching popcorn next to her in the movies. When he married her after his first year of dental school, I felt jealous and abandoned, but had perfect material for therapy sessions. He left me again, for Vietnam, where he filled soldiers’ teeth dangerously close to the front lines, and later far away to Florida, with his wife and three kids. When he was 44, Jay was diagnosed with lung cancer. A non-smoker, he was so far out of any risk group that the doctors missed the signs for a year, believing the pain he was feeling was a pulled muscle. After his first MRI, he confessed how eerie it was going into the tube, knowing his grim diagnosis. “It was like being in a coffin,” he said. One night his wife called me, sobbing. “Only 30 percent survive beyond 18 months,” she said. I consoled her, sent him his favorite chocolates and books to read during treatment, and cried privately, 1,500 miles away, angry at such an unfair fate. Two years later Jay was taking his last breaths in his bedroom, his wife and kids by his side, as trick-or-treaters relentlessly rang their doorbell all night. “I couldn’t answer the door,” Jay’s wife frantically told me. “But they just wouldn’t stop coming, wanting candy. Finally I disconnected the doorbell.” One Halloween when I was 8, my friends and I heard that a family down the block was doling out a different kind of sugar: money. In the days when parents didn’t escort their kids, my friends and I marched down to the house, whose residents we didn’t know. We leaned on the doorbell. Again and again. No answer, but we persisted. Finally an angry, weary dad answered the door. “Stop disturbing us,” he barked. “Our baby’s sick.” We retreated, guilty and remorseful. His baby recovered from the flu, and we avoided walking by his house for a long time. My brother died the morning after relentless ghouls and goblins didn’t understand that his grieving soon-to-be-widow had nothing sweet to spare. Jay’s tragic death made me question my faith in God. I focused my energy on nurturing his kids, who were in high school and college. Each year, Halloween was particularly painful for me, and I avoided any kind of celebration. Instead, my husband and I lit a candle in memory of Jay, recalling his warmth, humor and jokes. After my daughter was born, it became difficult to hide out on Halloween. Her best friend’s mother made elaborate outfits from scratch as if she were a Broadway costume designer, making my parenting skills feel inadequate. 
But it also felt impossible to go gleefully into the spooky night when I was sadly missing my brother. Only once did I succeed in piecing together a cool Halloween costume for my daughter. We enjoyed watching Marx Brothers movies, the way Jay had introduced me to “A Day at the Races.” As an homage to my brother, I transformed my only daughter into Harpo. She wore my raincoat, which nearly reached the floor on her. I found a blonde curly wig in a store frequented by transvestites in the East Village, and a friend donated a horn to complete her outfit. Instead of saying trick-or-treat, she honked. Through all this, I kept my Halloween grief to myself, not wanting to ruin her fun and excitement. Ironically, All Hallows’ Eve originated as a time to remember the dead, but today it’s morphed into a festive Mardi Gras. Holidays that emphasize a collective celebration of joy make us feel compelled, even pressured, to be gleeful along with everyone else—rather than be identified as the only stick-in-the-mud in the crowd. On a night of gaiety when it’s de rigueur to transform into something else, I’m cloaked behind an invisible mask of sadness. Sometimes I tried to ease my sorrow by stuffing myself with my daughter’s overstock of candy. I was relieved when she was old enough to piece together her own costumes. When the new generation of trick-or-treaters arrived at my door, I put on a big smile, a clown with a sad interior. Halloween never ends when your kids grow up. This year an email arrived from close friends, inviting us to their annual party, ending with the line “costumes are a must!” My husband has always hated dressing up, and for years agreed to show up in a Mr. Spock tee-shirt until he was typecast. Once I convinced him to reprise Harpo for a party. He put on the wig for half an hour, then tossed it off. I never feel like turning myself into someone else on the anniversary of my brother’s death, but only my husband knows my secret. I don’t want to cast a pall when everyone else is flying high, riding broomsticks. Of course, Jay wouldn’t have wanted me to stay home and mourn him on Halloween, decades later. So I will show up among the grown-up pirates and witches in a Mets hat and a blue shirt, as usual deserving the award for Worst-Dressed-of-the-Night. On the way to the party I plan to hit myself in my knee so my heart hurts less. I’ll greet my friends with a pretend smile and force my laughter, realizing I’m wearing a costume after all.

Published on October 31, 2015 16:30

The war on women is not a war at all: It’s a one-sided assault by sad, insecure little men

AlterNet — The so-called 'war on women' is not a war; it’s a one-sided assault. It is conservative men, drunk on power, calling women sluts and then rolling up their sleeves and knocking us back into place. It is conservative men letting us know that they own our bodies and reproductive capacity, which according to the Bible have been theirs since the Iron Age. It is conservative men making damned sure women get punished for failing to keep our legs together, for daring to pursue intimacy and sexual pleasure on our own terms and without their permission. It is conservative men ignoring our pleas that we don’t want to be pregnant and denying us the ability to resist impregnation as deliberately and aggressively as if they had our arms pinned. If that’s not an assault, I don’t know what is. We need to get our language straight, because the words we use carry associations and implications that change the way we think. Cognitive linguist George Lakoff (Moral Politics; Don’t Think of an Elephant) popularized the term “framing” among progressive activists. He explained why the metaphors we use can be more powerful than any amount of rational argumentation or evidence—undermining or supporting our position by activating a whole neural network of ideas and images that may operate below the level of consciousness. Consider the implications of calling the right-wing assault on reproductive rights a “war”: Men aspire to be warriors. They play with toy guns as children and get addicted to fast-twitch war games as adolescents. They drive Humvees and pose in high-fashion camouflage as adults. They liken the winner-takes-all competition of capitalism or elections or football to combat. They go out for drinks and tell “war stories.” In wars, both sides are armed. Soldiers are comrades whose loyalty to each other trumps all else. They are taught to dehumanize their enemies. They come home heroes. They get medals. Politicians use war to arouse nationalistic pride. Philosophy teaches us that war can be just or noble or an art form. Religion teaches that it can be righteous or even commanded by God. Onward Christian soldiers. In the quest to win a war, lives and families destroyed are mere collateral damage. Economic devastation is a means to a higher end. Any man fighting in a war thinks he is one of the good guys. To paraphrase author Chris Hedges, war is a force that gives men meaning. Assault, by contrast, is unequal and often unprovoked. One side is the clear aggressor. There’s nothing glorious about assault. In fact, perpetrators are widely reviled. Nobody organizes a victory or veterans’ parade to celebrate assailants. A man who forces his will on a child or who forces pregnancy on a woman is a repulsive villain, not a hero. Republican lawmakers and candidates who spend their days toying with women in front of the C-SPAN cameras in order to excite men who get off on female disempowerment may think of themselves as “culture warriors.” But let’s call them what they are: perpetrators of assault—the assault on women.

Published on October 31, 2015 15:30

“I can fix Ash now”: Bruce Campbell reveals how his cult hero matures — with “dentures and a man girdle” — for “Ash vs. Evil Dead”

The boomstick is back! 23 years ago this month, “Army of Darkness” premiered, and the third chapter in the “Evil Dead” saga ended up being its last — until now, that is. After years of negotiations and plenty of augmented plans, the television series “Ash vs. Evil Dead” premieres tonight on Starz. Cult icon Bruce Campbell reprises the titular role of Ash Williams in the series, replete with his penchant for one-liners and his signature chainsaw. We caught up with Campbell to discuss what it feels like to jump back into the character of Ash after all these years and how playing a cowboy on TV makes his life in Oregon easier. When you guys did “Within the Woods,” the short that inspired “Evil Dead,” you were 20. Here you are almost four decades later, back in those shoes. You’re continuing a story that you started telling... The amazing thing is that — once we got back on that set for the first time to shoot the pilot — I realized that Sam Raimi and I have known each other since junior high school, we still know each other, we can still deal with each other, we’re still in the business, and we were able to pull this show off. So to me, the victory has already been won. You mentioned the fact that you’re still in the film business after all these years. When you did “Evil Dead,” when you did “Within the Woods,” did you have any inkling that you had a career in front of you? Or did you think, “Hey, I’ll do a couple of movies and that’ll be it”? We didn’t even think we’d finish the first “Evil Dead,” because it took three years to finish. So there was no long-term anything. It was, “Holy crap! Are we even going to finish this movie?” Because if we’re not gonna finish it, we’re not gonna get our money back. Then we’re not gonna do another movie at all. We thought there was a lot at stake with the first movie. We had investors. Thankfully, they’re all pretty happy now. With the first “Evil Dead” movie, at what point did you realize that maybe you had something there? It took that long, and I know there were also issues actually getting it released. Well, no distributor would pick it up here in the U.S., so we started it in England. It was the response over there. It was second only to “E.T.” over there. In ’83, “Evil Dead” was the number-one video in the UK. You look down the list, and you see “The Shining” was number eight or 11. That’s pretty sweet. You beat Stan the Man at his game, horror movie for horror movie. We beat “The Shining.” Get out of town! That’s it, right there! For me, though, it was seeing it in my local movie theater. That was it. It’s where I saw all the movies in my formative years. To see our movie in the same theater, sitting in the same seat, same popcorn. I went, “OK. That’s it. We did it.” So you were still in Michigan at that point. I’m assuming you made the move to L.A. at least for a little while, right? I did my 10 years of penance there. I went after “Evil Dead 2.” After “Evil Dead 2,” we dealt with Dino De Laurentiis. We were gonna get more of a release. There was more at stake. So I moved to L.A. I was there for about 10 years, from the late ’80s to the late ’90s. I went, “No one’s shooting here anymore.” Then I just bailed. Now I live in Oregon. Do you feel like an outsider in the world of show business because you’re not in L.A., or… I’ve always been an outsider. Even when I was there. I’m not a schmoozer. I’m not that guy. I like working. I like working with people. I like going to events, if you’re promoting something for a reason. 
I don’t go to other actors’ premieres. Used to do that in the early days just to get your picture taken, but that’s stupid. So I actually can’t wait to see what actors show up at the “Evil Dead” premiere that had nothing to do with the series. That’s gonna give me a chuckle. I can’t imagine how different the pace of life is between small-town Oregon and L.A. The guy who’s standing in front of you at Starbucks has a fleece coat with dog hair all over it, his hair’s sticking straight up, and the guy’s a millionaire. You wouldn’t know it. Driving a crappy old truck with a scraggly old dog. That’s Oregon. Whereas Miami — doing “Burn Notice” down there — people who don’t have money are trying to show you that they have it. It’s all smoke and mirrors. I love the reality of Oregon. A week after I moved in, my neighbor comes driving up my driveway. He goes, “I understand you used to play a cowboy on a TV show. Why don’t you run me a hundred head of cattle up the road this Saturday?” So I did. I helped him run his cattle up the road. I said, “You got a horse? Let’s go.” I got to meet my neighbors that way. That’s Oregon. Let’s double back to “Ash vs. Evil Dead” for a minute. When did you guys realize that you’d be able to continue doing something with “Evil Dead?” There was talk of a movie for years and years, but talking about and trying to make it happen is obviously different than actually seeing it in a tangible way. Oh believe me, I’m gonna laugh my ass off at the premiere! Not because of the show. Because we got here. Three different movies made by three different companies, and now you want to work out a deal with them to make a TV show? Good luck, Chuck! We had to negotiate that. We had to dodge around a lot of stuff, a lot of personalities, a lot of money. It’s kind of interesting. It came about because TV is finally ready for us. The aftermarket of the “Evil Dead” movies stayed strong, but what people don’t forget was that “Army of Darkness” was a bomb. It cost $13 million and made $13 million. It was effectively dead for at least a decade. Then this company, Anchor Bay in Michigan, started releasing deleted scenes and making-ofs. They brought the interest back, and stoked it. “Army of Darkness” now is an American movie classic, for God’s sake! Time was our friend, and television caught up with us. You now have premium channels that allow you to do whatever you want, and we’re digging that. At that point, you had to think that was it. Very rarely does someone get a second chance at the same story, especially after that much time. You don’t often get a chance to go back to a character or a story. Yeah, because I can fix Ash now. I wanna fix him. I have the skills. I have the tools to flesh him out and make him a full-fledged character. As a younger actor, you’re not always thinking about that. You’re thinking about nailing the one-liners. But now, there’s bigger import. You’re a leader. You’re a teacher. You’re a tormenter. You’re the veteran. So Ash is now the old guy in the platoon with the cigarette hanging out of his mouth. I’m good with that. We haven’t hidden the fact that he’s an old guy now. I love that. He wears dentures and a man girdle. I think it’s just fantastic. Is Ash a different character in your mind than he would’ve been 20 years ago? Especially now that you have the ability to flesh him out. No! No! He hasn’t progressed much. He’s the boyfriend you run into, the old boyhood pal you run into. You go, “Wow, that guy hasn’t done anything!” He’s kind of like that. 
Characters don’t always have to evolve. A half hour will give you just enough of what you want to make Ash a real person. If it was an hour, it would be boring and [in a booming, pretentious tone] pon-der-ous! With all due respect to every other one-hour show, the pace would grind to a halt and Ash would be talking about his fucked-up childhood. Do we really want that? No!
I went, “No one’s shooting here anymore.” Then I just bailed. Now I live in Oregon. Do you feel like an outsider in the world of show business because you’re not in L.A., or… I’ve always been an outsider. Even when I was there. I’m not a schmoozer. I’m not that guy. I like working. I like working with people. I like going to events, if you’re promoting something for a reason. I don’t go to other actors’ premieres. Used to do that in the early days just to get your picture taken, but that’s stupid. So I actually can’t wait to see what actors show up at the “Evil Dead” premiere that had nothing to do with the series. That’s gonna give me a chuckle. I can’t imagine how different the pace of life is between small-town Oregon and L.A. The guy who’s standing in front of you at Starbucks has a fleece coat with dog hair all over it, his hair’s sticking straight up, and the guy’s a millionaire. You wouldn’t know it. Driving a crappy old truck with a scraggly old dog. That’s Oregon. Whereas Miami — doing “Burn Notice” down there — people who don’t have money are trying to show you that they have it. It’s all smoke and mirrors. I love the reality of Oregon. A week after I moved in, my neighbor comes driving up my driveway. He goes, “I understand you used to play a cowboy on a TV show. Why don’t you run me a hundred head of cattle up the road this Saturday?” So I did. I helped him run his cattle up the road. I said, “You got a horse? Let’s go.” I got to meet my neighbors that way. That’s Oregon. Let’s double back to “Ash vs. Evil Dead” for a minute. When did you guys realize that you’d be able to continue doing something with “Evil Dead?” There was talk of a movie for years and years, but talking about and trying to make it happen is obviously different than actually seeing it in a tangible way. Oh believe me, I’m gonna laugh my ass off at the premiere! Not because of the show. Because we got here. Three different movies made by three different companies, and now you want to work out a deal with them to make a TV show? Good luck, Chuck! We had to negotiate that. We had to dodge around a lot of stuff, a lot of personalities, a lot of money. It’s kind of interesting. It came about because TV is finally ready for us. The aftermarket of the “Evil Dead” movies stayed strong, but what people don’t forget was that “Army of Darkness” was a bomb. It cost $13 million and made $13 million. It was effectively dead for at least a decade. Then this company, Anchor Bay in Michigan, started releasing deleted scenes and making-ofs. They brought the interest back, and stoked it. “Army of Darkness” now is an American movie classic, for God’s sake! Time was our friend, and television caught up with us. You now have premium channels that allow you to do whatever you want, and we’re digging that. At that point, you had to think that was it. Very rarely does someone get a second chance at the same story, especially after that much time. You don’t often get a chance to go back to a character or a story. Yeah, because I can fix Ash now. I wanna fix him. I have the skills. I have the tools to flesh him out and make him a full-fledged character. As a younger actor, you’re not always thinking about that. You’re thinking about nailing the one-liners. But now, there’s bigger import. You’re a leader. You’re a teacher. You’re a tormenter. You’re the veteran. So Ash is now the old guy in the platoon with the cigarette hanging out of his mouth. I’m good with that. We haven’t hidden the fact that he’s an old guy now. I love that. 
He wears dentures and a man girdle. I think it’s just fantastic. Is Ash a different character in your mind than he would’ve been 20 years ago? Especially now that you have the ability to flesh him out. No! No! He hasn’t progressed much. He’s the boyfriend you run into, the old boyhood pal you run into. You go, “Wow, that guy hasn’t done anything!” He’s kind of like that. Characters don’t always have to evolve. A half hour will give you just enough of what you want to make Ash a real person. If it was an hour, it would be boring and [in a booming, pretentious tone] pon-der-ous! With all due respect to every other one-hour show, the pace would grind to a halt and Ash would be talking about his fucked-up childhood. Do we really want that? No!The boomstick is back! 23 years ago this month, “Army of Darkness” premiered, and the third chapter in the “Evil Dead” saga ended up being its last — until now, that is. After years of negotiations and plenty of augmented plans, the television series “Ash vs. Evil Dead” premieres tonight on Starz. Cult icon Bruce Campbell reprises the titular role of Ash Williams in the series, replete with his penchant for one-liners and his signature chainsaw. We caught up with Campbell to discuss what it feels like to jump back into the character of Ash after all these years and how playing a cowboy on TV makes his life in Oregon easier. When you guys did “Within the Woods,” the short that inspired “Evil Dead,” you were 20. Here you are almost four decades later, back in those shoes. You’re continuing a story that you started telling... The amazing thing is that — once we got back on that set for the first time to shoot the pilot — I realized that Sam Raimi and I have known each other since junior high school, we still know each other, we can still deal with each other, we’re still in the business, and we were able to pull this show off. So to me, the victory has already been won. You mentioned the fact that you’re still in the film business after all these years. When you did “Evil Dead,” when you did “Within the Woods,” did you have any inkling that you had a career in front of you? Or did you think, “Hey, I’ll do a couple of movies and that’ll be it”? We didn’t even think we’d finish the first “Evil Dead,” because it took three years to finish. So there was no long-term anything. It was, “Holy crap! Are we even going to finish this movie?” Because if we’re not gonna finish it, we’re not gonna get our money back. Then we’re not gonna do another movie at all. We thought there was a lot at stake with the first movie. We had investors. Thankfully, they’re all pretty happy now. With the first “Evil Dead” movie, at what point did you realize that maybe you had something there? It took that long, and I know there were also issues actually getting it released. Well, no distributor would pick it up here in the U.S., so we started it in England. It was the response over there. It was second only to “E.T.” over there. In ’83, “Evil Dead” was the number-one video in the UK. You look down the list, and you see “The Shining” was number eight or 11. That’s pretty sweet. You beat Stan the Man at his game, horror movie for horror movie. We beat “The Shining.” Get out of town! That’s it, right there! For me, though, it was seeing it in my local movie theater. That was it. It’s where I saw all the movies in my formative years. To see our movie in the same theater, sitting in the same seat, same popcorn. I went, “OK. That’s it. We did it.” So you were still in Michigan at that point. 
I’m assuming you made the move to L.A. at least for a little while, right? I did my 10 years of penance there. I went after “Evil Dead 2.” After “Evil Dead 2,” we dealt with Dino De Laurentiis. We were gonna get more of a release. There was more at stake. So I moved to L.A. I was there for about 10 years, from the late ’80s to the late ’90s. I went, “No one’s shooting here anymore.” Then I just bailed. Now I live in Oregon. Do you feel like an outsider in the world of show business because you’re not in L.A., or… I’ve always been an outsider. Even when I was there. I’m not a schmoozer. I’m not that guy. I like working. I like working with people. I like going to events, if you’re promoting something for a reason. I don’t go to other actors’ premieres. Used to do that in the early days just to get your picture taken, but that’s stupid. So I actually can’t wait to see what actors show up at the “Evil Dead” premiere that had nothing to do with the series. That’s gonna give me a chuckle. I can’t imagine how different the pace of life is between small-town Oregon and L.A. The guy who’s standing in front of you at Starbucks has a fleece coat with dog hair all over it, his hair’s sticking straight up, and the guy’s a millionaire. You wouldn’t know it. Driving a crappy old truck with a scraggly old dog. That’s Oregon. Whereas Miami — doing “Burn Notice” down there — people who don’t have money are trying to show you that they have it. It’s all smoke and mirrors. I love the reality of Oregon. A week after I moved in, my neighbor comes driving up my driveway. He goes, “I understand you used to play a cowboy on a TV show. Why don’t you run me a hundred head of cattle up the road this Saturday?” So I did. I helped him run his cattle up the road. I said, “You got a horse? Let’s go.” I got to meet my neighbors that way. That’s Oregon. Let’s double back to “Ash vs. Evil Dead” for a minute. When did you guys realize that you’d be able to continue doing something with “Evil Dead?” There was talk of a movie for years and years, but talking about and trying to make it happen is obviously different than actually seeing it in a tangible way. Oh believe me, I’m gonna laugh my ass off at the premiere! Not because of the show. Because we got here. Three different movies made by three different companies, and now you want to work out a deal with them to make a TV show? Good luck, Chuck! We had to negotiate that. We had to dodge around a lot of stuff, a lot of personalities, a lot of money. It’s kind of interesting. It came about because TV is finally ready for us. The aftermarket of the “Evil Dead” movies stayed strong, but what people don’t forget was that “Army of Darkness” was a bomb. It cost $13 million and made $13 million. It was effectively dead for at least a decade. Then this company, Anchor Bay in Michigan, started releasing deleted scenes and making-ofs. They brought the interest back, and stoked it. “Army of Darkness” now is an American movie classic, for God’s sake! Time was our friend, and television caught up with us. You now have premium channels that allow you to do whatever you want, and we’re digging that. At that point, you had to think that was it. Very rarely does someone get a second chance at the same story, especially after that much time. You don’t often get a chance to go back to a character or a story. Yeah, because I can fix Ash now. I wanna fix him. I have the skills. I have the tools to flesh him out and make him a full-fledged character. 
As a younger actor, you’re not always thinking about that. You’re thinking about nailing the one-liners. But now, there’s bigger import. You’re a leader. You’re a teacher. You’re a tormenter. You’re the veteran. So Ash is now the old guy in the platoon with the cigarette hanging out of his mouth. I’m good with that. We haven’t hidden the fact that he’s an old guy now. I love that. He wears dentures and a man girdle. I think it’s just fantastic.
Is Ash a different character in your mind than he would’ve been 20 years ago? Especially now that you have the ability to flesh him out.
No! No! He hasn’t progressed much. He’s the boyfriend you run into, the old boyhood pal you run into. You go, “Wow, that guy hasn’t done anything!” He’s kind of like that. Characters don’t always have to evolve. A half hour will give you just enough of what you want to make Ash a real person. If it was an hour, it would be boring and [in a booming, pretentious tone] pon-der-ous! With all due respect to every other one-hour show, the pace would grind to a halt and Ash would be talking about his fucked-up childhood. Do we really want that? No!

Published on October 31, 2015 14:30

Exposing the climate denial lies: Bill McKibben, the truth and the fight against the fossil-fuel industry

In October, a month before a show at the Orpheum, I drove up to Burlington, Vermont, on Bill McKibben’s and 350’s home turf, to attend the “dress rehearsal” for the national “Do the Math” tour, set to launch in Seattle the day after the election. On assignment for Grist magazine, I’d been invited to spend some time “backstage” with the 350.org team, watch their run-throughs for the evening’s production, and chat with McKibben and the others. And though I was on assignment, everyone knew (including of course my editor at Grist, Scott Rosenberg) that I was hardly there to cover it as a conventional reporter. Earlier that year I’d helped launch 350 Massachusetts, the independent grassroots network started by 350.org’s allies at Better Future Project, a young nonprofit in Cambridge (on whose board I served until late 2014). And I was getting involved with other alumni in the nascent, student-led Divest Harvard campaign. You could say my assignment in Burlington was an inside job. Perhaps this is where I should pause and explain that Bill McKibben and I have been acquainted for many years, having worked together occasionally when I was an editor at the Atlantic and the Boston Globe. As I dove deeper into the climate movement, we developed a kind of collaboration as fellow writers and activists. Not that Bill and I have ever really become close friends—we don’t hang out with each other, we’ve never shared much about our personal lives—but we have a warm, collegial relationship. I say all of this not only for the sake of transparency, but because Bill McKibben has had a significant impact on my own thinking about climate. That doesn’t mean I’m incapable of stepping back and giving my honest view of his work. Much of what I write here he’ll probably appreciate; some of it he may feel compelled to argue with. I don’t know. But I’m sure I’ll find out. The Saturday night crowd on the Burlington campus was festive, raucous, pumped. When the man on the stage, Bill McKibben, said it was time to march not just on Washington but on the headquarters of fossil-fuel companies—“it’s time to march on Dallas”—and asked those to stand who’d be willing to join in the fight, seemingly every person filling the University of Vermont’s cavernous Ira Allen Chapel, some 800 souls, rose to their feet. “The fossil-fuel industry has behaved so recklessly that they should lose their social license—their veneer of respectability,” Bill told the audience. “You want to take away our planet and our future? We’re going to take away your money and your good name.” Before heading up to Burlington, I’d asked Bill what the divestment campaign represented for the climate movement. How did it compare with the fight against Keystone XL, now more than a year since he and 1,252 others were arrested at the White House, leading Obama to delay his decision for further review? “Fighting Keystone,” he told me, “we learned we could stand up to the fossil-fuel industry. We demonstrated some moxie.” But, he added: “We also figured out that we’re not going to win just fighting one pipeline at a time. We have to keep all those battles going, but we also have to play some offense, go at the heart of the problem.” His “Do the Math” talk—which grew straight out of the Rolling Stone piece and the Carbon Tracker Initiative’s analysis of fossil-fuel reserves—left no doubt about what that problem is, its scale, and its urgency. 
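A rough sketch of the arithmetic behind that analysis, using the round figures popularized from the 2011 Carbon Tracker report (approximations offered here for orientation, not quotes from the talk): roughly 565 gigatons of CO2 can still be emitted by mid-century with a reasonable chance of holding warming to around two degrees Celsius, while the declared reserves of fossil-fuel companies and petro-states already represent roughly 2,795 gigatons. That is,
\[
\frac{565\ \text{GtCO}_2\ \text{(remaining budget)}}{2{,}795\ \text{GtCO}_2\ \text{(declared reserves)}} \approx 0.20,
\qquad 1 - 0.20 = 0.80,
\]
which is where the figure of roughly 80 percent of reserves that must stay in the ground—the “simple carbon math” below—comes from.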
One simply cannot repeat this too often: To have any decent chance of preventing runaway warming within this century—to slow the process down and maybe, ultimately, stop it—something like 80 percent of fossil-fuel reserves must stay in the ground, forever, and the world must mobilize an all-out global shift to renewable energy. Given the sheer amount of money at stake—tens of trillions of dollars—the odds of anything like that happening under current political conditions are roughly nil. Bill’s point was that, if there’s going to be any hope at all of preserving a livable climate, those political conditions must change decisively, starting now. And they can—but only if and when enough people, including those in power, understand the simple carbon math and realize that the fossil-fuel industry and its lobby are prepared to cook humanity off the planet unless somebody stops them.
The most affecting display in Burlington that night was a show of faces—people, all around the world, who since 2009 had organized and participated in 350.org’s massive “global days of action,” involving thousands of demonstrations in more than a hundred countries, on every continent—and who were already suffering the impacts of climate change: in Kenya, Haiti, Brazil, India, Pakistan, the Pacific island nations, and many other places, including the United States. Projected on the big screen behind Bill, they were a profound reminder of the human costs of global warming. Likewise, Bill’s message was about far more than math and carbon reserves. It was about justice and injustice, right and wrong—what you could call the moral equation.
On Saturday afternoon, after Bill and the rest of 350’s small production crew ran through their script in the empty, echoing Ira Allen Chapel, I tagged along with them for lunch a short walk from the UVM campus. I asked Bill how the idea for the tour had been born. It all went back, he said, to that seminal 2011 report from the Carbon Tracker Initiative in London. Bill told me that he and Naomi had both read that report early in 2012—and when they saw the numbers, they both realized the implications. “It exposed a real vulnerability of the fossil-fuel industry,” Bill told me, “because it made clear what the outcome of this process was going to be if we continued.” There was a long pause, as he searched for just how to phrase what came next. “There’s always been this slight unreality to the whole climate-change thing,” Bill went on. “Because most people, at some level, kept thinking—and rightly so—‘Yeah, but no one will ever actually do this. No one will actually, knowingly, destroy the planet by climate change.’ But once you’ve seen those numbers, it’s clear, that’s exactly what they’re knowingly planning to do. So that changes the equation, you know?”
I noted that the people who built the fossil-fuel industry didn’t set out to wreck the planet. It’s an incredible accident of history that we ended up in this fix. Bill nodded. There was, he said, “a sound historical reason” for the development of fossil fuels. “But that sound historical reason vanished the minute Jim Hansen basically explained, twenty-five years ago, that we’re about to do in the earth. And now that we’ve melted the Arctic, it’s well under way, at this point—it’s outrageous, is all it is.” “Now we have the new, and in some ways, the most important set of facts since the original science around climate.
This stuff on who owns what, in terms of reserves—it’s the Keeling Curve of climate economics and politics,” he said, referring to the graph of the ever-rising concentration of carbon dioxide in the atmosphere, one of the foundational discoveries of climate science. “These are the iconic numbers for understanding where we are now.”
So could divestment generate enough leverage, economic or otherwise, to make a difference? I wanted Bill to explain how it was an effective strategy. “I think it’s a way to get a fight started,” Bill said without hesitation, “and to get people in important places talking actively about the culpability of the fossil-fuel industry for the trouble that we’re in. And once that talk starts, I think it does start imposing a certain kind of economic pressure. Their high stock price is entirely justified by the thought that they’re going to get all their reserves out of the ground. And I think we’ve already made an argument that it shouldn’t be a legitimate thing to be doing.”
And not just those existing reserves. Perhaps the most damning number to emerge out of the divestment fight is this: 674 billion. That’s how many dollars, according to Carbon Tracker, the top 200 publicly traded fossil-fuel companies spent in 2012 alone on exploration and development of new reserves. (It remains to be seen whether the recent collapse of oil prices will lead companies to pull back significantly on such spending.) In other words, in the face of global catastrophe, those who lead the industry have not only bankrolled a wildly successful effort to sow confusion and denial of climate science and to obstruct any serious response to the crisis; in the meantime, they are also busy digging us an ever deeper hole, committed to a business model that by any sane measure should be called genocidal.
Bill likes to say that what he and the rest of us are demanding is not radical—indeed, fighting to preserve the planet for our children and future generations is inherently conservative. The “real radicals,” Bill will tell you, run fossil-fuel companies. Nothing could be more radical than the catastrophic course they’re pursuing: willingly changing the composition of the earth’s atmosphere, consequences be damned.
At perhaps the key moment in Bill’s “Do the Math” talk, he played a video clip of Exxon CEO Rex Tillerson at the Council on Foreign Relations in June 2012. Bill eviscerated the onscreen Rex in a darkly comic back-and-forth that would’ve made Jon Stewart proud. The Exxon chief, having made news by acknowledging that climate change is real and that warming “will have an impact”—while his company was spending, according to Bloomberg, as much as $37 billion per year exploring and drilling for more oil and gas—went on to express confidence that “we’ll adapt.” Agricultural production areas will be shifted northward, Tillerson suggests. (Never mind, Bill points out, that you can’t just move Iowa to Siberia, and that there isn’t any topsoil in the tundra.) “It’s an engineering problem, with engineering solutions,” intoned Rex—who, as Bill noted on stage, was making $100,000 a day. “No,” Bill replied. “It’s a greed problem. Yours.”
In 2011 and 2012, the tone of the climate movement was shifting. Maybe it all went back to the failure of Copenhagen in 2009, the collapse of climate legislation in the Senate in 2010, and the disillusioning, infuriating lack of climate leadership by Barack Obama.
With a kind of desperation, but with history as a guide, people began talking and writing in earnest about building a genuine grassroots movement, a peoples’ movement, based on something more, something broader and deeper, than all the lobbying money and the corporate-style, K-Street-friendly communications strategies of the big green groups. A movement built on something more like moral outrage—and moral indictment. Bill’s tone had been changing as well. There were hints of it in those brutal opening chapters of his book Eaarth, released in the spring of 2010, where he surveyed the planet’s damage and the all but certain ravages to come. And when the watered-down-to-nothing climate bill finally died in the Senate that summer, he let loose with a much-quoted broadside headlined “We’re hot as hell and we’re not going to take it anymore.” As though finally venting emotions long suppressed (he’s a New Englander, after all), he wrote with trademark but now seething understatement: “I’m a mild-mannered guy, a Methodist Sunday school teacher. Not quick to anger. So what I want to say is: This is fucked up. The time has come to get mad, and then to get busy.” Still every bit the soft-spoken, self-effacing speaker—and still droll, even laugh-out-loud funny, on stage—Bill had both darkened and toughened his message. It was as though, as a person of faith—yes, it’s true, Bill McKibben is a lifelong churchgoer, Sunday school teacher, and sometime preacher—he had discovered his “prophetic voice.” He may not thunder, that will never be his style, but he has become, I want to say, a sort of modern-day Jeremiah. Bill flatly rejects any such comparison. “I’m not a prophet,” he tells me. Full stop. But this much is undeniable: Bill seems to have remembered a basic truth of transformative social movements—that they’re driven not by “positive messaging” (much less any simplistic, poll-tested “win-win” market optimism) but by deep moral conviction and moral outrage at intolerable injustice. The movements that change the world are moral struggles—and spiritual ones. The fact that Bill is a lifelong churchgoing Christian is well known to his friends and colleagues, but no doubt strikes some of his secular readers and fans as strange, possibly a little embarrassing. Reporters have occasionally picked up on this aspect of his life, mentioned it in passing, but what’s rarely if ever explored is just how central Bill’s brand of faith is to his outlook and to the whole arc of his life’s work. If you’re one of those secular readers, I hope you’ll bear with me here. This is not an exercise in self-righteousness, or evangelization, or whatever. Bill is not inclined to any of that (and neither am I). Nor is this by any means simple. No, what I’m trying to do is suggest, as best I can, who Bill McKibben is, where he’s coming from, and what really drives him. Perhaps I should start by mentioning that Bill McKibben was born in Palo Alto, California, in 1960, and grew up comfortably middle class in Toronto and in Lexington, Massachusetts, in the Boston suburbs; that his father was a journalist who worked for BusinessWeek and the Boston Globe and was arrested in 1971 on the Lexington town green supporting an antiwar protest by Vietnam veterans; that his family went to church on Sunday and the church youth group was a big part of his life; that he went to Harvard, where he was editor of the Crimson, and where he became good friends with the late great Reverend Peter J. 
Gomes, rector of Memorial Church; that he went straight on to the New Yorker, where he wrote “Talk of the Town” pieces for five years before quitting in protest when legendary editor William Shawn was forced out; that while in New York he and others started the homeless shelter at the famous Riverside Church; that he and his wife, Sue Halpern, a fellow writer, moved to the Adirondacks, and that he turned his full attention to what was happening to the planet; that they eventually moved to the Green Mountains of Vermont, overlooking the Champlain Valley, where he has taught at Middlebury College ever since. That they have a daughter named Sophie who’s now in her early twenties. That Bill is seldom happier than when he’s out in the woods after a snow. But really, the first thing that should always be said about Bill McKibben is that he’s the guy who wrote, while still in his twenties, The End of Nature. Not just the first book for a general audience about global warming, but without exaggeration, an American classic—a prescient tour de force, in which he reported on what was already the well-advanced science of human-caused climate change, and then proceeded to sketch the broader contours and substance of the subject as we still know it twenty five years on. Rereading the book even now, you realize that there’s been precious little new to say about climate change, in big-picture terms, since Bill explained it to us. Others had of course written about looming ecological catastrophe, “limits to growth,” the Earth’s carrying capacity, and so on. But when it comes to climate change, and its import, Bill was there first. But he didn’t just get the scoop; he went deep. Indeed, that was the real scoop. He thought hard about it, felt it, and wrote a bold, searching, moving—and, like most classics, in some ways idiosyncratic—extended essay on the meaning of what humanity has done to the earth and everything on it. Or more precisely, what our modern civilization has done to it. His subject is not only the fact that we’ve changed the composition of the atmosphere, but what it feels like as one struggles to comprehend the consequences, to take it all on board, philosophically and spiritually. I’m far from alone, I feel sure, when I say that the book has affirmed and clarified my sense of a spiritual crisis at the heart of the climate crisis. How so? Consider that well before the term “Anthropocene” gained currency—the widely accepted idea among Earth scientists that we have left the Holocene and entered an Age of Man, in which humanity itself is now a geological force—Bill argued that our impact on the planet carries world- and worldview-altering significance. We’ve changed everything, even the weather. And so the idea of “nature” as something vastly larger and independent of humanity, of human society, cannot survive. We’ve delivered the death blow. Or as Bill writes:
An idea, a relationship, can go extinct, just like an animal or a plant. The idea in this case is “nature,” the separate and wild province, the world apart from man to which he adapted, under whose rules he was born and died. In the past, we spoiled and polluted parts of that nature, inflicted environmental “damage.” . . . We never thought that we had wrecked nature. Deep down, we never really thought we could. . . .
Of course, as he acknowledges, “natural processes” go on. In fact, “rainfall and sunlight may become more important forces in our lives.” The point is, he writes, “the meaning of the wind, the sun, the rain—of nature—has already changed.” This realization leads him to what is perhaps the book’s central statement: “By changing the weather, we make every spot on earth man made and artificial. We have deprived nature of its independence, and that is fatal to its meaning. Nature’s independence is its meaning; without it there is nothing but us.” I want to come back to that last phrase—“nothing but us”—but before I do, it’s crucial to understand the impact this realization has on Bill as a person of faith. Of course, in typical fashion, he disclaims: “I am no theologian; I am not even certain what I mean by God. (Perhaps some theologians join me in this difficulty.)” But he goes on to ask, “For those of us who have tended to locate God in nature—who, say, look upon spring as a sign of his existence and a clue to his meaning—what does it mean that we have destroyed the old spring and replaced it with a new one of our own devising?” To answer that question, Bill finds himself drawn time and again to the Hebrew Bible’s story of Job. First, however, he has to deal with the often heard environmental critique of the Bible’s creation story, in Genesis, where God gives man “dominion” over the earth and commands him to “subdue” it. There in The End of Nature, Bill joins those who argue that this is far too narrow a reading, and observes that when we take the Bible as a whole, “the opposite messages resound.” Many theologians, he rightly points out, “have contended that the Bible demands a careful ‘stewardship’ of the planet instead of a careless subjugation, that immediately after giving man dominion over the earth God instructed him to ‘cultivate and keep it.’” But even this, he says, fails to really capture the depth of the Bible’s ecological message. For that, he turns to Job—“one of the most far-reaching defenses ever written of wilderness, of nature free from the hand of man.” The Job story, of course, is a staple of Western literature, but to refresh, it goes like this: Job, we are told, is a wealthy, faithful, good, and just man, yet the devil makes a bet with God that if Job is stripped of all his possessions, his children, his happiness—really made to suffer—he will turn and curse God. The Lord is confident, and accepts the wager. Soon, Bill writes, “Job is living on a dunghill on the edge of town, his flesh a mass of oozing sores, his children dead, his flock scattered, his property gone.” But Job, though he curses the day he was born, won’t curse his Maker. He simply wants an explanation for his suffering. He maintains his innocence, and can’t accept the orthodox view offered by his friends that he’s being punished for some sin. Therefore God owes him an answer. What have I done to deserve this? Job demands. Finally, God’s voice speaks to him from out of a whirlwind, and the answer—as Bill puts it in his short book on Job, The Comforting Whirlwind—is “shockingly radical.” It is God’s longest soliloquy in the Bible, and it is unsparing yet beautiful—perhaps, as Bill suggests, the foundation of Western nature writing. In Stephen Mitchell’s striking, poetic translation (the one Bill uses), God asks Job:
Where were you when I planned the earth?
Tell me, if you are so wise.
Do you know who took its dimensions,
measuring its length with a cord? . . .
Were you there when I stopped the waters,
as they issued gushing from the womb?
when I wrapped the ocean in clouds
and swaddled the sea in shadows?
when I closed it in with barriers
and set its boundaries, saying,
“Here you may come, but no farther;
here shall your proud waves break.”
Bill has called this “God’s taunt”—as if the Creator is saying, You little man, who do you think you are, demanding that I explain your suffering? Creation does not revolve around you. God asks (again in Mitchell’s translation): “Who cuts a path for the thunderstorm / and carves a road for the rain— / to water the desolate wasteland, / the land where no man lives; / to make the wilderness blossom / and cover the desert with grass?” Indeed, that is the rub. As Bill writes in The End of Nature: “God seems to be insisting that we are not the center of the universe, that he is quite happy if it rains where there are no people—that God is quite happy with the places where there are no people, a radical departure from our most ingrained notions.” To Bill, this is a profoundly comforting thought—that we are subsumed into something far larger, incomprehensibly powerful, and free of our touch. And so back to Bill’s question: What does it mean that we, or at least some of us, have altered the atmosphere, changed the weather, the storms—that, in effect, we are adding force to the whirlwind? When God asks who set the boundaries of the oceans, Bill writes, “we can now answer that it is us. Our actions will determine the level of the sea, and change the course and destination of every drop of precipitation.” There’s a word for this, Bill likes to say: “blasphemy.” We have usurped God. And considering how our power-grab has worked out, that is not a happy way to be—whether you believe in the Bible’s God or not. As Bill has said countless times in the past few years, we’ve taken creation’s, the planet’s, largest physical features—the Arctic, the oceans, the great glaciers—and we’ve broken them. We’re perhaps a decade away from an ice-free Arctic summer. The oceans are now an ungodly 30 percent more acidic, threatening the base of the marine food chain and all that depend on it. In other words, the end of nature is a pretty miserable place. So it’s not surprising that The End of Nature concludes on a dark and deeply pessimistic note. The book pulls no punches. It’s too honest for that. What we’ve set in motion cannot be undone: “Now it is too late— not too late to ameliorate some of the changes and so perhaps to avoid the most gruesome of their consequences. But the scientists agree that we have already pumped enough gas into the air so that a significant rise in temperature and a subsequent shift in weather are inevitable.” Even had the nations of the world begun “heroic efforts” in the 1980s, he writes, “it wouldn’t have been enough to prevent terrible, terrible changes.” We would still be committed, Bill informs us, to a warming far greater than humans have ever experienced. That—in 1989. And he was right. This leads him to say things like: “If industrial civilization is ending nature, it is not utter silliness to talk about ending—or, at least, transforming— industrial civilization.” That would mean an acceptance of limits, an end to human hubris. Of course it sounds impossible—but what are the alternatives? “It could be that this idea of a humbler world, or some idea like it, is both radical and necessary, in the way that cutting off a leg can be both radical and necessary.” He suggests that there are signs, however small, of such radical new thinking, as among the bio-centric “deep ecologists,” citing Dave Foreman, founder of Earth First!, who drew inspiration from the writings of Edward Abbey (in particular his novel of eco-defense warriors, The Monkey Wrench Gang). 
But that’s about as much hope as Bill will allow himself at the end of the book. Remember that phrase: without nature as an independent force, “there is nothing but us.” Ultimately, he is overwhelmed by a deep sadness and a sense of “loneliness.” Tellingly, I think, in the book’s final pages he even asks, “If nature has already ended, what are we fighting for?” He doesn’t really have an answer. Not yet. Not in The End of Nature. Of course, the idea of nature that Bill pronounced dead is itself a product of the human mind—an artifact of our particular evolution as a species, or really, of a particular civilization. And I want to say, it’s as though Bill’s crisis, the spiritual crisis of The End of Nature, is really the struggle to let go of his own conception—you might call it the bio-centric, late-twentieth-century-environmentalist conception—of what nature means. It’s a struggle not unlike the struggle to let go of a deceased loved one. And if that is the case—if in fact it is too late to save “nature,” if there is “nothing but us”—then yes, the question Bill asks in the end demands an answer: What are we fighting for? At this point, I want to propose another way of looking at Job—the way I’ve taken to viewing the story, one I’ve known since childhood, in light of our catastrophe and in light of my own deepest fear, and despair, for the future. I see Job there on the waste, alone and naked in the dust, covered with ashes, tormented, diseased, his children dead—bereft of everything that he owned and loved. And I hear him crying out (Mitchell again):
God damn the day I was born
and the night that forced me from the womb.
On that day—let there be darkness;
let it never have been created;
let it sink back into the void. . . .
On that night—let no child be born,
no mother cry out with joy. . . .
Let its last stars be extinguished;
let it wait in terror for daylight;
let its dawn never arrive.
We are Job. Worse, our children are—alone on the ash heap, cursing the day they were born. Because, on our current course, Job is the vision of our future, our children’s future—and for far, far too many, from the Philippines to the Rockaways, the vision of our present. It’s not only that human beings have “ended nature” and usurped the place of God, not only that we have inflicted that death on nature, this catastrophe bearing down on us. We—and most of all the innocent, alive today and yet to be born—must suffer it. There is no comfort in the whirlwind. And so, when I get to the end of The End of Nature, I see my friend Bill as a much younger man—a young man, alone, in the throes of the spiritual crisis of our time, who has yet to come to terms with the fact that what we are fighting for, now, is not only the earth but each other. Excerpted from "What We’re Fighting for Now Is Each Other: Dispatches from the Front Lines of Climate Justice" by Wen Stephenson (Beacon Press, 2015). Reprinted with permission from Beacon Press.  

Published on October 31, 2015 13:30