Helen H. Moore's Blog, page 997

September 27, 2015

Ronald Reagan’s “welfare queen” myth: How the Gipper kickstarted the war on the working poor

Welfare’s virtual extinction has gone all but unnoticed by the American public and the press. But also unnoticed by many has been the expansion of other types of help for the poor. Thanks in part to changes made by the George W. Bush administration, more poor individuals claim SNAP than ever before. The State Children’s Health Insurance Program (now called CHIP, minus the “State”) was created in 1997 to expand the availability of public health insurance to millions of lower-income children. More recently, the Affordable Care Act has made health care coverage even more accessible to lower-income adults with and without children. Perhaps most important, a system of tax credits aimed at the working poor, especially those with dependent children, has grown considerably. The most important of these is the Earned Income Tax Credit (EITC). The EITC is refundable, which means that if the amount for which low-income workers are eligible is more than they owe in taxes, they will get a refund for the difference. Low-income working parents often get tax refunds that are far greater than the income taxes withheld from their paychecks during the year. These tax credits provide a significant income boost to low-income parents working a formal job (parents are not eligible if they’re working off the books). Because tax credits like the EITC are viewed by many as being pro-work, they have long enjoyed support from Democrats and Republicans alike. But here’s the catch: only those who are working can claim them. These expansions of aid for the working poor mean that even after a watershed welfare reform, we, as a country, aren’t spending less on poor families than we once did. In fact, we now spend much more. Yet for all this spending, these programs, except for SNAP, have offered little to help two people like Modonna and Brianna during their roughest spells, when Modonna has had no work. To see clearly who the winners and losers are in the new regime, compare Modonna’s situation before and after she lost her job. In 2009, the last year she was employed, her cashier’s salary was probably about $17,500. After taxes, her monthly paycheck would have totaled around $1,325. While she would not have qualified for a penny of welfare, at tax time she could have claimed a refund of about $3,800, all due to refundable tax credits (of course, her employer still would have withheld FICA taxes for Social Security and Medicare, so her income wasn’t totally tax-free). She also would have been entitled to around $160 each month in SNAP benefits. Taken together, the cash and food aid she could have claimed, even when working full-time, would have been in the range of $5,700 per year. The federal government was providing Modonna with a 36 percent pay raise to supplement her low earnings. Now, having lost her job and exhausted her unemployment insurance, Modonna gets nothing from the government at tax time. Despite her dire situation, she can’t get any help with housing costs. So many people are on the waiting list for housing assistance in Chicago that no new applications are being accepted. The only safety net program available to her at present is SNAP, which went from about $160 to $367 a month when her earnings fell to zero. But that difference doesn’t make up for Modonna’s lost wages. Not to mention the fact that SNAP is meant to be used only to purchase food, not to pay the rent, keep the utility company happy, or purchase school supplies. 
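For readers who want to trace the figures, here is a rough back-of-the-envelope sketch, in Python, of the arithmetic described above. The dollar amounts are the approximate ones quoted in the excerpt, not official EITC or SNAP benefit schedules.

# Back-of-the-envelope sketch of the benefit arithmetic described in the excerpt.
# All figures are the approximate ones given above, not actual EITC or SNAP rules.

MONTHS = 12

# While Modonna was working (2009)
take_home_pay = 1_325 * MONTHS        # roughly $15,900 a year after withholding
refundable_credits = 3_800            # EITC and related credits claimed at tax time
snap_while_working = 160 * MONTHS     # roughly $1,920 a year in food aid
aid_while_working = refundable_credits + snap_while_working  # about $5,700

boost = aid_while_working / take_home_pay
print(f"Supplement while working: ${aid_while_working:,} (~{boost:.0%} of take-home pay)")

# After losing the job: no earnings, so no refundable credits at tax time
snap_jobless = 367 * MONTHS           # roughly $4,400 a year, SNAP only
print(f"Total annual aid with no earnings: ${snap_jobless:,}")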
Thus, as Modonna’s earnings fell from $17,500 to nothing, the annual cash and food stamps she could claim from the government also fell, from $5,700 to $4,400. Welfare pre-1996 style might have provided a lifeline for Modonna as she frantically searched for another job. A welfare check might have kept her and her daughter in their little studio apartment, where they could keep their things, sleep in their own beds, take showers, and prepare meals. It might have made looking for a job easier —paying for a bus pass or a new outfit or hairdo that could help her compete with the many others applying for the same job. But welfare is dead. They just aren’t giving it out anymore. Who killed welfare? You might say that it all started with a charismatic presidential candidate hailing from a state far from Washington, D.C., running during a time of immense change for the country. There was no doubt he had a way with people. It was in the smoothness of his voice and the way he could lock on to someone, even over the TV. Still, he needed an issue that would capture people’s attention. He needed something with curb appeal. In 1976, Ronald Reagan was trying to oust a sitting president in his own party, a none-too-easy task. As he refined his stump speech, he tested out a theme that had worked well when he ran for governor of California and found that it resonated with audiences all across the country: It was time to reform welfare. Over the years, America had expanded its hodgepodge system of programs for the poor again and again. In Reagan’s time, the system was built around Aid to Families with Dependent Children (AFDC), the cash assistance program that was first authorized in 1935, during the depths of the Great Depression. This program offered cash to those who could prove their economic need and demanded little in return. It had no time limits and no mandate that recipients get a job or prove that they were unable to work. As its caseload grew over the years, AFDC came to be viewed by many as a program that rewarded indolence. And by supporting single mothers, it seemed to condone nonmarital childbearing. Perhaps the real question is not why welfare died, but why a program at such odds with American values had lasted as long as it did. In fact, welfare’s birth was a bit of a historical accident. After the Civil War, which had produced a generation of young widowed mothers, many states stepped in with “mother’s aid” programs, which helped widows care for their children in their own homes rather than placing them in orphanages. But during the Great Depression, state coffers ran dry. Aid to Dependent Children (ADC), as the program was first called, was the federal government’s solution to the crisis. Like the earlier state programs, it was based on the assumption that it was best for a widowed mother to raise her children at home. In the grand scheme of things, ADC was a minor footnote in America’s big bang of social welfare legislation in 1935 that created Social Security for the elderly, unemployment insurance for those who lost their jobs through no fault of their own, and other programs to support the needy aged and blind. Its architects saw ADC as a stopgap measure, believing that once male breadwinners began paying in to Social Security, their widows would later be able to claim their deceased husbands’ benefits. Yet ADC didn’t shrink over the years; it grew. 
The federal government slowly began to loosen eligibility restrictions, and a caseload of a few hundred thousand recipients in the late 1930s had expanded to 3.6 million by 1962. Widowed mothers did move on to Social Security. But other single mothers —divorcées and women who had never been married —began to use the program at greater rates. There was wide variation in the amount of support offered across the states. In those with large black populations, such as Mississippi and Alabama, single mothers got nickels and dimes on the dollar of what was provided in largely white states, such as Massachusetts and Minnesota. And since the American public deemed divorced or never-married mothers less deserving than widows, many states initiated practices intended to keep them off the rolls. Poverty rose to the top of the public agenda in the 1960s, in part spurred by the publication of Michael Harrington’s The Other America: Poverty in the United States. Harrington’s 1962 book made a claim that shocked the nation at a time when it was experiencing a period of unprecedented affluence: based on the best available evidence, between 40 million and 50 million Americans—20 to 25 percent of the nation’s population—still lived in poverty, suffering from “inadequate housing, medicine, food, and opportunity.” Shedding light on the lives of the poor from New York to Appalachia to the Deep South, Harrington’s book asked how it was possible that so much poverty existed in a land of such prosperity. It challenged the country to ask what it was prepared to do about it. Prompted in part by the strong public reaction to The Other America, and just weeks after President John F. Kennedy’s assassination, President Lyndon Johnson declared an “unconditional war on poverty in America.” In his 1964 State of the Union address, Johnson lamented that “many Americans live on the outskirts of hope —some because of their poverty, and some because of their color, and all too many because of both.” He charged the country with a new task: to uplift the poor, “to help replace their despair with opportunity.” This at a time when the federal government didn’t yet have an official way to measure whether someone was poor. In his efforts to raise awareness about poverty in America, Johnson launched a series of “poverty tours” via Air Force One, heading to places such as Martin County, Kentucky, where he visited with struggling families and highlighted the plight of the Appalachian poor, whose jobs in the coal mines were rapidly disappearing. A few years later, as Robert F. Kennedy contemplated a run for the presidency, he toured California’s San Joaquin Valley, the Mississippi Delta, and Appalachia to see whether the initial rollout of the War on Poverty programs had made any difference in the human suffering felt there. RFK’s tours were organized in part by his Harvard-educated aide Peter Edelman. (Edelman met his future wife, Marian Wright —later founder of the Children’s Defense Fund—on the Mississippi Delta tour. “She was really smart, and really good-looking,” he later wrote of the event.) Dressed in a dark suit and wearing thick, black-framed glasses, Edelman worked with others on Kennedy’s staff and local officials to schedule visits with families and organize community hearings. In eastern Kentucky, RFK held meetings in such small towns as Whitesburg and Fleming-Neon. 
Neither Edelman nor anyone else involved anticipated the keen interest in the eastern Kentucky trip among members of the press, who were waiting to hear whether Kennedy would run for president. Since the organizers had not secured a bus for the press pool, reporters covering the trip were forced to rent their own vehicles and formed a caravan that spanned thirty or forty cars. Edelman remembers that “by the end of the first day we were three hours behind schedule.” Kennedy’s poverty activism was cut short by his assassination in June 1968. But Johnson’s call to action had fueled an explosion in policy making. More programs targeting poor families were passed as part of Johnson’s Great Society and its War on Poverty than at any other time in American history. Congress made the fledgling Food Stamp Program permanent (although the program grew dramatically during the 1970s under President Richard Nixon) and increased federal funds for school breakfasts and lunches, making them free to children from poor families. Social Security was expanded to better serve the poorest of its claimants, Head Start was born, and new health insurance programs for the poor (Medicaid) and elderly (Medicare) were created. What the War on Poverty did not do was target the cash welfare system (by then renamed Aid to Families with Dependent Children, or AFDC) for expansion. Yet the late 1960s and early 1970s marked the greatest period of caseload growth in the program’s history. Between 1964 and 1976, the number of Americans getting cash assistance through AFDC nearly tripled, from 4.2 million to 11.3 million. This dramatic rise was driven in part by the efforts of the National Welfare Rights Organization (NWRO). A group led by welfare recipients and radical social workers, the NWRO brought poor families to welfare offices to demand aid and put pressure on program administrators to treat applicants fairly. The NWRO was also the impetus behind a series of court decisions in the late 1960s and the 1970s that struck down discriminatory practices that had kept some families over the prior decades off the welfare rolls, particularly those headed by blacks, as well as divorced and never-married mothers. Through “man in the house” rules, state caseworkers had engaged in midnight raids to ensure that recipients had no adult males living in the home. In addition, “suitable home” requirements had enabled caseworkers to exclude applicants if a home visit revealed “disorder.” Some instituted “white glove tests” to ensure “good housekeeping.” An applicant could be denied if the caseworker’s white glove revealed dust on a windowsill or the fireplace mantel. When these practices were struck down, the caseloads grew bigger, and with rising caseloads came rising expenditures. No longer was cash welfare an inconsequential footnote among government programs. It was now a significant commitment of the federal and state governments in its own right. As costs increased, AFDC’s unpopularity only grew. The largest, most representative survey of American attitudes, the General Social Survey, has consistently shown that between 60 and 70 percent of the American public believes that the government is “spending too little on assistance for the poor.” However, if Americans are asked about programs labeled “welfare” in particular, their support for assistance drops considerably. Even President Franklin D. 
Roosevelt claimed that “welfare is a narcotic, a subtle destroyer of the human spirit.” Although there is little evidence to support such a claim, welfare is widely believed to engender dependency. Providing more aid to poor single mothers during the 1960s and 1970s likely reduced their work effort somewhat. But it didn’t lead to the mass exodus from the workforce that the rhetoric of the time often suggested. Sometimes evidence, however, doesn’t stand a chance against a compelling narrative. Americans were suspicious of welfare because they feared that it sapped the able-bodied of their desire to raise themselves up by their own bootstraps. By the mid-1970s, with the country grappling with what seemed like a fundamental societal shift, another reason for wariness toward welfare arose. In 1960, only about 5 percent of births were to unmarried women, consistent with the two previous decades. But then the percentage began to rise at an astonishing pace, doubling by the early 1970s and nearly doubling again over the next decade. A cascade of criticism blamed welfare for this trend. According to this narrative, supporting unwed mothers with public dollars made them more likely to trade in a husband for the dole. Once again, no credible social scientist has ever found evidence that the sharp rise in nonmarital childbearing was driven by welfare. While welfare may have led to a small decrease in the rate of marriage among the poor during those years, it could not begin to explain the skyrocketing numbers of births to unwed women. Yet Americans were primed to buy the story that AFDC, a system that went so against the grain of the self-sufficiency they believed in, was the main culprit in causing the spread of single motherhood. And so it was that Ronald Reagan, preparing his run for the presidency during a period when discontent with this stepchild of the welfare state was particularly high, found an issue with broad appeal and seized on it as a way to differentiate himself from his more moderate opponent. His stump speech soon began to feature the “welfare queen”—a villain who was duping the government in a grand style. Unlike the average American, she wasn’t expected to work or marry. The father or fathers of her offspring were given a pass on the responsibility of caring for the children they sired. The campaign even found a woman who became the symbol of all that was wrong with welfare. In a speech in January 1976, Reagan announced that she “[has] used 80 names, 30 addresses, 15 telephone numbers to collect food stamps, Social Security, veterans benefits for four nonexistent, deceased veteran husbands, as well as welfare. Her tax-free cash income alone has been running $150,000 a year.” As he punctuated the dollar value with just the right intonation, audible gasps could be heard from the crowd. Reagan’s claims were loosely based on a real person. Hailing from Chicago, Linda Taylor was a character as worthy of the big screen as Reagan himself. In a profile in Slate, Josh Levin wrote that in the 1970s alone, “Taylor was investigated for homicide, kidnapping, and baby trafficking.” She was implicated in multiple counts of insurance fraud and had numerous husbands, whom she used and discarded. Without a doubt, she was a real villain. But she was very far from a typical welfare recipient. Although negative racial stereotypes had plagued welfare throughout its existence, the emphasis on race was more widespread and virulent after Reagan turned his focus to the system. 
His welfare queen soon became deeply ingrained in American culture. She was black, decked out in furs, and driving her Cadillac to the welfare office to pick up her check. None of these stereotypes even came close to reflecting reality, particularly in regard to race. It was true that as of the late 1960s and beyond, a disproportionate percentage of blacks participated in AFDC. But there was never a point at which blacks accounted for a majority of recipients. The typical AFDC recipient, even in Reagan’s day, was white.

Reagan lost the Republican primary to Ford in 1976 but defeated President Jimmy Carter in 1980. As president, Reagan took a somewhat softer tone, rhetorically portraying the welfare recipient as more of a victim of bad public policy than a villain. Like FDR, President Reagan viewed the poor as caught up in a system that acted like a narcotic. He was buoyed by the work of the libertarian social scientist Charles Murray, whose influential 1984 book Losing Ground argued that social welfare policies had increased long-term poverty. Murray’s logic was simple: Pay women to stay single and have babies, and more of them will do so. Pay them not to work, and you have a double disaster on your hands. Murray laid the blame for continuing high rates of poverty squarely at the feet of the welfare system. By discouraging both work and marriage, the system was ensuring that millions of American women and children remained poor. In his second inaugural address, Reagan argued for Murray’s thesis; his call was to help the poor “escape the spider’s web of dependency.”

Despite this grand narrative and call to action, the changes Reagan was able to make to the welfare system were not extensive. The most notable legislative accomplishment of the 1980s was the Family Support Act, a bipartisan effort by conservatives and New Democrats who sought to distance themselves from the tax-and-spend image that was losing them seats in Congress. Arkansas governor Bill Clinton was a leader among the latter group. The act was the most significant attempt to date to put teeth into a work requirement for the welfare poor and to enhance child support enforcement. Those with new requirements imposed upon them were supposed to work at least part-time or to participate in a training program, but there were numerous exemptions. In the end, the program amounted to little more than an unfunded mandate. There was a jobs program with a catchy acronym (JOBS, standing for “job opportunities and basic skills”), but few states took their part seriously, and life changed for only a small fraction of welfare recipients.

President Reagan famously quipped that “we waged a war on poverty, and poverty won.” Judged by the size of the welfare rolls, Reagan’s campaign against welfare was at least as futile. By 1988, there were 10.9 million recipients on AFDC, about the same number as when he took office. Four years later, when Reagan’s successor, George H. W. Bush, left office, the welfare caseloads reached 13.8 million—4.5 million adults and their 9.3 million dependent children. How was it that welfare, an immensely unpopular program, could withstand such an offensive? If welfare’s chief nemesis, Ronald Reagan, had failed, who possibly stood a chance?

Excerpted from "$2 a Day: Living on Almost Nothing in America" by Kathryn J. Edin and H. Luke Shaefer. Published by Houghton Mifflin Harcourt. Copyright 2015 by Kathryn J. Edin and H. Luke Shaefer. Reprinted with permission of the publisher. All rights reserved.

Published on September 27, 2015 08:59

Flying domestically just got that much more miserable for people in these 4 states

AlterNet: Thanks to provisions in the little-known REAL ID Act – passed in 2005 – residents of four states will soon be unable to use a regular driver's license to fly even within the continental United States.

The Department of Homeland Security has named New York, Louisiana, Minnesota, New Hampshire, and the territory of American Samoa as the locations whose residents will be required to use alternative forms of identification to fly on commercial airplanes.

Although no official reason has been given for why these states and territories were singled out, it may be because their driver's licenses – the traditional form of identification used at airports – don't comply with the federal REAL ID standards. According to Travel and Leisure:

"The new rules will go into effect sometime in 2016 (the exact date has not been announced), and there will be a three-month forgiveness period, during which people with these licenses will be warned that their IDs are no longer valid for flights.

Here’s the breakdown: if you're from one of these states, “acceptable” IDs include passports and passport cards, as well as permanent resident cards, U.S. military ID, and DHS trusted traveler cards such as Global Entry and NEXUS. The TSA will also accept Enhanced Driver’s Licenses, the kind that are currently used to replace passports for travel to and from Canada, Mexico, and the Caribbean. Of the noncompliant states, only New York and Minnesota issue enhanced licenses.

The new DHS enforcement is rooted in the REAL ID Act, passed in 2005 based on the recommendation of the 9/11 Commission that the government should “set standards for the issuance of sources of identification, such as driver's licenses,” according to the Department of Homeland Security's brief.


Published on September 27, 2015 08:00

Ben Carson’s great betrayal: How he ignores history in favor of the Republican Party

The Black Freedom Struggle began in America when the first Africans were brought to Florida in 1581. It continued onward through emancipation and reconstruction as black Americans “built a nation under their feet”, resisting chattel slavery, self-manumitting, taking up arms, and then building political and social institutions across the South and the rest of the United States. The Black Freedom Struggle would reach its peak with the Civil Rights Movement and be seared into American public memory with the Great March on Washington for Jobs and Freedom, and iconic speeches by Dr. King and others. The Civil Rights Movement continues today with Black Lives Matter and the centuries-long fight by black and brown folks against police thuggery, for a more equitable society, dignity, and full human rights for all peoples on both sides of the color line. The Black Freedom Struggle inspired other groups—women, gays and lesbians, the differently-abled—in the United States to resist and fight Power. It has also been a source of inspiration for people’s movements around the world. Of course, the individuals who led (and lead) the Black Freedom Struggle are not perfect. They, like all of us, are flawed. Black resistance to white supremacy occasionally (both necessarily and understandably) involved moments of fleeting flirtation with racial chauvinism. And one cannot overlook how political stagecraft and cruel realpolitik tried to erase the leadership role played by gays and lesbians in the Civil Rights Movement--this is a shameful blemish on the radically humanistic and transformative vision of American life offered by that glorious struggle. But in all, the Black Freedom Struggle has been a source of inspiration; black Americans are the moral conscience of a nation. Black America has earned that title even as much as it has been unfairly forced upon it. In that idealized role, black Americans are called to defend the weak against the strong, speak truth to power, and force America to live up to the promise of its democratic creed and vision. This obligation can give strength, clarity of purpose and energy to Black Americans and others who honor that legacy. Being part of a community that is “the miner’s canary” and “moral conscience of a nation” can exact a heavy burden. As such, some black folks have decided that the burden and obligation are too great to carry. Their shoulders are too narrow and weak. Ben Carson, black conservative and 2016 Republican presidential primary candidate, is one such person. Last week, Ben Carson surrendered to xenophobia, nativism, and intolerance when he suggested that Muslims are inherently incapable of being President of the United States because their faith is incompatible with the Constitution. As reported by CNN, in a conversation on Wednesday of this week Carson then suggested:
"I find black Republicans are treated extremely well in the Republican Party. In fact, I don't hear much about being a black Republican," he said Wednesday at an event in Michigan. "I think the Republicans have done a far superior job of getting over racism."
Carson was a Democrat for years, but said he's found the Republican Party to be more welcoming. "When you look at the philosophies of the two parties now, what I have noticed as a black Republican is that Republicans tend to look more at the character of people. And Democrats tend to look more at the color of their skin," he said Wednesday.

Ben Carson’s comments are delusional, hypocritical, and vexing. Carson, like many movement conservatives, is a Christian theocrat who wants to weaken the boundaries between church and state in the United States. Carson, like other contemporary American conservatives, fetishizes the Constitution except when he wants to radically alter it: His suggestion that there should be a religious litmus test for office actually violates Article VI. Black Americans are not lockstep or uniform in their political beliefs. Spirited disagreement is central to black American political life. But for Carson to suggest that the Republican Party, with its Birtherism, Southern Strategy of overt and covert racism, and clear examples of “old fashioned” anti-black animus in the Age of Obama, is somehow a force for racial “progress” is an analysis that can only be offered by a person who is possessed of some sort of Stockholm Syndrome or willfully blind to empirical reality.

Ben Carson’s pandering to Islamophobia is a violation of the Black Freedom Struggle’s spirit that black folks, as unique victims of Power in America, have a moral obligation to stand with the weak against the strong. Ultimately, he has rejected the legacy and burden of the Black Freedom Struggle. These are not meritorious acts of radical autonomy or individuality. Rather, they are acts of cowardice and betrayal. But if one rejects the Black Freedom Struggle, what does one replace it with?

Black conservatives such as Ben Carson receive head-patting approval from white conservatives. The primary role of black conservatives in the post-civil rights era, as I have suggested many times both here at Salon and elsewhere, is to serve as human chaff and a defense shield against claims that white racism exists—and that today’s Republican Party is an organization whose “name brand” is based on mining white racial resentment, rage, and animus. Ben Carson, like Herman Cain before him, Supreme Court Justice Clarence Thomas, and the panoply of black conservatives trotted out on Fox News and elsewhere to excuse-make for white racism, is a professional “black best friend” for the Republican Party.

Ben Carson’s rejection of the Black Freedom Struggle and public embrace of Islamophobia is also very lucrative. Black conservatives, like women who reject feminism, gays and lesbians who oppose marriage equality, and Hispanics and Latinos who publicly bloviate against “illegal immigrants,” occupy a very lucrative niche in the right-wing media and entertainment apparatus. In the mid- to long term, Carson’s black conservative hustle will earn him money on the lecture circuit. In the short term, Carson’s Islamophobia has garnered at least $1 million in donations to his campaign. Betraying the Black Freedom Struggle is both ego-gratifying for black conservatives—they are deemed by the White Right the “special” or “good” black who is not like the “other ones”—and financially lucrative. 
How do Black conservatives such as Ben Carson and Clarence Thomas, among others, reconcile their rejection of the Black Freedom Struggle with the fact that they, as members of the black elite and professional classes, are direct beneficiaries and products of it? They can imagine themselves as the true holders of the flame who are defending Black America’s “real interests” from trickery and deception by Democrats who want to keep black folks on a “plantation”. This is specious and insulting, of course, as such claims assume that black Americans are stupid and dumb and, unlike white folks, have no ability to make rational political calculations about their own collective self-interest. Contemporary black conservatives could also choose to rewrite the last 70 years or so of history--Republicans are the saviors of black Americans for all time; Democrats are permanent enslavers and Klansmen. In this imagined world, the Civil Rights Movement and its won-in-blood-and-death victories--such as the Voting Rights Act--are somehow no longer needed. Moreover, protections for Black Americans which acknowledge the unique and continuing threat to their right to vote and full citizenship are somehow condescending and infantilizing. This is the logic of Clarence Thomas in his neutering of the Voting Rights and Civil Rights Acts. This betrayal of one of the core tenets of the Black Freedom Struggle is also tacitly and actively endorsed by black conservatives who are members of the Republican Party, because the latter’s strategy and goal for maintaining electoral power in the present and future is to limit the ability of non-whites to vote.

My claims here are not at all based on some type of inexorable race essentialism or related fictions of “biological race.” The mantle of the Black Freedom Struggle, the miner’s canary, and the calling to be the moral conscience of a nation are a function of history, values, political socialization, linked fate, the “blues sensibility”, and the “love principle” that have driven black American freedom and resistance in the United States and elsewhere. Black conservatives in the post-civil-rights era are of that legacy while still having chosen to turn their backs on it. And others like Ben Carson, men and women influenced by radical Christian fundamentalism and cultivated ignorance of the historical and contemporary realities of the color line and American politics, are black conservative Don Quixotes, stuck in a fantasy world, fighting windmills, chimeras, and other enemies that do not exist. In their made-up world, lies and fantasies are more comforting than hard realities and truths. Ben Carson and other black conservatives may have turned their backs on the Black Freedom Struggle — but it still claims them nonetheless.
The Civil Rights Movement continues today with Black Lives Matter and the centuries-long fight by black and brown folks against police thuggery, for a more equitable society, dignity, and full human rights for all peoples on both sides of the color line. The Black Freedom Struggle inspired other groups—women, gays and lesbians, the differently-abled—in the United States to resist and fight Power. It has also been a source of inspiration for people’s movements around the world. Of course, the individuals who led (and lead) the Black Freedom Struggle are not perfect. They, like all of us, are flawed. Black resistance to white supremacy occasionally (both necessarily and understandably) involved moments of fleeting flirtation with racial chauvinism. And one cannot overlook how political stagecraft and cruel realpolitik tried to erase the leadership role played by gays and lesbians in the Civil Rights Movement--this is a shameful blemish on the radically humanistic and transformative vision of American life offered by that glorious struggle. But in all, the Black Freedom Struggle has been a source of inspiration; black Americans are the moral conscience of a nation. Black America has earned that title even as much as it has been unfairly forced upon it. In that idealized role, black Americans are called to defend the weak against the strong, speak truth to power, and force America to live up to the promise of its democratic creed and vision. This obligation can give strength, clarity of purpose and energy to Black Americans and others who honor that legacy. Being part of a community that is “the miner’s canary” and “moral conscience of a nation” can exact a heavy burden. As such, some black folks have decided that the burden and obligation are too great to carry. Their shoulders are too narrow and weak. Ben Carson, black conservative and 2016 Republican presidential primary candidate, is one such person. Last week, Ben Carson surrendered to xenophobia, nativism, and intolerance when he suggested that Muslims are inherently incapable of being President of the United States because their faith is incompatible with the Constitution. As reported by CNN, in a conversation on Wednesday of this week Carson then suggested:
"I find black Republicans are treated extremely well in the Republican Party. In fact, I don't hear much about being a black Republican," he said Wednesday at an event in Michigan. "I think the Republicans have done a far superior job of getting over racism."
Carson was a Democrat for years, but said he's found the Republican Party to be more welcoming. "When you look at the philosophies of the two parties now, what I have noticed as a black Republican is that Republicans tend to look more at the character of people. And Democrats tend to look more at the color of their skin," he said Wednesday. Ben Carson’s comments are delusional, hypocritical, and vexing. Carson, like many movement conservatives, is a Christian theocrat who wants to weaken the boundaries between church and state in the United States. Carson, like other contemporary American conservatives, fetishizes the Constitution except when he wants to radically alter it: His suggestion that there should be a religious litmus test for office actually violates Article VI. Black Americans are not lockstep or uniform in their political beliefs. Spirited disagreement is central to black American political life. But for Carson to suggest that the Republican Party, with its Birtherism, Southern Strategy of overt and covert racism, and clear examples of “old fashioned” anti-black animus in the Age of Obama, is somehow a force for racial “progress” is an analysis that can only be offered by a person who is possessed of some sort of Stockholm Syndrome or is willfully blind to empirical reality. Ben Carson’s pandering to Islamophobia violates the Black Freedom Struggle’s spirit, which holds that black folks, as unique victims of Power in America, have a moral obligation to stand with the weak against the strong. Ultimately, he has rejected the legacy and burden of the Black Freedom Struggle. These are not meritorious acts of radical autonomy or individuality. Rather, they are acts of cowardice and betrayal. But if one rejects the Black Freedom Struggle, what does one replace it with? Black conservatives such as Ben Carson receive head-patting approval from white conservatives. The primary role of black conservatives in the post-civil-rights era, as I have suggested many times both here at Salon and elsewhere, is to serve as human chaff and a defense shield against claims that white racism exists—and that today’s Republican Party is an organization whose “name brand” is based on mining white racial resentment, rage, and animus. Ben Carson, like Herman Cain before him, Supreme Court Justice Clarence Thomas, and the panoply of black conservatives trotted out on Fox News and elsewhere to excuse-make for white racism, is one of the professional “black best friends” of the Republican Party. Ben Carson’s rejection of the Black Freedom Struggle and public embrace of Islamophobia is also very lucrative. Black conservatives, like women who reject feminism, gays and lesbians who oppose marriage equality, and Hispanics and Latinos who publicly bloviate against “illegal immigrants,” occupy a very profitable niche in the right-wing media and entertainment apparatus. In the mid- to long-term, Carson’s black conservative hustle will earn him money on the lecture circuit. In the short-term, Carson’s Islamophobia has garnered at least $1 million in donations to his campaign. Betraying the Black Freedom Struggle is both ego-gratifying for black conservatives—they are deemed by the White Right the “special” or “good” black who is not like the “other ones”—and financially lucrative. 
How do Black conservatives such as Ben Carson and Clarence Thomas reconcile their rejection of the Black Freedom Struggle with the fact that they, as members of the black elite and professional classes, are direct beneficiaries and products of it? They can imagine themselves as the true holders of the flame who are defending Black America’s “real interests” from trickery and deception by Democrats who want to keep black folks on a “plantation.” This is specious and insulting, of course, as such claims assume that black Americans are stupid, dumb, and, unlike white folks, have no ability to make rational political calculations about their own collective self-interest. Contemporary black conservatives could also choose to rewrite the last 70 years or so of history--Republicans are the saviors of black Americans since time immemorial; Democrats are permanent enslavers and Klansmen. In this imagined world, the Civil Rights Movement, and its won-in-blood-and-death victories -- such as the Voting Rights Act -- is somehow no longer needed. Moreover, protections for Black Americans which acknowledge the unique and continuing threat to their right to vote and full citizenship are somehow condescending and infantilizing. This is the logic of Clarence Thomas in his neutering of the Voting Rights and Civil Rights Acts. This betrayal of one of the core tenets of the Black Freedom Struggle is also tacitly and actively endorsed by black conservatives who are members of the Republican Party, because the latter’s strategy and goal for maintaining electoral power in the present and future is to limit the ability of non-whites to vote. My claims here are not at all based on some type of inexorable race essentialism or related fictions of “biological race.” The mantle of the Black Freedom Struggle, the miner’s canary, and the calling to be the moral conscience of a nation are a function of the history, values, political socialization, linked fate, “blues sensibility,” and “love principle” that have driven black American freedom and resistance in the United States and elsewhere. Black conservatives in the post-civil-rights era are of that legacy even as they have chosen to turn their backs on it. And others like Ben Carson, men and women influenced by radical Christian fundamentalism and cultivated ignorance of the historical and contemporary realities of the color line and American politics, are black conservative Don Quixotes, stuck in a fantasy world, fighting windmills, chimeras, and other enemies that do not exist. In their made-up world, lies and fantasies are more comforting than hard realities and truths. Ben Carson and other black conservatives may have turned their backs on the Black Freedom Struggle — but it claims them nonetheless.

Continue Reading...










Published on September 27, 2015 07:30

September 26, 2015

Confronted by my own bullsh*t: I wanted to be the voice of nonviolence for my church after George Zimmerman’s acquittal, but all I could do was cry for all my inconsistencies

"Well, friend, want to go to the shooting range with me?” Clayton said, his light brown eyes lighting up mischievously. We were stretching before the CrossFit class Clayton coaches when I mentioned that I’d recently realized he was my “token conservative friend,” the way some people have a “token black friend.” His response was to invite me to, of all things, a gun range. I reached for my toes as my liberal gun-control-advocate self immediately and gleefully replied, “Seriously? Of course I do.” Because politics should never, if at all possible, get in the way of fun. Little did I know that this would be one of several experiences, during what turned out to be the week of the George Zimmerman acquittal, that would make it virtually impossible for me to claim the liberal outrage/moral high ground I would later wish I could maintain, since life and its ambiguities sometimes throw our ideals into crisis. * A few days after his offer, I saw Clayton’s short, muscular frame walking up to my front door carrying a heavy black bag. He was there for a quick gun-safety lesson before we headed to the range, since I had never in my life actually held a handgun. Clayton is a Texan, a Republican, and a Second Amendment enthusiast. But since Clayton has a degree from Texas A&M and has lived part of his life in Saudi Arabia, where his dad was an oil man, he describes himself as a “well-educated and well-traveled redneck.” “There are four things you need to know,” Clayton said, beginning my very first gun-safety lesson. “One, always assume every gun you pick up is loaded. Two, never aim a gun at something you do not intend to destroy. Three, keep your finger off the trigger until you are ready to fire, and four, know your target and what’s beyond it. A gun is basically a paperweight. In and of themselves,” he claimed, “they are only dangerous if people do not follow these rules.” I’m not sure what the statistics are on gun-shaped paperweight deaths, I thought, but I’ll be sure to look that up. “Okay, ready?” Clayton asked. “I have no idea,” I replied. He placed a matte black handgun and a box of ammunition on our kitchen table, and it felt as illicit as if he had just placed a kilo of cocaine or a stack of Hustler magazines on the very surface where we pray and eat our dinners as a family. I tried to ask some intelligent questions. “What kind of gun is this?” “It’s a 40.” Like I had any idea what the hell that meant. “What’s a 9mm? I’ve heard a lot about those.” “This is.” And he lifted his shirt up to show his concealed handgun. “Man, you don’t carry that thing around all the time, do you?” He smiled. “If I’m not in gym shorts or pajamas, yes.” Later, in the firing-range parking lot, filled almost exclusively with pickup trucks, I made the astute observation, “No Obama bumper stickers.” “Weird, huh?” he joked. When I’m someplace cool, say an old cathedral or a hipster ice cream shop, I am sure to check in on Facebook. But not here. Partly because it was Monday morning and Clayton had penciled in our fun shooting date as a “work meeting,” but also because I didn’t want a rash of shit from my friends or parishioners— almost all of whom are liberal— asking if I’d lost my mind or simply been abducted by rednecks. 
As we stepped onto the casing-littered, black rubber-matted floor of the indoor firing range, I was aware of several important points: one, our guns were loaded and intended to destroy the paper target in front of us; two, I should put my finger on the trigger only when I intended to shoot; three, a wall of rubber and concrete was behind my target; and four, I sweat. I knew from hearing gunshots in my neighborhood that guns were loud. And I knew from the movies that there was a kickback when a gun was fired. But, holy shit, was I unprepared for how loud and jolting firing a handgun would be. Or how fun. We shot for about an hour, and after we were done, Clayton told me that I did pretty well, for a first-timer. (Except for when a hot shell casing went down my shirt and I jerked around so mindlessly that he had to reach over and turn the loaded gun in my hand back toward the target, making me feel like a total dumbass. A really, really dangerous dumbass.) But I loved it. I loved it like I love roller coasters and riding a motorcycle: not something I want in my life all the time, but an activity that is fun to do once in a while, that makes me feel like I’m alive and a little bit lethal. “Can we shoot skeet next time?” I asked eagerly as we made our way back to the camo-covered front desk to retrieve our IDs. The whole shop looked like a duck blind. As though if something dangerous or tasty came through the front door, all the young, acne-ridden guys who work there could take it down without danger of being spotted. On the way back to my house, I suggested we stop for pupusas (stuffed Salvadorian corn cakes) so that we both could have a novel experience on a Monday. Sitting at one of the five stools by the window at Tacos Acapulco— looking out on the check-cashing joints and Mexican panaderias that dot East Colfax Avenue— I took the opportunity to ask a burning question: “So why in the world do you want to carry a gun all the time?” I’d never knowingly been this close to a gun-carrier before, and it felt like my chance to ask something I’d always wanted to know. I could only hope my question didn’t feel to him like it does to our black friend Shayla when people ask to touch her afro. As he tried managing with his fork the melted cheese that refused to detach itself from the pupusa, he said, “Self-defense, and pride of country. We have this right, so we should exercise it. Also if someone tried to hurt us while we were sitting here, I could take them down.” It was a foreign worldview to me, that people could go through life so aware of the possibility that someone might try to hurt them, and that, as a response, they would strap a gun on their body as they made their way through Denver. I didn’t understand or even approve. But Clayton is my token conservative friend, and I love him, and he went through the trouble of taking me to the shooting range, so I left it there. * The week I went shooting with Clayton was also the week of my mother’s seventieth and my sister’s fiftieth birthday party. It was a murder mystery dinner, so, five nights after blasting paper targets with Clayton at the shooting range, I sat on the back patio of my parents’ suburban Denver home and pretended to be a hippie winemaker for the sake of a contrived drama. 
Normally, my natural misanthropy would prevent me from participating in such awkward nonsense, but I soon remembered how many times I had voluntarily dressed myself up and played a role in other contrived dramas that didn’t involve a four-course meal or civil company (like the year I tried to be a Deadhead), so I submitted to the murder mystery dinner for the sake of two women I love. My role called for a flowing skirt, peasant blouse, and flowers in my hair— none of which I own or could possibly endure wearing, so a nightgown and lots of beads had to do the trick. Throughout the mostly pleasant evening, I would see Mom talking to my brother out of the side of her mouth, just like she did when we were kids and she wanted to tell Dad something she didn’t want us to know. I watched my mom, unaware that an unscripted drama was unfolding around the edges of the fictional one that called for flowers in my fauxhawk. As I snuck into the kitchen to check my phone for messages, my dad followed me to fill me in on what was happening. It turns out my mom’s side-of-the-mouth whispers were about something serious. My mom had been receiving threats from an unbalanced (and reportedly armed) woman who was blaming my mom for a loss she had experienced. My mom had nothing to do with this loss, but that didn’t stop this woman from fixating on her as the one to blame. And she knew where my mom went to church on Sundays. “It’s made being at church pretty tense for us,” my father told me. My older brother Gary, who is a law enforcement officer in a federal prison and who, along with his wife and three kids, attends the same church as my parents, walked by Dad and me in the kitchen and said, “Horrible, right? The past three weeks I’ve carried a concealed weapon to church in case she shows up and tries anything.” I immediately thought of Clayton and his heretofore foreign worldview, weighing it against how I now felt instinctually glad that my brother would be able to react if a crazy person tried to hurt our mother. And how, at the same time, it felt like madness that I would be glad someone was carrying a gun to church. But that’s the thing about my values—they tend to bump up against reality, and when that happens, I may need to throw them out the window. That, or I ignore reality. For me, more often than not, it’s the values that go. My gut reaction to my brother’s gun-carrying disturbed me, but not as much in the moment as it would the next morning. * On the night of the party, I missed the breaking news that George Zimmerman, who had shot and killed unarmed teen Trayvon Martin, had been found not guilty on all counts. For more than a year, the case had ignited fierce debate over racism and Florida’s “Stand Your Ground” law, which allows the use of violent force if someone believes their life is being threatened. My Facebook feed was lit up with protests, outrage, and rants. I wanted to join in and act as a voice for nonviolence that week, but when I heard on NPR that George Zimmerman’s brother was saying he rejected the idea that Trayvon Martin was unarmed, Martin’s weapon being the sidewalk on which he broke George’s nose, well, my first reaction was not nonviolence but an overwhelming urge to reach through the radio and give that man a fully armed punch in the throat. Even more, that very week, a federal law enforcement officer was carrying a concealed weapon into my mom’s church every Sunday. 
Which is insane and something I would normally want to post a rant about on my Facebook wall for all the liberals like me to “like.” Except in this case, that particular law enforcement officer (a) was my brother, and (b) carried that weapon to protect his (my) family, his (my) mother, from a crazy woman who wanted her dead. When I heard that my brother was armed to protect my own mom, I wasn’t alarmed like any good gun-control supporting pastor would be. I was relieved. And now what the hell do I post on Facebook? What do I do with that? I also had to deal with the fact that I simply could not express the level of antiracist outrage I wanted to, knowing something that no one else would know unless I said it out loud: despite my politics and liberalism, when a group of young black men in my neighborhood walk by, my gut reaction is to brace myself in a different way than I would if those men were white. I hate this about myself, but if I said that there is not residual racism in me, racism that— after forty-four years of being reinforced by messages in the media and culture around me— I simply do not know how to escape, I would be lying. Even if I do own an “eracism” bumper sticker. The morning after the George Zimmerman verdict, as I was reflecting on what to say to my church about it, I wanted to be a voice for nonviolence, antiracism, and gun-control as I felt I should (or as I saw people on Twitter demanding: “If your pastor doesn’t preach about gun control and racism this week, find a new church”) — but all I could do was stand in my kitchen and cry. Cry for all my inconsistencies. For my parishioner and mother of two, Andrea Gutierrez, who said to me that mothers of kids with brown and black skin now feel like their children can legally be target practice on the streets of suburbia. For a nation divided — both sides hating the other. For all the ways I silently perpetuate the things I criticize. For the death threats toward my family and the death threats toward the Zimmerman family. For Tracy Martin and Sybrina Fulton, whose child, Trayvon, was shot dead, and who were told that it was more his fault than the fault of the shooter. Moments after hearing about the acquittal, I walked my dog and called Duffy, a particularly thoughtful parishioner. “I’m really screwed up about all of this,” I said, proceeding to detail all the reasons that, even though I feel so strongly about these issues, I could not with any integrity “stand my own ground” against violence and racism — not because I no longer believe in standing against those things ( I do), but because my own life and my own heart contain too much ambiguity. There is both violence and nonviolence in me, and yet I don’t believe in them both. She suggested that maybe others felt the same way and that maybe what they needed from their pastor wasn’t the moral outrage and rants they were already seeing on Facebook; maybe they just needed me to confess my own crippling inconsistencies as a way for them to acknowledge their own. That felt like a horrible idea, but I knew she was right. So often in the church, being a pastor or a “spiritual leader” means being the example of “godly living.” A pastor is supposed to be the person who is really good at this Christianity stuff — the person others can look to as an example of righteousness. But as much as being the person who is the best Christian, who “follows Jesus” the most closely can feel a little seductive, it’s simply never been who I am or who my parishioners need me to be. 
I’m not running after Jesus. Jesus is running my ass down. Yeah, I am a leader, but I’m leading them onto the street to get hit by the speeding bus of confession and absolution, sin and sainthood, death and resurrection— that is, the gospel of Jesus Christ. I’m a leader, but only by saying, “Oh, screw it. I’ll go first.” I stood the next day in the copper light of sundown in the parish hall where House for All Sinners and Saints meets and confessed all of this to my congregation. I told them there had been a million reasons for me to want to be the prophetic voice for change, but every time I tried, I was confronted by my own bullshit. I told them I was unqualified to be an example of anything but needing Jesus. That evening I admitted to my congregation that I had to look at how my outrage feels good for a while, but only like eating candy corn feels good for a while— I know it’s nothing more than empty calories. My outrage feels empty because what I am desperate for is to speak the truth of my burden of sin and have Jesus take it from me, yet ranting about the system or about other people will always be my go-to instead. Because maybe if I show the right level of outrage, it’ll make up for the fact that every single day of my life I have benefitted from the very same system that acquitted George Zimmerman. My opinions feel good until I crash from the self-righteous sugar high, then realize I’m still sick and hungry for a taste of mercy. * The first time I was asked to give a lecture on preaching at the Festival of Homiletics, a national conference for preachers, they wanted me to give a talk on what preaching is like at House for All. I wasn’t sure what to say, so I asked my congregation. There was passion in their replies, and none of it had to do with how much they appreciate their preacher being such an amazing role model for them. Not one of them said they love all the real-life applications they receive in the sermons for how to have a more victorious marriage. Almost all of them said they love that their preacher is so obviously preaching to herself and just allowing them to overhear it. My friend Tullian put it this way: “Those most qualified to speak the gospel are those who truly know how unqualified they are to speak the gospel.” Never once did Jesus scan the room for the best example of holy living and send that person out to tell others about him. He always sent stumblers and sinners. I find that comforting. Reprinted from "ACCIDENTAL SAINTS: FINDING GOD IN ALL THE WRONG PEOPLE." Copyright © 2015 by Nadia Bolz-Weber. Published by Convergent Books, an imprint of Penguin Random House LLC.

Continue Reading...










Published on September 26, 2015 15:00

Praying at the church of rock and roll: How John Lennon made me a skeptic, Morrissey made me a believer and “Exile on Main Street” never let me down

“We believe in nothing, Lebowski!” say the nihilists who harass The Dude in his tub. I always equated nihilism with punk rock — there’s that song “88 Lines About 44 Women,” by The Nails, where the lead singer runs down the list of the bad relationships he’s had -- Jackie the “rich punk rocker,” Sarah the “modern dancer,” Suzy the Ohioan Scientologist -- and then he points out that Terry “didn’t give a shit, was just a nihilist.” Whereas Suzy probably believed in quite a lot, some of it hard to wrap one’s head around, I always favored Terry. That single hit the radio in 1982, right around the time many Reagan-supported Christian fundamentalists were gaining power, profit and influence, especially via their cable Sunday morning network TV exposure. In the Middle East, fundamentalist Muslims had succeeded in ousting the Shah via a passionate revolution. Scientology was also expanding. I think they annexed Tom Cruise during this era. There was even the Church of the SubGenius, led by pipe-smoking Bob Dobbs, which was appealingly satirical, though I wasn’t sure if it was a joke or not. Most of all, it made me shrink. I envied those who believed... in anything at all. I was born in the late 1960s and raised “culturally Jewish,” in that I loved baseball, jazz, Woody Allen and that mushroom and barley mixture they served us on holidays. I loved Anne Frank. I still have a certificate on my bulletin board which reminds me that I have a couple of trees planted in Israel, but I’ve never visited them and do not intend to now. I was Bar Mitzvahed on Coney Island (not in Luna Park or on the boardwalk in front of the corn and clams hut, but in an actual temple) but had no idea what I was saying during the manhood ritual. I read my Haftorah phonetically and, let’s face it, went through the whole thing so that I could have a dance party (“Tainted Love” was the big hit, as was anything off "Dare" and Yaz’s “Situation,” and I got the message Soft Cell and Phil Oakey were laying down easily). Nobody in my family pushed me toward any religion. I don't blame them. I actually felt lucky at the time. So many of my friends had no choice. If anything, religion seemed like a lot of rules and study to me, and like Joey Ramone, another secular Jew (I suspect), I didn't wanna be learned or tamed. I had a hard enough time mastering trigonometry and all the muffled lyrics to the new R.E.M. EP, much less the ins and outs of the holy Torah. This is a roundabout way of saying I, too, believed in nothing. And now, over 30 years later, I find that I am a middle-aged man who still believes in zip. The difference is that as I find myself on the wrong side of 45, I have only just started to wonder why. Autumn's kick-off this week brings, as it does every year, a new birthday (for me and for Gwen Stefani, who is only a day younger than I am), as well as the baseball playoffs and the hint of Christmas and Chanukah and Kwanzaa (seven weeks away — prepare yourselves, it's coming). Once again, I find myself surrounded on all sides by devout New Yorkers, and happy children on my TV. With regard to sports, I know people pray for the Yankees (or the Red Sox, or this year the Mets) to succeed while I can only grit my teeth. If there’s a hurricane, as we are in hurricane season, the believers pray for safety, while I stock cans of Heinz beans and try to remember where I placed my knife with the compass on the handle. They are sure of themselves and grateful for the gift of faith. I am ashamed of my lack of it. 
I don't even have the option to be a proud atheist like Bill Maher or the late great Chris Hitchens. Atheism, to me, is just another form of belief in something greater than oneself, and the only thing that's greater than myself, the only way I excel, is by realizing just how much my head is filled with rock and roll facts, figures, theories and lyrics. I’m the Rain Man of pop, with no room for the spirit. It may try to enter me, but it will come up against a wall of British indie. I don’t blame my parents. I blame the Stone Roses for releasing a perfect debut album. I blame all four Beatles, and Bob Marley and Eric B and Rakim and even They Might Be Giants. They've squatted in my soul where faith and belonging might have found a place to blossom. There's even room for terrible music there, but not for God. And even if I had a vacancy, I am pretty sure I would evict it after a while. For this, and I know this is somewhat of a psychopath's cliché, I blame John Lennon (that said, my favorite Salinger book is "Franny and Zooey"). Lennon made me the skeptic I am today. The cynic. The guy who loiters in the used car dealership of soulfulness without ever taking the keys. My father left the family in 1980 to become the ramblin’ gamblin' man that he remains today (I think he's still alive, I haven't seen him in 10 years). The old man was not cut out for a domestic life and a job that kept him from the track. Later that year, my mother woke me up one morning in early December, in tears. I thought she was going to tell me that my dad had died. A loan shark shot him in the belly, maybe? Or a horse broke from the paddock and trampled him. Maybe he was shot in a poker match like Stagger Lee. Instead, she told me that John Lennon, my hands-down favorite Beatle, had been murdered steps from his doorway in Manhattan, about 45 minutes from my house. I’d never heard any solo Beatles songs. I was 11 and still just into a phase where I was collecting each Beatles record, and once you have your own vinyl copy of The White Album, complete with the poster… well, hell, you can spend days just looking at the collage. It takes months for an 11-year-old, even an already pop-savvy one, to fully absorb the double album. But after that, I began collecting Lennon's solo stuff and drawing his name on my white Hanes tees. Soon, I happened to hear the song “God,” from his solo debut, 1970's "Plastic Ono Band." There are incredible songs all over that record: “Mother,” “Isolation,” “Well Well Well,” even “My Mummy’s Dead” (amazing nobody’s ever covered that one). But “God,” with its climactic list of things that John no longer believed in, seemed to fortify all my suspicions and faithlessness in a matter of seconds. Things John did not believe in: Magic. I-Ching. Bible. Tarot. Hitler (a relief). Jesus. Kennedy. Mantra. Gita. Yoga. Kings. Elvis. Zimmerman (Bob Dylan) and the Beatles. He believed in himself and Yoko, that was it. That was “reality,” John promised. God was, according to the lyrics, nothing but a “concept by which we measure our pain.” He said it again for emphasis, but he didn’t need to on my account. Here was the same guy who, in his 20s, sang about love and peace. Now he was cleaning house and it felt righteous to me. It helped me mourn him. But it also left me one suspicious little duck as I came of age. “It’s cool not to believe in anything,” I told myself. I not only renounced my faith (no press release), but I renounced Ringo, Paul and George. John was my guy, perfect, dead, a kind of saint. 
Bob Dylan, who I was discovering at the time as well, was my guy too. Bob sang, “Don’t follow leaders,” when he was a young man. Yeah, fuck those leaders. And watch the parking meters. I defined myself by how much shit I refused to take, from anyone, anywhere, anyhow that I chose. But John didn't even buy into Bob. Lennon was truly a love-and-peace kind of guy, for all his perfect vitriol and skepticism — remember, he dispensed with the Maharishi in a devastating three minutes in the form of “Sexy Sadie,” and did the same to Dylan by answering his Christian-period hit “Gotta Serve Somebody” with the venomous “Serve Yourself.” (“You tell me you found Jesus Christ, well that’s great, and he’s the only one. You say you just found Buddha and he’s sitting on his ass in the sun.”). Joni Mitchell was even better. She didn't even buy into John. "They won't give peace a chance," she sings in "California," on "Blue," "that was just a dream some of us had." Go Joni, raise your eyebrow for me. Ringo was still flashing that insipid peace sign every time he saw a camera and Paul was singing about "Pipes of Peace," and we all know George was following his path, so they had to make way for Joni, and Leonard Cohen, another perfect cynic ("Everybody knows that the dice are loaded..." he sang. Quelle surprise, Lenny). As I got a little bit older, by say 1984, I was whatever you call a white suburban kid who dressed in all black, with a black raincoat and spiky hair. A punk? A Goth? An American iteration of a rain-soaked British indie youth? I was not a skinhead or a Rude Boy or a hippie, I know that. But I could not go full-tilt into punk rock or follow the Dead or even reinvent myself as a B-Boy because that would require allegiance to some kind of mini cultural ethos. Gang of Four loved a man in uniform, but I didn’t trust people in culture drag, whether they were cops or B-Boys. Yes, my raincoat was in its own way Mancunian culture drag, but it felt plain enough that I gave myself a pass. My sister owned tie-dyed t-shirts. Other kids I knew dressed like Michael Stipe. I dressed like a postman from Salford. I was a fan of no one. TV shows would let me down, authors were never as good as people said they were, and Echo and the Bunnymen, U2, Depeche Mode and Siouxsie started to suck the more popular they got. It took a lot for a band (or a writer or a filmmaker, or, you know… a young woman) to penetrate my reinforced, barbed-wire-covered wall of perfect doubt. Only one band got in, really. One in hundreds. If there was a wrench in my perfect I-am-a-rock/island status, it came from my beloved land of raincoats. In 1984 and 1985, The Smiths (Morrissey, Johnny Marr, Mike Joyce and Andy Rourke) seemed to pass right through my wall of bullshit-proofing like vapor, and swirled around me until I was dizzy and swore my first allegiance to a band in the better part of a decade. The Smiths confused me because I couldn't tell if they believed in anything at all or were just as skeptical as I was. Morrissey was contradictory. "My faith in love is still devout," he sang, but earlier, he'd dismissed the same emotion as a "miserable lie." It didn't matter — part of the reason I couldn't take my eye off them (in addition to the fact that they were amazing to just ... observe) was to figure out where they truly stood: on my side of the philosophy, or with the believers? “If it’s not love, then it’s the bomb that will bring us together,” he promised. Did that mean that love was necessary? Or was the bomb just as acceptable? 
("Come, come, nuclear war," he pleaded years later as a solo artist.) By college, I had every chance to be changed. I was happy there (a small liberal arts school in Vermont with a modicum of notoriety, thanks to a certain book by a certain writer) and I was doing enough acid that I could have probably been indoctrinated into the Manson family and died for Charlie, but I was also listening to a lot of indie rock and most of the singers of this indie rock employed  this thing called irony ("does anyone remember irony") as they sang their mostly deadpan lyrics (Camper Van Beethoven and the like). It was no call to arms, this stuff; nothing to take literally. It was safe.  I didn't really have to take any skinheads bowling. I could stay in my perfect bubble and quietly rue anyone who believed in anything at all. Vegans. War protestors (come the Gulf). All causes, good or bad, were bound to be counted out as bullshit before too long. Any mode of spiritual salvation was a money grab, if you asked me then. Three-card monte. Squeegee men. Prophets. They were all the same. By graduation, when I was on the cusp of becoming a published writer/artist myself, Trent Reznor was busy screaming “God is dead and no one cares, if there is a hell, I’ll see you there.” I liked that. No one cared. I didn't care. And yet I was still too much of a wuss to join the Atheists. Trust no one, ran the "X-Files" catch phrase, which I amended to, "not even those who trust no one."   One day in my 40s I asked my shrink what I should do about this problem, because nobody wants to start feeling their mortality without at least some kind of insurance policy for the soul. He suggested that I meditate. He gave me a mantra (a Sanskrit word with literally no meaning, he said) and I attempted it dutifully, but as I sat there, cross-legged on my floor, breathing in and out and reciting this word, my mind began to wander and I started to think about balance, and how it’s dangerous to have too much faith and dangerous to have too little and that nobody really has no faith at all and then I started thinking about "Led Zeppelin 2" and how it's such a great record to play in the fall ("leaves are falling all around...")   After almost 20 years of working in rock and roll it began to occur to me: maybe that alone was what I believed in, and I was facing a real paradox — if you believe in nothing because of rock and roll, doesn’t that really translate into an utterly devout and worshipful relationship with rock and roll itself, one as devoted and unbreakable as any faith bond that's more widely accepted?    After all, no matter where you go or how base you are, it's always there. Bob Seger sang “Rock n’ Roll Never Forgets,” and even more convincingly KISS (via Argent) sang “God Gave Rock 'n’ Roll to You."  Yes, those songs kind of blow (well, Seger has better ones anyway) but the point held water. So I could not be a Rastafarian, a Satanist, a Buddhist. I would be a Rock 'n’ Roll-ist. I search for meaning in the grooves of "Imperial Bedroom" or "Bookends" or "There's A Riot Going On" because I believe there is meaning there. Surely Elvis Costello would frown upon me deifying him, and Tom Waits, too; and if he were alive, Kurt Cobain would be horrified, as would Nick Drake (although it probably didn't take much to horrify that guy), but that's who I am and as I push 50, it's getting high time to just accept it. 
Rock and roll made a skeptic out of me but it also gave me my own private temple, where the B-52s are welcome, as are Matthew Sweet and even The Eagles (actually just Eagles, if you ask them). I never stop trying, of course, to get with the acceptable faiths or bring others around to mine. After we broke up, my ex-girlfriend, another rock writer of some note, got heavily into yoga, and the culture that goes along with it, and it felt like I lost her twice. I started dating another woman who was obsessed with her gluten intake, and when I said "That's all bullshit," I lost her too. It became clear that at this rate, I was destined to end up with nothing but my records. So I began to wonder: is that all bad? Little Richard gave up rock and roll for God, but when I hear his music I hear and feel something that I can only identify as joyful and powerful and holy. And so I played her Little Richard. A lot of it. And some rocksteady. Some Madness. The Selecter. And to her it was just music. Good music. It wasn't sacred. I should have gone with Springsteen first. If I have a point here, it's this: who is to say that the full-body warmth and tingle that takes over when I hear "Led Zeppelin 2" or "Astral Weeks" isn't a kind of small rapture, a little strain of the great surrender that the devout feel? I certainly give myself over to it. It has its way with my mood and my body. Maybe I’m the most spiritual person I know; I just turn to "Revolver" instead of the holy texts. "Revolver" is a holy text. Only one question remains: What is the leap of faith required when you have tried and failed to revere anything but rock and roll on a spiritual level? I guess it’s that much of rock and roll is flawed and derivative, but then, you could say that about all faiths. Atheism, nihilism, any other -ism that rejects these things outright is certainly something to be respected, but if you’re looking and wondering, eventually, you gotta make the leap and hope your entire belief system isn’t shattered by bad EDM or another inscrutable Radiohead album — which could send me right back into the Temple proper. I know it’s there for me as an option, but so is my copy of "Exile on Main Street," which has yet to let me down.

Continue Reading...










Published on September 26, 2015 14:00

Can a thinking person still have faith? My skeptical, honest quest for religious answers

"I do not believe in God and I am not an atheist." -- Camus When I searched for God I found a man who looked like Will Ferrell. This search I’d kept secret. God, these days? Are you nuts? “Think what times these are,” Saul Bellow wrote—a generation ago; it’s worse today. How, against a contemporary background, do you contemplate the almighty? Who believes there’s an oasis in 2015’s scattered metaphysical sand? But I had a reason to search. A few years ago my 3-year-old son couldn’t rise from bed unaided, his walk was a limp. Our pediatrician frowned when breaking the news: Rheumatoid arthritis. And in that shiver of change I glimpsed, for my son, for me, a long line of vanishing possibilities. (The disease was most likely permanent.) And it was the same old dumb supplicant’s story. Gulped tears, inhibition, begging made ritual. Please, please, God—the tonality of fraudulence. Because how could I (secular, religiously illiterate I) talk to God? What could be said that wouldn’t embarrass my intelligence, or His (presuming I actually believed)? This was a new feeling. I’ve been a writer my entire adult life, but I didn’t have any language. At the moment it really mattered, whatever fluency I'd had came pretty much to zilch -- to smoke and cobwebs. And yet now my son was in bad shape. And so I asked God to help him.  And my son got well, in an almost miraculous fashion. It happened in an unlikely way: A stranger’s advice, skeptical doctors, a full recovery. (My wife wrote about it for the Times.)   This ecstatic surprise struck from me a real desire to thank, and to understand, and to believe more fully, in God. I prayed—self-invented, unschooled prayers—and I did nothing more than that. I did nothing, that is, to further my understanding, or even to deepen my thoughts about God, or belief in Him. Two years later, my wife found herself — a lapsed Presbyterian — diagnosed with an invasive melanoma. Time, again, to ask for mercy. I began going to temple. Or, I tried to. I'd visit synagogues and hear rabbis and I found all of it off-putting. But I was thirsting after guidance.  That's what it felt like, a need, centered in the throat. This is a universal deal, of course. Trouble = need. As it is with most of us, it is with me. I’m better suited to searching for answers than to the mundane facts of participatory worship. But next to my wife in bed, I felt like an ant colony: So many little black questions in me, hopes and doubts, crawling every which way. And then, somehow, a second time, my prayer was answered. My wife after a succinct surgery was given a clean bill of health. This set in train a search. I needed to find out more about God. Can you have faith today and remain a thinking person? I approached religious leaders: Riverside Church’s first Minister of Social Justice; the president of a Jesuit university; the head rabbi of Brooklyn’s leading reform congregations; a renowned Buddhist, etc.  I wanted only to have a back-and-forth about how to believe — a heavily secular Jew and a number of influential faith-thinkers, hashing out the how of it, discussing whether religion can work in a digital age, to be published in an epistolary way. What follows is the first of these exchanges—my interaction with Erik Kolbell, who looks like a bearded (handsomer) Will Ferrell, and is just as likable. He’s also a brilliant guy—a writer, psychotherapist, Yale Divinity School graduate, and ordained minister, and the first Minister of Social Justice at Riverside Church in New York City. 
(I became aware of him when I saw him on Charlie Rose talking about his inspired book "What Jesus Meant: The Beatitudes and a Meaningful Life.") Kolbell sat waiting for me in the chosen restaurant, a hand-written sign, scrawled in rickety ink, propped on his glass: “ERIK KOLBELL.” When he smiled there was a certain Muppetish softness to the mouth —ah! Here you are, sit, sit— a smile almost comic in its munificence, in its knowledge that munificence is, nowadays, a pretty rare article of trade.

What follows is a transcript of our written correspondence. The questions are mine, the answers are his.

One of the things I was struck by when I went to services—Reform Jewish, Presbyterian and Catholic—was how much time was spent, in elegant ways, asking for things. Generally, these were inoffensive and general things. “Please protect us, oh Lord.” But that gave me pause just the same. How can we—a comfortable New York flock, for example—ask for anything, when we know there are so many who have it worse than we do?

I share your quandary on this issue, Darin, not only because it invites selfishness at the expense of compassion, but because it runs the risk of promoting an unhealthy understanding of the relationship between the individual and God. Even asking for innocent things, such as good health, suggests a kind of magical thinking that I don't necessarily subscribe to. If I can pray to God for health, then why not wealth as well? And if so, is God then simply being defined as a dispenser of goods and blessings? A generous grandfather, pockets stuffed with candy, eager to dole it out if only we ask nicely? Rather, by way of prayer, I am fond of the first three words of Psalm 119:36, which read simply "Incline my heart…" For my money, this is what we can justly ask of God: that our consciousness of God is such that our hearts are inclined toward those things that promote personal integration, deepened faith, human justice and universal mercy. In answer, then, to the question you pose below, with the Bellow quote, I believe prayer is the endeavor on the part of the individual (or the community) to dispose ourselves to a deepening consciousness of what God might want of us in order that these things — integration, faith, justice and mercy — are made manifest in our lives.

So, then, you believe the Almighty wants something from each of us? Which leads to a bigger question: To what degree do you think God is involved in every person's life?

I am hesitant to impute any human characteristics to God, so even the term "want" makes me a little jittery. As one theologian put it, "To say that God is love is really to say 'I experience what I call love from what I call God.'" It is the subjective experience and articulation of the objective reality of God; the infinite heavily filtered through layers of finitude. With this in mind, I would argue that our highest calling is to ascertain what it means to live a full and compassionate life, and then to aspire to live it. We could say that this "pleases God" or we could say that it puts us in company with the manifestation of the ultimate good (or Good). As to your second point, I do not believe in a Grand Manipulator who sees to it that this or that team wins the big game, this or that town is spared the wrath of the tornado, this child lives and the other dies as a result of the same illness.
I do believe that we can effect both good and ill on earth, and, as pertains to the question of inexplicable and arbitrary suffering, while we cannot explain it (to do so is to demean it) we can redeem it. And redemption is a holy task.

Much of the rest of the services was spent finding different ways to praise the Almighty—and the Almighty asking us for praise. Why would a divinity, by definition majestic, and the author of all the splendor and complex genius of creation, be hungry for so much praise? That seems to underestimate God—cheapen Him, make Him smaller with insecurity. As does the subject of my first question. I mean, it seems organized religion too often sees God as a praise-hungry personal assistant: someone who spends all His time doing chores for people and wanting credit for it.

The flip side of asking God for things is praising Him for the riches we enjoy. If the praise is at God's bidding then we are indeed cheapening (not to mention anthropomorphizing) God. But if some hymns, prayers, etc., are simply meant to be communal expressions of appreciation for the gift that is life itself, then they serve to keep us apprised of the difference between gratitude and entitlement. To the point of loving God for His sake, I think I mentioned to you my problem with heaven and hell. If we see them as rewards and punishments then they serve to arrest our moral development. Any time a reward is attached to a moral gesture the gesture becomes cheapened. Charity — caritas — is reduced to a quid pro quo. There is nothing inherently moral about feeding the poor and visiting the sick if my motivation is ultimately selfish. Better that I do it simply because it must be done. As La Rochefoucauld put it, "We would frequently be ashamed of our good deeds if people saw all of the motives that produced them." And compare it to Tolstoy: "It is much better to do good in a way that no one knows anything about it."

I want to ask you about a Wright Morris quote: “The purpose of religion, quite simply, is to dispense with the problem of death.” You mentioned your doubt about heaven and hell. Do you think there's a life after death -- or, more to the point, is it imperative that religion have an answer?

I think there are many believers who are agnostic about the question of an "afterlife," just as there are more than a few atheists who are similarly agnostic. We just don't know, now, do we? I am reminded of an old story of a rabbi talking to an unborn baby and telling the baby there is a magnificent world awaiting her. The baby responds: "Here's what I know. I know that I have every need met exactly where I am. I know that I have food, shelter, and comfort. What I don't know is what, if anything, lies at the other end of that tunnel. I don't know, because I have no empirical experience of it." This is all to say that I think it a naïve hope to believe in a chunk of celestial real estate with winged angels and lilting harps. (Notice that nowhere in the Bible is there any real description of what the afterlife looks like.) But I also think it betrays a kind of arrogance to argue that the only reality that exists is the reality of our own perceptual experience, that perception gleaned and sorted out by the 10 percent of the brain we actually make full use of.
I am wide open to the possibility of a "beyond," which, again, we freight when we use language like "life after death," simply because it can suggest that whatever might lie beyond in some way resembles what we understand when we use the word "life." Think Timothy Leary, at the very least.

Published on September 26, 2015 12:30

Inside the “Stonewall” catastrophe: A dull, miscast, misguided, bloated, schmaltzy and shlocky disaster of a movie

"Stonewall," the newly released movie about the 1969 rebellion that launched the modern gay liberation movement, is so bad it's almost baffling. It seems beyond comprehension that people could take such an electric piece of history and make something this dull, miscast, misguided, badly written, bloated, schmaltzy and shlocky out of it, but director Roland Emmerich and screenwriter Jon Robin Baitz have managed it.

Almost everything about "Stonewall" is terrible. You watch it alternately cringing and howling; the sparsely attended screening I went to was so rocked with laughter that you would have thought we were seeing the comedy of the year.

"Stonewall" caused a great deal of controversy before anyone had seen it, thanks to a trailer that pushed the real-life heroes of the riot—namely, trans people, drag queens, people of color and women—to the side in favor of a made-up white Midwesterner named Danny (played by Jeremy Irvine). Emmerich and Baitz defended themselves, saying that the trailer wasn't representative of the whole movie. Emmerich also said explicitly that the focus on Danny was a way to win straight people over--a strange goal if ever there was one, but a goal that underscores the perverse incentives of Hollywood as well as anything could. Emmerich and Baitz clearly don't know what kind of movie they've made. The representation issues in "Stonewall" are very real, and very glaring—yet another example of the film industry's insistence on pushing white characters to the forefront of its stories, even if they don't deserve to be there. It is Danny who throws the brick that launches the riots, Danny whose cry of "gay power" ignites the crowd, Danny who leads his people into battle.

The filmmakers, stuck in a Hollywood bubble, seemed surprised that anyone would have a problem with this. A better rejoinder would have been to make a good movie. But almost every inch of "Stonewall" is wrong.

First, it's so, so long. You feel every one of its 129 minutes, as if painful shards of boredom were breaking your mind apart.

The writing is off-the-charts awful. It's as though Baitz reached into a bag marked "clichés," pulled some out at random and pasted them into the script. This is a movie where someone actually says, "Those kids, they've got nothing left to lose." Someone says, "I just want to break things!" about 40 minutes before he goes on to break things. Another character actually declares, "I'm too mad to love anybody right now."

The movie is shot through strange, grimy filters, as though Emmerich were trying to project some of the seediness of the Village through the lens. It just means that you have to squint a bit to see anything clearly. There are also anachronisms so glaring you wonder how they got through the editing process. In one sequence, Stonewall patrons dance to "I'll Take You There," a song that didn't come out until 1972.

The biggest problem with "Stonewall," though, is that it's not actually about Stonewall. Any real attempt to explore the politics behind the rebellion is cast aside in favor of creaky soap opera. For some reason, "Stonewall" thinks that what we really need is lots and lots of Danny, the sensitive Indiana boy who rolls into the Village and proceeds to learn a host of life lessons from the assorted rainbow coalition of queer ruffians whose main job in life is to worry about Danny's feelings.

We spend what feels like 17 years on Danny's past in Indiana--his doomed affair with another boy, his awful father's rejection, his plucky kid sister's tearful cries as he leaves the small town life behind forever. It's all so weirdly retro--Emmerich and Baitz have crafted a melodrama as hoary and sudsy as anything made in 1925, let alone 2015. It would all be deliciously kitschy if you didn't sense that all involved thought they were making a profound masterpiece.

This sense of antiquated staginess continues when Danny lands in New York. He immediately meets a group of Lost Boys (and, despite their gender-bending, this is decidedly a boys' movie--women might as well not exist) who, in their theatrical chatter, are more "West Side Story" than anything.

Their leader is Ray, a waifish Puerto Rican hustler. It's been a long time since I've seen an actor and character so thoroughly mistreated by a movie as Ray is by "Stonewall."  Actor Jonny Beauchamp brings an appealingly aggressive energy to the part, but he's fighting a losing battle with the script. "Stonewall" is more interested in whether Danny will dance with Ray than with what caused such a seismic event as the eponymous struggle to take place. Ray's default mode is an anguished screech. He spends the whole movie wailing hysterically about why Danny won't love him, and why the world is so down on him. That the main character of color's primary function is to moon over the stupid white kid is galling enough. That the character is supposed to be based on trans pioneer Sylvia Rivera makes it all the worse.

Ray's not the only whiner, though. Everybody in "Stonewall" whines all the time. Irvine, a Brit who brings little to the central role of Danny beyond his looks, has an especially difficult time with all of the mewling, as it exposes just how shaky his American accent is. This is a big problem, because Danny has a lot to complain about. He's thrust into the sort of cautionary tale about what happens when small-town boys go astray that would not be out of place in a conservative movie from the 1950s. You can almost see the trailer: Gasp as a destitute Danny is forced to turn his first trick! (The camera closes in hilariously on his crotch as the strings of doom pierce the soundtrack.) Sympathize as he is led astray by Trevor, a liberal sleaze who just wishes those kids would stop being so angry! (Let's all light a candle for Jonathan Rhys Meyers, lampooned in a thankless role.) Feel the terror when Danny is pimped out to a sadistic, old, cross-dressing queen! (This sequence features the kind of horror-movie gay gorgons that you thought had been left behind long ago.)

So much energy is expended on this bilge that, when the riot actually comes, it comes essentially out of nowhere. Emmerich and Baitz have barely bothered to lay any groundwork for the ostensible center of their film. Stonewall itself is a blip; the real story is whether or not Danny will ever reconcile with his family and make it to Columbia like he dreamed of.

Obviously, there is a great movie to be made about Stonewall. Just as obviously, "Stonewall" is not that movie. Maybe someone will be inspired by the magnitude of its failure to make the kind of film that Stonewall deserves.

Oh, and if you want to see a wonderful movie about a gay boy's coming-of-age against a real-life political backdrop, watch "Pride." It is as good as "Stonewall" is bad.

Published on September 26, 2015 11:00

The Milky Way’s missing mass has been partially found

Scientific American

Giant galaxies such as the Milky Way and Andromeda consist mostly of exotic dark matter. But even our galaxy's ordinary material presents a puzzle, since most of it is missing and remains undiscovered by scientists. Now, however, by watching a galaxy plow through the Milky Way's outskirts, astronomers have estimated the amount of gas surrounding our galaxy's bright disk, finding that this material outweighs all of the interstellar gas and dust in our part of the Milky Way.

Measurements of the cosmic microwave background—the big bang's afterglow—indicate that one sixth of all matter in the universe is ordinary, or baryonic, containing protons and neutrons (or "baryons" in the parlance of physicists), just as stars, planets and people do. Based on the motion of distant objects orbiting the Milky Way, astronomers estimate that our galaxy is roughly a trillion times as massive as the sun. If five sixths of this material is dark matter, then this exotic substance makes up 830 billion solar masses of our galaxy; baryonic matter should account for the remaining 170 billion. The trouble is, all of our galaxy’s known stars and interstellar matter add up to only about 60 billion solar masses: 50 billion in stars and 10 billion in interstellar gas and dust. (The Milky Way has more than 100 billion stars, but most are smaller than the sun.) That leaves a whopping 110 billion solar masses of ordinary material unaccounted for. If the Milky Way is even more massive than currently estimated, this missing baryon problem gets worse—and the same conundrum afflicts other giant galaxies as well.

Where are the missing baryons? Perhaps in a diffuse gaseous halo around the Milky Way. X-ray satellites have detected oxygen atoms in our galaxy that have lost most of their eight electrons, a sign they inhabit gas that is millions of degrees hot—far hotter than the surface of the sun. But since we don’t know how far these fried oxygen atoms are from us, we can’t accurately gauge the size of this component of the galaxy. If they're fairly close to the disk, then this so-called circumgalactic medium isn't extensive and therefore doesn't amount to much. But if they're far away, spread throughout a gargantuan halo, this gaseous material could outweigh all of the galaxy's stars, providing fuel for star formation for billions of years to come.

Fortunately for astronomers, the Milky Way is so mighty that it rules a retinue of smaller galaxies that revolve around it just as moons orbit a planet. The most splendiferous satellite galaxy is the Large Magellanic Cloud, shining 160,000 light-years from Earth. Like all the other galactic satellites, this one moves around the Milky Way, but unlike most of its peers, it abounds with gas, which gets stripped as it rams into the halo's own gas. The amount of gas lost depends on the speed at which our neighbor moves and how dense the halo gas is. And that density can yield a mass estimate for the halo's gas.

Recently, the Hubble Space Telescope measured the Large Magellanic Cloud's speed. This allowed astronomers Munier Salem of Columbia University, Gurtina Besla of the University of Arizona and their colleagues to study the stripped gas and estimate that the gas density in the Milky Way's halo near the Large Magellanic Cloud is 0.0001 atoms per cubic centimeter. That's not much—only about 10,000 times more tenuous than the interstellar gas in the Milky Way's disk—but the halo covers a lot of real estate.
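To keep the bookkeeping above straight, here is a minimal tally in Python of the Milky Way's baryon budget, using only the round numbers quoted in the article; it is a sketch of the arithmetic, not a new calculation.

# Back-of-the-envelope Milky Way baryon budget, tallying the round numbers
# quoted in the article (all masses in solar masses).
total_mass = 1.0e12           # rough total mass of the Milky Way
dark_fraction = 5.0 / 6.0     # share of matter expected to be dark
baryons_expected = (1.0 - dark_fraction) * total_mass   # ~1.7e11 ("170 billion")
stars = 5.0e10                # known stars ("50 billion")
gas_and_dust = 1.0e10         # interstellar gas and dust ("10 billion")
baryons_known = stars + gas_and_dust                    # ~6e10 ("60 billion")
missing = baryons_expected - baryons_known              # ~1.1e11 ("110 billion")
print(f"Expected baryons: {baryons_expected:.2e} solar masses")
print(f"Known baryons:    {baryons_known:.2e} solar masses")
print(f"Missing baryons:  {missing:.2e} solar masses")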
In research submitted for publication in The Astrophysical Journal, the astronomers assume that the gas density declines with distance from the Milky Way's center, and calculate that the gas adds up to 26 billion solar masses, or close to half the amount in all of the Milky Way's stars. Matthew Miller, a graduate student at the University of Michigan who is completing his dissertation on the circumgalactic medium, says this number corresponds with previous estimates but is based on a more direct measurement of the density.

Still, the newly calculated halo gas mass makes up just 15 percent of the Milky Way's expected baryonic content. Besla says the true quantity of the halo gas is probably greater because its density may decline less with distance than standard models assume. Miller suspects the missing baryons may be absent from the Milky Way altogether, having never fallen into our galaxy with the dark matter, in which case they are drifting in the vast space between giant galaxies.

Besla predicts that future work may yield a better measurement. Another gas-rich galaxy—the Small Magellanic Cloud, 200,000 light-years from Earth—orbits the Large Magellanic Cloud. Their dance has spilled gas into a stream more than half a million light-years long. Most of this Magellanic Stream extends beyond the Large Magellanic Cloud and thus should probe the halo's gas density elsewhere, Besla says, further constraining the mass of the circumgalactic medium. Indeed, astronomers on Earth are lucky: They inhabit one of the few giant galaxies boasting two nearby gas-rich satellite galaxies. "It's amazing how much information this system provides us," Besla says. In contrast, all satellites orbiting a more typical giant galaxy have run out of gas, and any astronomers there may look upon their peers in the Milky Way with quiet envy.
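The estimate described above amounts to adopting a density profile for the halo gas, anchoring it to the density measured near the Large Magellanic Cloud, and integrating over the halo's volume. The Python sketch below shows the shape of that calculation: the 0.0001 atoms per cubic centimeter figure comes from the article, but the power-law slope, outer radius, anchor distance and mean particle mass are illustrative assumptions of mine, not the values Salem, Besla and colleagues used, so the output is an order-of-magnitude figure rather than their published 26 billion solar masses.

# Illustrative sketch: integrate an assumed halo gas density profile to get a mass.
# The profile n(r) = n0 * (r / r0)**(-beta) and the parameters below are assumptions
# for demonstration, not the model adopted in the published study.
import numpy as np
KPC_CM = 3.086e21        # centimeters per kiloparsec
M_PROTON_G = 1.673e-24   # proton mass in grams
M_SUN_G = 1.989e33       # solar mass in grams
n0 = 1.0e-4              # atoms per cubic cm near the LMC (value quoted in the article)
r0 = 50.0 * KPC_CM       # assumed distance of the LMC from the Galactic center
beta = 1.5               # assumed slope of the density decline
r_out = 250.0 * KPC_CM   # assumed outer edge of the halo
mu = 1.3                 # assumed mean mass per particle, in proton masses
r = np.linspace(0.1 * KPC_CM, r_out, 200_000)
shell = 4.0 * np.pi * r**2 * n0 * (r / r0) ** (-beta) * mu * M_PROTON_G  # grams per cm of radius
mass_g = np.sum(0.5 * (shell[1:] + shell[:-1]) * np.diff(r))             # trapezoid rule
print(f"Halo gas mass under these assumptions: {mass_g / M_SUN_G:.1e} solar masses")

The answer is very sensitive to the assumed slope and outer radius, which is the point Besla makes above: if the density declines less steeply with distance than standard models assume, the total halo gas mass climbs quickly.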

Published on September 26, 2015 10:00

John Boehner’s tears and Pope Francis’ radical challenge: A spiritual leader rises as a political nonentity falls

Whether Pope Francis can bring meaningful change to the Roman Catholic Church, and establish a new global role for that venerable, tarnished and internally divided institution, remains to be seen. But some things can be perceived clearly amid the gushing media coverage of the pontiff’s triumphant American visit, and the global lovefest that continues to surround his papacy.

First of all, the semiotics and messaging of the Vatican are enormously altered under Francis – especially compared to his ghoulish predecessor, Benedict XVI, who openly longed for a shrunken congregation of zealots, eunuchs and old ladies. Words and attitudes matter, especially the words and attitudes of a man believed by his followers to be infused with the Holy Spirit, and who traces his authority (at least nominally) all the way back to a fisherman who was given a new name by Jesus Christ.

You don’t have to believe any of that stuff, needless to say, in order to grasp the importance of what Pope Francis demonstrated in Washington and New York this week: A spiritual leader can still play a powerful unifying role in the secular world, if only momentarily or temporarily, in a way that political leaders hardly ever can and hardly ever do, especially in the poisoned ideological climate of the United States.

Those ideological toxins propelled Republicans and Democrats out onto Capitol Hill after Pope Francis’ address to Congress, where they assured the cameras that they had heard two entirely different speeches and that the pontiff was really on their side. But the fact remains that the GOP leadership and the most virulent of Tea Party legislators were compelled to listen respectfully while Francis called for an end to the death penalty, the international arms trade and epidemic homelessness, and challenged the world’s supposed superpower to address the global climate crisis, welcome immigrants and combat the “cycle of poverty” that accompanies “the creation and distribution of wealth.” (Did he intentionally omit the sentence arguing that politics “cannot be a slave to the economy and finance”? We will never know for sure.)

It may be unduly optimistic to hope that the pope’s speech provoked some genuine introspection among the entrenched antagonists of the Beltway, many of them so stuffed with lobbyist cash and high on shutdown fervor they have entirely forgotten that the outside world exists. Mitch McConnell is not big on introspection. If nothing else, Francis shamed the permanent paralysis and philosophical nullity of America’s political caste in the eyes of the world, which is no small accomplishment.

Were those really tears of joy John Boehner was crying? Or was the unnaturally hued House Speaker reflecting on the fact that his political career was about to end in abject failure? If we compare Francis’ brief papacy to the tragic and/or pathetic tale of Boehner, who has now been brought down by right-wing revolt after five years of nothingness, the differences are both illuminating and disturbing. Both men are constrained to a large degree by circumstances and institutions they cannot control, though the pope has the advantage of wielding nearly unlimited power, untrammeled by democracy. There is certainly intense political struggle inside the Vatican, and some ultramontane Catholics would cheer at news of Pope Francis’ downfall, as the crowd at the Values Voter Summit cheered when Marco Rubio told them that Boehner was toast.
But those in the church hierarchy who hate Francis can’t simply vote him out; they would actually have to murder him, which has happened on numerous occasions but is more difficult to pull off these days. (We will set aside conspiracy theories about the 33-day reign of John Paul I in 1978.)

Still, it’s fair to say that one of those men has made an effort to step outside the internal politics of his institution, and to view it in terms of its global and historic mission. The other has been entirely devoured by political infighting, and never had any larger vision or sense of purpose in the first place. He will go down in history as the answer to an especially devious trivia question: Who was the Speaker of the House during its least functional era since the Civil War?

Taken entirely on its own terms, the Catholic Church is supposed to play a larger and more important role in the world than acting as the enforcers of an outmoded sexual morality or as a support system for tyrants and dictators. It’s supposed to be the leading exponent of the Gospel of Jesus Christ, a capacious and contradictory task that most popes of this century and the last one have studded with asterisks and steadily defined downward. Whether you like him or not, Francis is endeavoring to recapture that sense of larger mission.

Somewhere inside John Boehner’s brain, behind the scores of last Saturday’s golf foursome and the siren song of that bottle of Highland single-malt in his desk drawer, there may persist some awareness that the United States Congress was supposed to serve a higher purpose too. By all accounts Boehner is a decent guy and a Midwestern small-town success story, who wanted to play the traditional role of compromiser and back-room dealmaker. He may once have read about the discussions that led to James Madison’s Virginia Plan and the bicameral Congress created in the Constitution. The House of Representatives is supposed to be the most direct arena for the reflection of popular opinion, and the driving force of policy change. It’s where stuff is supposed to get done. Looking back at Boehner’s soon-to-be-forgotten tenure, we can only conclude that either that is no longer possible or he was not the man for the job (and quite likely both).

Of course we should not succumb too readily to Francis fervor, which presents a seductive but dangerous trap to many Catholics, former Catholics and “ethnic Catholics” like me. (I was never confirmed, but my father, my grandparents and many Irish generations before them were all Catholic, and I cannot deny a residual affiliation.) For many people, not all of them believers or even theists, Francis represents the possibility of spiritual renewal, a yearning that lies deep in human history and consciousness. To Catholics, he recalls the church of John XXIII and Vatican II, the church of Latin American “liberation theology,” the church that led Bobby Kennedy to get down on his knees in a California lettuce field in his Park Avenue suit, receiving Communion alongside Cesar Chavez and a team of immigrant farm workers. Pope Francis is those things and is not those things; his positioning is highly calculated and politically astute.

First of all, the new pope’s words and actions are obviously limited by his church’s tormented history and its encrusted dogmas. Francis is not going to revoke priestly celibacy, overturn the ban on contraception or reverse centuries of teaching on homosexuality overnight, and quite likely does not want to.
He is not pro-choice or “pro-gay” or a feminist; he simply does not want to be held hostage by issues that make the church appear irrelevant. His canonization of the 18th-century Spanish missionary Junípero Serra, viewed by many Native Americans as a genocidal invader who enslaved their ancestors and destroyed their cultures, was at the very least a significant P.R. blunder for a pope who prides himself on speaking for the downtrodden and the oppressed.

All that said, it’s not fair to dismiss Francis’ loving and inclusive rhetoric, or his refusal to pronounce judgment against social forces and political movements repeatedly demonized by previous popes, as nothing more than lipstick applied to an enormous pig. Some activists in the Catholic Worker Movement apparently felt dissed by the pope’s brief reference to Dorothy Day, the charismatic and controversial co-founder of that most radical of all Christian social-justice organizations, in his Thursday address before Congress. I see their point, sort of: Francis pulled something of an MLK-style rebranding on Day, praising her for the strength of her faith and the inspiration she drew from the Gospel and the lives of the saints.

Francis did not mention that Day fervently despised capitalism and was an ardent pacifist who refused to pay federal income tax, or that Catholic Worker was (and is) essentially an anarchist political movement that understands its communal households as models for nonviolent social revolution. If any self-respecting Republicans in that chamber had actually heard of Day (or could understand the pontiff’s imperfect English), they should have stalked out in outrage. Day was arrested numerous times for direct-action protests on behalf of many different causes, was under FBI surveillance for half a century and repeatedly sided with socialist and Communist revolutionaries. She summarized her differences with them this way: Communists wanted to make the poor rich, whereas “the object of Christianity is to make the rich poor.” As







Published on September 26, 2015 09:00

Mentored in the art of manipulation: Donald Trump learned from the master — Roy Cohn

When the country finally ended Sen. Joseph McCarthy’s secular inquisition—“Have you no sense of decency, sir?” asked Army counsel Joseph Welch—his chief aide said that the witch-hunting Republican from Wisconsin had been silenced by his colleagues because he “would not observe the social amenities.” In today’s parlance, Roy Cohn might say that McCarthy suffered because he refused to be politically correct.

McCarthy was such an effective tormentor of the innocent that his name became synonymous with character assassination. He was eventually censured by his colleagues in the Senate.

Disgraced alongside his boss, Cohn departed Washington for his hometown of New York City, where he became the ultimate political fixer and a terror in his own right. If you needed a favor, or wanted to hurt an enemy, Cohn could do the job. He talked like a make-believe mobster and counted real ones among his clients. Having spent years under the shadow of ethics complaints, Cohn lost his license to practice law in 1986, just before he died of HIV/AIDS, a diagnosis he denied. A gay Jewish man who spewed anti-Semitic, homophobic and racist remarks, he was actually quite charming in his way and left behind many friends. Among them were gossip columnists (like McCarthy, Cohn cultivated them) and two men he mentored in the art of manipulation: Roger Stone and Donald Trump.

When Trump was still in his 20s he hired Cohn and began to move in the same circles. Both were members of Le Club, a private hot spot where the rich and famous and social climbers could meet without suffering the presence of ordinary people. Later, when Studio 54 served the glitter and cocaine crowd, Cohn and Trump were there too. Cohn modeled a style for Trump that was one part friendly gossip and one part menace. Cohn looked and sounded like someone who could hurt you if you crossed him. Trump kept a photo of the glowering Cohn so he could show it to those who might be chilled by the idea that this man was his lawyer.

It was Cohn who introduced Trump to a young political operator named Roger Stone in 1979. Stone had cut his teeth in the Nixon campaign of 1972, where he posed as a student socialist who donated to an opponent and then made the contribution public. The fake scandal helped scuttle antiwar Rep. Pete McCloskey’s presidential bid and ensured that Nixon was around to give America three more years of a disastrous war and Watergate.

Brilliant and perpetually aggressive—“attack, attack, attack” is his motto—Stone teamed up with Trump to create an ersatz presidential bid in 1987, and the two have been political partners ever since. Like Cohn, Stone is a risk-taker. He and Trump got caught breaking campaign rules as they fought the development of Indian casinos and state officials levied a hefty fine. Stone counsels clients to “Admit nothing, deny everything, launch counterattack.” He once told a reporter that it was his practice to always, “Get even.” “When somebody screws you,” he added, “screw ‘em back—but a lot harder.”

Trump’s version of the Stone credo, as he told me, is to “hit back 10 times harder” whenever he feels attacked. Like McCarthy and Cohn and Stone, Trump loves to gossip and trade in information. He too cultivates an air of menace to keep his opponents off-guard and he hates to apologize, or back down. And, like Cohn, he insists that the kind of talk his critics consider offensive is really just the truth expressed without the social amenities. This is an ingenious tactic for someone who wants to be free to say almost anything, even if it’s insulting, and get away with it.

Much of what Trump says and does comes straight out of the Cohn/Stone playbook, including his eagerness to make people uncomfortable and confused. As a campaign consultant Stone advises candidates to open multiple battlefronts, and as a source for reporters he often mystifies anyone who seeks to understand what he’s up to. For his part, Trump is a man prone to outrageous statements that defy fact-checking, and our fascination with him stems, at least in part, from the delightful challenge of trying to figure out when he’s serious and when he’s putting us on.

The current state of the Stone-Trump relationship is puzzling indeed. Stone has earned substantial sums for Trump and has always seemed to lurk behind the scenes in his political life. However, his outrageousness can seem like a liability, and in 2008 Trump told Jeffrey Toobin of the New Yorker, “Roger is a stone-cold loser.” He also complained that Stone “always tries taking credit for things he never did.” Trump told me that he finds it easy to cut off those who displease him and that none of those who are banished ever return. Given this stand, it may seem strange that Trump welcomed Stone back into his political circle prior to announcing his candidacy for the GOP nomination. The reunion was short-lived, as the Trump campaign fired Stone in August with an announcement that said he was promoting himself too much. However, Stone insists he resigned before he was fired and he has continued to stump for Trump in the media. He is, like Donald, a true descendant of the McCarthy/Cohn line and perhaps as impossible to fully disown as a member of the family.

Michael D'Antonio is the author of “Never Enough: Donald Trump and the Pursuit of Success” (Sept. 22, 2015; St. Martin's Press/Thomas Dunne Books). As part of a team of journalists from Newsday, he won the Pulitzer Prize for his reporting before going on to write many acclaimed books, including “Atomic Harvest” and “The State Boys Rebellion.” He has also written for Esquire, the New York Times Magazine and Sports Illustrated.

Published on September 26, 2015 08:59