Lily Salter's Blog, page 998

September 27, 2015

“My Brother’s Bomber”: The compelling personal crusade to crack the terror plot behind the 1988 Lockerbie explosion

This week's issue of the New Yorker features an article by Patrick Radden Keefe titled "The Avenger: After three decades, has the brother of a victim of the Lockerbie bombing solved the case?" At first it seems that this lengthy piece scoops, sort of, the Frontline documentary "My Brother's Bomber," which is airing in three parts starting Sept. 29; there is little information in the documentary that isn't covered better in the article. But really, the pieces accompany each other.

Ken Dornstein is the force behind "My Brother's Bomber" — he is its primary investigator and its primary funder — but because he is so close to the tragedy of Lockerbie, he is the force in front of the camera, too. "My Brother's Bomber" is as much about his personal journey to come to terms with his brother's death in 1988 at the hands of terrorists as it is an investigation of who really engineered the crash of Pan Am Flight 103. For the New Yorker, Keefe investigates the investigation; the Frontline documentary is the discussed endpoint of the journey, and the reporter independently analyzes a lot of the footage that Dornstein incorporated into the three-hour series. The result is not exactly two different snapshots of the same story; it's two snapshots that blurrily overlap each other, taken with different exposures.

Literally, snapshots. The promotional art released by Frontline shows Dornstein on a cellphone, looking out the window of a car, the Libyan flag hovering in the background. It's an intimate shot of him in a pensive moment. The photograph accompanying the New Yorker article, meanwhile, is starker. Dornstein looks directly at the camera, almost as if the lens has surprised him at his work. The flashbulb casts a bright, interrogation-room halo on him and the room he's standing in. And the room he's standing in happens to be one of the two rooms in his attic devoted to the investigation, both of which have been turned over into full "Homeland" crazy-wall mode (see: the photo at the top of this story).

The viewer is moved to ponder not just the mysteries of Lockerbie but also what that scale of mystery and tragedy can do to people like Ken, whose older brother died at the age of 25, when Ken was a sophomore in college. "My Brother's Bomber" is just another chapter in the story of Ken coming to terms with the tragedy; in 2006, he wrote "The Boy Who Fell Out of the Sky," about his brother David's unstable charisma and mysterious death. (According to the New York Times, David actually wrote a draft of a "fictional autobiography" for a class: "The story of an unknown young writer who dies in a plane crash, leaving behind a cache of papers and notebooks that the narrator stitches together into the story of the writer's life." That is, of course, exactly what Ken ended up doing.)

The Lockerbie crash of 1988 was not that long ago, but as the documentary demonstrates, in terms of national security it was a palpably different era. The crash was particularly horrible, and the task of finding the perpetrators was particularly complex. The bomb ruptured the fuselage of the plane, which then broke apart midair; it is thought that most of the passengers were alive for the six-mile drop, until they hit the ground. Personal effects and remains were thus remarkably intact: David's passport and the pack of cigarettes he was carrying were returned to his family.

As Dornstein tells Keefe, the bodies of the children on the plane were found farther from the crash site because the wind carried their smaller bodies farther. The U.K. investigated the bombing, and the U.S. sent agents, but this terrorist act caught them flat-footed, and the additional problems of a divided Germany, a fading Cold War, and the iron fist of Muammar Gadhafi in Libya made finding the terrorists a delayed and ultimately thwarted endeavor. The New Yorker story relates how a fragment the size of a thumbnail—found miles away from the crash site—ended up leading investigators to the make and model of the timer attached to the bomb.

The deceased David, shown in old footage woven into the documentary, is ebullient, creative, larger than life. Ken Dornstein, by contrast, is a reserved man, one who can recede into the background. He honed investigative skills and pursued documentary filmmaking early in his career, with the puzzle of Lockerbie in the back of his mind the whole time. In 2009, the only man imprisoned for conspiring to bring down Flight 103, Abdelbaset al-Megrahi, was released from a Scottish prison on compassionate grounds because he was thought to be dying of prostate cancer. That appears to have been the catalyst for "My Brother's Bomber"—Megrahi was not just alive, but released. Yet Dornstein couldn't travel to Libya while it was under the rule of Gadhafi. In 2011, during the world-changing Arab Spring, Dornstein found his chance; enlisting the help of his contacts and a seasoned crisis-zone filmmaker (Tim Grucza), he crossed the border into Libya in the back of a car.

On premise alone, "My Brother's Bomber" is compelling. In practice, it's even more so, despite being far less clear. Frontline has released only the first episode, saying that "Ken's reporting is still very much unfolding" for the following two episodes, which air Oct. 6 and 13. The first hour has the sensibility of going down a rabbit hole. The cinematography is superb—choppy, layered and of varying fidelity, as cameras go into homes or film from vehicles. The effect mimics Dornstein's lifelong effort to sift through the layers of evidence to find what might be the essential truth of David's death.

In the months of the collapse of Gadhafi's regime, whole neighborhoods of Tripoli stand empty; compounds are ransacked and abandoned. With frightening ease, Dornstein and his crew are able to find troves of interrogation tapes, caches of memos, and the libraries of Gadhafi's inner circle. Dornstein has a list of likely suspects, all of whom were very close to Gadhafi; most are dead or disappeared, and as the regime collapses and another is built, some of the suspected accomplices end up in Libyan prison, awaiting trial.

Much as in Andrew Jarecki's Emmy-winning "The Jinx," the fact-finding mission is the spine for a less bounded and more haunting story. Unlike "The Jinx," though, this story is one of a seemingly inexhaustible fount of grief, as tragedy begets tragedy behind borders and within walls. The political, as we say, is and has always been personal. Ken Dornstein's story is a reminder of just how personal.

Published on September 27, 2015 14:00

“It’s not a women’s problem, it’s a workplace problem”: Anne-Marie Slaughter on the crisis at the heart of the “having it all” problem

Anne-Marie Slaughter was a big name in the field of public policy – former head of the International Legal Studies Program at Harvard Law School, dean of the Woodrow Wilson School at Princeton, and, in 2009, the first woman appointed director of policy planning at the State Department. Yet most of the rest of us hadn't heard of her until her 2012 article in The Atlantic titled "Why Women Still Can't Have It All." That article, in which Slaughter wrote about the difficulty of balancing a high-powered political job with caring for her two sons, reignited a debate about just how far women's current lives still were from whatever feminist dreams many of us had grown up with. In her new book, "Unfinished Business: Women Men Work Family," Slaughter expands the conversation to look at gender roles, the work world generally, and how public policy might help us reframe the issue around care – for children, for the elderly, and for the larger community.

I spoke with Slaughter by phone, asking up front for her understanding if the two 9-year-old boys playing Super Mario on Wii in the basement interrupted us. "Well, if I don't understand that, you're sunk," she laughed.

Your book begins when you decided to leave your position at Princeton and take a position on the staff of Secretary of State Hillary Clinton. What were the challenges you faced when you took that job working far from home, and how did you make the decision to return home from Washington?

Well, it took me about three minutes to decide to take the job! That was not really an issue. I had always wanted to be in government. I tell people now, when your party is in power and you're free, do it. Because when Bill Clinton won, I was getting tenure, and when we won again I had a 2-month-old; in 2000 and 2004 my party didn't win. So it's like, if I'm ever gonna go, I've gotta go. We decided, as I wrote in the book, that [my husband and sons] should stay [in Princeton]. They had no interest in moving. We had just come back from a year in Shanghai, so they'd already been moved for a year. And both Andy and I thought that would be really bad for them, to put them in a strange town with him commuting back to Princeton and me trying to get used to a very high-pressure job.

And so you made the decision that you would commute and go to Washington every week during the work week and come home on the weekends. Obviously, that's a grueling schedule for anybody, but it sounds like in the book you sort of anticipated it was going to be hard on you, and maybe didn't anticipate quite how hard it would be on your kids.

I think that's true. Part of it was, my kids were then 10 and 12; we had not been through the teenage years. Maybe now I would have a different view, but also teenage-hood is different for everybody. Basically, my older son had hit middle school. He'd been in middle school three months when I left. That was a big change. He'd gone from a small elementary school to a much bigger middle school, and he was 12, and it's a difficult time. I was always traveling, even when I was a dean, so I thought this wasn't going to be that different. I had been working, but I'd been working down the street, so I could be pretty fully involved in their lives – I could go to teacher meetings and sports games and that sort of thing. Maybe I should have anticipated it, but I'd always managed to make it work, and I just assumed we would make it work this time.

And it was out of that experience – of realizing how hard it was to make it work – that you wrote the article that appeared in The Atlantic in 2012. Were you surprised at how much conversation and debate it sparked?

Absolutely. And I don't think I said anything that other people hadn't said before. It made an impact in part because I wasn't writing in a women's journal or a feminist journal, I was writing in The Atlantic, and I'm a foreign policy person and I'd had a successful career. But I think mostly it was that I'd caught a generational wave. Mothers and daughters have been debating exactly these issues. And a lot of young women were saying to their mothers, "Hey, I don't want to have to do what you did." Or, "It's just harder than it looks. And I'm not sure how I'm going to do this." And a lot of mothers were saying, as I said in the book, "Well, of course you can do it. This is what we fought for."

So this is the interesting question – that generational issue within feminism. You write about growing up imbued with this feminist promise that women could and should have it all. Do you think that feminism needs to – not that feminism is some monolith –

I was going to say!

But how do you think that women, feminism, all of us, should reframe the conversation in order to get the next generation better prepared to face the challenge of trying to have a family, trying to have a career? Do we need to reset expectations?

Yes. That's really the heart of the book, the hope of the book. Which is that we are at a point, in the second-wave feminist movement, 50 years on, where it is now time to make this a conversation between mothers and fathers and daughters and sons, in which both the parents and all the children say, look, to have the best of what life has to offer, to have a fulfilling, meaningful, purposeful life, there are two sides to that. There's the striving to pursue your own goals and invest in yourself. And then there is the love and connection to others, and investing in them and watching them grow, or caring for those you love. A good life – let's not talk about having it all – a good life has both. But it is hard to combine them, and if one person has a really big job, the other person is going to have to be the anchor in any relationship. I'm not saying nobody can do it – sure, there are people who can do everything for their kids and have a high-powered job. They have, typically, tons of help. My husband says to our sons, "Look, having you has been incredibly important, we think it's the greatest achievement of our lives, we would not have missed it; if you want this, then you're going to have to figure out, with your partner, how, over time, you make room for both." And only when men start thinking about that the same way that women think about it are we going to get to any kind of equality.

There's a passage in the book where you're talking to a young woman shortly after the Atlantic article came out. You're talking with her about equality in the home and being an equal partner, and she sort of wrinkles her nose at the idea of a house husband. Are we still too entrenched in gender roles to have equal partnerships on the home front?

I'd say we are entrenched in gender roles more on the male side than the female side. We have changed dramatically the choices open to women – dramatically. But we have not changed the choices open to men. So a man who is the lead parent is still looked at not all that differently than he was looked at in the '50s and '60s. Even though "Kramer vs. Kramer" came out in 1979, here's this hard-charging guy whose wife leaves, and he's totally incompetent as a parent, and he learns how to be competent, and he's actually a better father in many ways than she is a mother. And that's emasculating? Really? There have been some very brave men who've said, "I want to have a different life than my father. I want to be with my children. I want to be a central figure in their lives." I see the men who have the guts to do that as the same kind of pioneers as the original women who went into offices and were called every sort of name, none of which are printable in Salon –

I think everything is printable in Salon.

Well, they were called ballbusters. They were attacked for being masculine. And they said no. I can be a woman and I can be a CEO. And the other issue is female sexism, or female insistence on these gender roles as much as men, and that's the point of my husband's piece. [Slaughter's husband, Andrew Moravcsik, has an article in The Atlantic titled "Why I Put My Wife's Job First."] We are buying into this idea, too. What I've discovered is that my husband parents really differently than I do, and I don't really love it a lot of the time.

And also, gender roles aside, even in a same-sex marriage, if you're raising a child with another adult person, you're going to have differences.

Yes. Right at the heart of the book are these two friends of mine who are a lesbian couple, and I write about the criteria they use [to decide who is lead parent] – who earns more money, who has a bigger career, who's more ambitious, who wants to be more engaged with the kids, what are their temperaments, all those questions. Same-sex couples are actually leading the way to how all couples should think about these issues.

You had an essay recently in the New York Times, drawn from the book, in which you talk about a work world that is toxic for everybody, and almost impossible to manage for workers who aren't young, healthy, childless, unencumbered by any outside needs or demands. Is that a new thing?

Yes. This is not just me saying this. The number of hours that white-collar workers work has steadily climbed. And that's due to globalization and competition; there are a million reasons why. The point is about focusing on care rather than women – when we focus on women, we just count. And the question is, how many women do you have? When we focus on care, you see something quite different. We used to have a workplace where only guys could work, and now we have a workplace where there are men and women and they're both working full-out, but there's no room for care. That just logically can't work. You'd never have expected the "Mad Men" to be simultaneously getting clients and running home and picking up their kids from school, so how on earth do we essentially liberate women to be in the workplace without recognizing that that work still has to get done? It's not a women's problem, it's a workplace problem.

Right. Even on "Mad Men," whenever Don's daughter inconveniently drops by, he pawns her off on the secretaries. They find a mom substitute right in the office.

Exactly! Exactly.

Speaking of the care economy, as a public policy expert you see a role outside of the family, a larger role of government or business, a societal role to help solve these issues. There's a section in your book about what you call the infrastructure of care – childcare, eldercare and so on. These things all sound terrific, but they also seem highly unlikely in our current political climate to ever happen.

I agree they feel highly unlikely, but I felt very strongly two things. Of the key messages in the book, one is around care, and one is around men, and the third message really is, we can't do this alone. And I really feel this strongly. And that's part of my issue with focusing on women's confidence, or on what women can do – that's great, I'm all for it. But that is never going to fix the system. I guess I'm also saying, enough with "we can't do it." We could do it, if every woman in this country, and a lot of men, said, Look, we cannot be working, although we'd like to be, many of us want to be – I'm not advocating going back to the 1950s – but we can't do it without the same kind of infrastructure other developed countries have.

One last question. Some of the reactions after the Atlantic piece noted that you're educated, privileged, professional and in a very high-powered position, and you're talking about people in high-powered, white-collar jobs. How much do your ideas translate into a blue-collar, working-class context?

That is the biggest difference between the article and the book. In the article I said, point blank, I am a privileged, educated woman writing for other privileged, educated women, in a privileged, educated magazine. I heard from lots of women, women who wrote to me and many of the women I addressed in all these speeches I've been giving, and I really started thinking about how the early feminist movement was always more upper-middle class – I mean, that was the critique of "The Feminine Mystique," too. But Gloria Steinem, at the time she was first speaking, it was a time of social revolution and so she talked about solidarity with unions, and with civil rights there was a more unified sense. And all women had had the common experience of being sex objects. You could be on the factory floor or you could be a secretary, you know, you've been pinched or groped or whatever else. So there was more unity than there is now. And that is, again, why I think it's so much more important to focus on care than to focus on women. As I said, when we focus on women, we start counting. And most of the things we can count are the women at the top: how many CEOs, how many surgeons, how many professors? Yet the majority of minimum-wage workers are women, and two-thirds of shift workers are women. So if we're really going to help women, my point is, when you focus on care then you actually do see links – because you see that high-powered lawyer who decided to go part-time to be home with her kids and is knocked off the track for partnership or managing partner, and you see much more, as I wrote in the Times, far more dramatically, the woman who has to stay home because her kid is sick, and then loses her job. And so this is the frame that I do think makes sense. All these policy solutions will help poor women much more than rich women, because rich women can buy their way out of it.

Published on September 27, 2015 11:00

The military’s secret military: Green Berets, Navy SEALs and the special ops you’ll never know about

You can find them in dusty, sunbaked badlands, moist tropical forests, and the salty spray of third-world littorals. Standing in judgment, buffeted by the rotor wash of a helicopter or sweltering beneath the relentless desert sun, they instruct, yell, and cajole as skinnier men playact under their watchful eyes. In many places, more than their particular brand of camouflage, better boots, and designer gear sets them apart. Their days are scented by stale sweat and gunpowder; their nights are spent in rustic locales or third-world bars.

These men -- and they are mostly men -- belong to an exclusive military fraternity that traces its heritage back to the birth of the nation. Typically, they've spent the better part of a decade as more conventional soldiers, sailors, marines, or airmen before making the cut. They've probably been deployed overseas four to 10 times. The officers are generally approaching their mid-thirties; the enlisted men, their late twenties. They've had more schooling than most in the military. They're likely to be married with a couple of kids. And day after day, they carry out shadowy missions over much of the planet: sometimes covert raids, more often hush-hush training exercises from Chad to Uganda, Bahrain to Saudi Arabia, Albania to Romania, Bangladesh to Sri Lanka, Belize to Uruguay. They belong to the Special Operations forces (SOF), America's most elite troops -- Army Green Berets and Navy SEALs, among others -- and odds are, if you throw a dart at a world map or stop a spinning globe with your index finger and don't hit water, they've been there sometime in 2015.

The Wide World of Special Ops

This year, U.S. Special Operations forces have already deployed to 135 nations, according to Ken McGraw, a spokesman for Special Operations Command (SOCOM). That's roughly 70% of the countries on the planet. Every day, in fact, America's most elite troops are carrying out missions in 80 to 90 nations, practicing night raids or sometimes conducting them for real, engaging in sniper training or sometimes actually gunning down enemies from afar. As part of a global engagement strategy of endless hush-hush operations conducted on every continent but Antarctica, they have now eclipsed the number and range of special ops missions undertaken at the height of the conflicts in Iraq and Afghanistan.

In the waning days of the Bush administration, Special Operations forces (SOF) were reportedly deployed in only about 60 nations around the world. By 2010, according to the Washington Post, that number had swelled to 75. Three years later, it had jumped to 134 nations, "slipping" to 133 last year, before reaching a new record of 135 this summer. This 80% increase over the last five years is indicative of SOCOM's exponential expansion, which first shifted into high gear following the 9/11 attacks.

Special Operations Command's funding, for example, has more than tripled from about $3 billion in 2001 to nearly $10 billion in 2014 "constant dollars," according to the Government Accountability Office (GAO). And this doesn't include funding from the various service branches, which SOCOM estimates at around another $8 billion annually, or other undisclosed sums that the GAO was unable to track. The average number of Special Operations forces deployed overseas has nearly tripled during these same years, while SOCOM more than doubled its personnel from about 33,000 in 2001 to nearly 70,000 now.
Each day, according to SOCOM commander General Joseph Votel, approximately 11,000 special operators are deployed or stationed outside the United States, with many more on standby, ready to respond in the event of an overseas crisis.

"I think a lot of our resources are focused in Iraq and in the Middle East, in Syria for right now. That's really where our head has been," Votel told the Aspen Security Forum in July. Still, he insisted his troops were not "doing anything on the ground in Syria" -- even if they had carried out a night raid there a couple of months before and it was later revealed that they are involved in a covert campaign of drone strikes in that country. "I think we are increasing our focus on Eastern Europe at this time," he added. "At the same time we continue to provide some level of support on South America for Colombia and the other interests that we have down there. And then of course we're engaged out in the Pacific with a lot of our partners, reassuring them and working those relationships and maintaining our presence out there."

In reality, the average percentage of Special Operations forces deployed to the Greater Middle East has decreased in recent years. Back in 2006, 85% of special operators were deployed in support of Central Command or CENTCOM, the geographic combatant command (GCC) that oversees operations in the region. By last year, that number had dropped to 69%, according to GAO figures. Over that same span, Northern Command -- devoted to homeland defense -- held steady at 1%, European Command (EUCOM) doubled its percentage, from 3% to 6%, Pacific Command (PACOM) increased from 7% to 10%, and Southern Command, which oversees Central and South America as well as the Caribbean, inched up from 3% to 4%. The largest increase, however, was in a region conspicuously absent from Votel's rundown of special ops deployments. In 2006, just 1% of the special operators deployed abroad were sent to Africa Command's area of operations. Last year, it was 10%.

Globetrotting is SOCOM's stock in trade and, not coincidentally, it's divided into a collection of planet-girding "sub-unified commands": the self-explanatory SOCAFRICA; SOCEUR, the European contingent; SOCCENT, the sub-unified command of CENTCOM; SOCKOR, which is devoted strictly to Korea; SOCPAC, which covers the rest of the Asia-Pacific region; SOCSOUTH, which conducts missions in Central America, South America, and the Caribbean; SOCNORTH, which is devoted to "homeland defense"; and the ever-itinerant Joint Special Operations Command or JSOC, a clandestine sub-command (formerly headed by Votel) made up of personnel from each service branch, including SEALs, Air Force special tactics airmen, and the Army's Delta Force, that specializes in tracking and killing suspected terrorists.

The elite of the elite in the special ops community, JSOC takes on covert, clandestine, and low-visibility operations in the hottest of hot spots. Some covert ops that have come to light in recent years include a host of Delta Force missions: among them, an operation in May in which members of the elite force killed an Islamic State commander known as Abu Sayyaf during a night raid in Syria; the 2014 release of long-time Taliban prisoner Army Sergeant Bowe Bergdahl; the capture of Ahmed Abu Khattala, a suspect in the 2012 terror attacks in Benghazi, Libya; and the 2013 abduction of Anas al-Libi, an al-Qaeda militant, off a street in that same country.
Similarly, Navy SEALs have, among other operations, carried out successful hostage rescue missions in Afghanistan and Somalia in 2012; a disastrous one in Yemen in 2014; a 2013 kidnap raid in Somalia that went awry; and -- that same year -- a failed evacuation mission in South Sudan in which three SEALs were wounded when their aircraft was hit by small arms fire.

SOCOM's SOF Alphabet Soup

Most deployments have, however, been training missions designed to tutor proxies and forge stronger ties with allies. "Special Operations forces provide individual-level training, unit-level training, and formal classroom training," explains SOCOM's Ken McGraw. "Individual training can be in subjects like basic rifle marksmanship, land navigation, airborne operations, and first aid. They provide unit-level training in subjects like small unit tactics, counterterrorism operations and maritime operations. SOF can also provide formal classroom training in subjects like the military decision-making process or staff planning."

From 2012 to 2014, for instance, Special Operations forces carried out 500 Joint Combined Exchange Training (JCET) missions in as many as 67 countries each year. JCETs are officially devoted to training U.S. forces, but they nonetheless serve as a key facet of SOCOM's global engagement strategy. The missions "foster key military partnerships with foreign militaries, enhance partner-nations' capability to provide for their own defense, and build interoperability between U.S. SOF and partner-nation forces," according to SOCOM's McGraw.

And JCETs are just a fraction of the story. SOCOM carries out many other multinational overseas training operations. According to data from the Office of the Under Secretary of Defense (Comptroller), for example, Special Operations forces conducted 75 training exercises in 30 countries in 2014. The numbers were projected to jump to 98 exercises in 34 countries by the end of this year.

"SOCOM places a premium on international partnerships and building their capacity. Today, SOCOM has persistent partnerships with about 60 countries through our Special Operations Forces Liaison Elements and Joint Planning and Advisory Teams," said SOCOM's Votel at a conference earlier this year, drawing attention to two of the many types of shadowy Special Ops entities that operate overseas. These SOFLEs and JPATs belong to a mind-bending alphabet soup of special ops entities operating around the globe, a jumble of opaque acronyms and stilted abbreviations masking a secret world of clandestine efforts often conducted in the shadows in impoverished lands ruled by problematic regimes. The proliferation of this bewildering SOCOM shorthand -- SOJTFs and CJSOTFs, SOCCEs and SOLEs -- mirrors the relentless expansion of the command, with its signature brand of military speak or milspeak proving as indecipherable to most Americans as its missions are secret from them.

Around the world, you can find Special Operations Joint Task Forces (SOJTFs), Combined Joint Special Operations Task Forces (CJSOTFs), and Joint Special Operations Task Forces (JSOTFs), Theater Special Operations Commands (TSOCs), as well as Special Operations Command and Control Elements (SOCCEs) and Special Operations Liaison Elements (SOLEs).
And that list doesn't even include Special Operations Command Forward (SOC FWD) elements -- small teams which, according to the military, "shape and coordinate special operations forces security cooperation and engagement in support of theater special operations command, geographic combatant command, and country team goals and objectives." Special Operations Command will not divulge the locations or even a simple count of its SOC FWDs for "security reasons." When asked how releasing only the number could imperil security, SOCOM's Ken McGraw was typically opaque. "The information is classified," he responded. "I am not the classification authority for that information so I do not know the specifics of why the information is classified." Open source data suggests, however, that they are clustered in favored black ops stomping grounds, including SOC FWD Pakistan, SOC FWD Yemen, and SOC FWD Lebanon, as well as SOC FWD East Africa, SOC FWD Central Africa, and SOC FWD West Africa.

What's clear is that SOCOM prefers to operate in the shadows while its personnel and missions expand globally to little notice or attention. "The key thing that SOCOM brings to the table is that we are -- we think of ourselves -- as a global force. We support the geographic combatant commanders, but we are not bound by the artificial boundaries that normally define the regional areas in which they operate. So what we try to do is we try to operate across those boundaries," SOCOM's Votel told the Aspen Security Forum.

In one particular blurring of boundaries, Special Operations liaison officers (SOLOs) are embedded in at least 14 key U.S. embassies to assist in advising the special forces of various allied nations. Already operating in Australia, Brazil, Canada, Colombia, El Salvador, France, Israel, Italy, Jordan, Kenya, Poland, Peru, Turkey, and the United Kingdom, the SOLO program is poised, according to Votel, to expand to 40 countries by 2019. The command, and especially JSOC, has also forged close ties with the Central Intelligence Agency, the Federal Bureau of Investigation, and the National Security Agency, among other outfits, through the use of liaison officers and Special Operations Support Teams (SOSTs).

"In today's environment, our effectiveness is directly tied to our ability to operate with domestic and international partners. We, as a joint force, must continue to institutionalize interoperability, integration, and interdependence between conventional forces and special operations forces through doctrine, training, and operational deployments," Votel told the Senate Armed Services Committee this spring. "From working with indigenous forces and local governments to improve local security, to high-risk counterterrorism operations -- SOF are in vital roles performing essential tasks."

SOCOM will not name the 135 countries in which America's most elite forces were deployed this year, let alone disclose the nature of those operations. Most were, undoubtedly, training efforts.
Documents obtained from the Pentagon via the Freedom of Information Act outlining Joint Combined Exchange Training in 2013 offer an indication of what Special Operations forces do on a daily basis and also what skills are deemed necessary for their real-world missions: combat marksmanship, patrolling, weapons training, small unit tactics, special operations in urban terrain, close quarters combat, advanced marksmanship, sniper employment, long-range shooting, deliberate attack, and heavy weapons employment, in addition to combat casualty care, human rights awareness, land navigation, and mission planning, among others.

From Joint Special Operations Task Force-Juniper Shield, which operates in Africa's Trans-Sahara region, and Special Operations Command and Control Element-Horn of Africa, to Army Special Operations Forces Liaison Element-Korea and Combined Joint Special Operations Task Force-Arabian Peninsula, the global growth of SOF missions has been breathtaking. SEALs or Green Berets, Delta Force operators or Air Commandos, they are constantly taking on what Votel likes to call the "nation's most complex, demanding, and high-risk challenges."

These forces carry out operations almost entirely unknown to the American taxpayers who fund them, operations conducted far from the scrutiny of the media or meaningful outside oversight of any kind. Every day, in around 80 or more countries that Special Operations Command will not name, they undertake missions the command refuses to talk about. They exist in a secret world of obtuse acronyms and shadowy efforts, of mystery missions kept secret from the American public, not to mention most of the citizens of the 135 nations where they've been deployed this year.

This summer, when Votel commented that more special ops troops are deployed to more locations and are conducting more operations than at the height of the Afghan and Iraq wars, he drew attention to two conflicts in which those forces played major roles that have not turned out well for the United States. Consider that symbolic of what the bulking up of his command has meant in these years.

"Ultimately, the best indicator of our success will be the success of the [geographic combatant commands]," says the special ops chief, but with U.S. setbacks in Africa Command's area of operations from Mali and Nigeria to Burkina Faso and Cameroon; in Central Command's bailiwick from Iraq and Afghanistan to Yemen and Syria; in the PACOM region vis-à-vis China; and perhaps even in the EUCOM area of operations due to Russia, it's far from clear what successes can be attributed to the ever-expanding secret operations of America's secret military. The special ops commander seems resigned to the very real limitations of what his secretive but much-ballyhooed, highly trained, well-funded, heavily armed operators can do. "We can buy space, we can buy time," says Votel, stressing that SOCOM can "play a very, very key role" in countering "violent extremism," but only up to a point -- and that point seems to fall strikingly short of anything resembling victory or even significant foreign policy success. "Ultimately, you know, problems like we see in Iraq and Syria," he says, "aren't going to be resolved by us."

Published on September 27, 2015 10:00

Ronald Reagan’s “welfare queen” myth: How the Gipper kickstarted the war on the working poor

Welfare’s virtual extinction has gone all but unnoticed by the American public and the press. But also unnoticed by many has been the expansion of other types of help for the poor. Thanks in part to changes made by the George W. Bush administration, more poor individuals claim SNAP than ever before. The State Children’s Health Insurance Program (now called CHIP, minus the “State”) was created in 1997 to expand the availability of public health insurance to millions of lower-income children. More recently, the Affordable Care Act has made health care coverage even more accessible to lower-income adults with and without children. Perhaps most important, a system of tax credits aimed at the working poor, especially those with dependent children, has grown considerably. The most important of these is the Earned Income Tax Credit (EITC). The EITC is refundable, which means that if the amount for which low-income workers are eligible is more than they owe in taxes, they will get a refund for the difference. Low-income working parents often get tax refunds that are far greater than the income taxes withheld from their paychecks during the year. These tax credits provide a significant income boost to low-income parents working a formal job (parents are not eligible if they’re working off the books). Because tax credits like the EITC are viewed by many as being pro-work, they have long enjoyed support from Democrats and Republicans alike. But here’s the catch: only those who are working can claim them. These expansions of aid for the working poor mean that even after a watershed welfare reform, we, as a country, aren’t spending less on poor families than we once did. In fact, we now spend much more. Yet for all this spending, these programs, except for SNAP, have offered little to help two people like Modonna and Brianna during their roughest spells, when Modonna has had no work. To see clearly who the winners and losers are in the new regime, compare Modonna’s situation before and after she lost her job. In 2009, the last year she was employed, her cashier’s salary was probably about $17,500. After taxes, her monthly paycheck would have totaled around $1,325. While she would not have qualified for a penny of welfare, at tax time she could have claimed a refund of about $3,800, all due to refundable tax credits (of course, her employer still would have withheld FICA taxes for Social Security and Medicare, so her income wasn’t totally tax-free). She also would have been entitled to around $160 each month in SNAP benefits. Taken together, the cash and food aid she could have claimed, even when working full-time, would have been in the range of $5,700 per year. The federal government was providing Modonna with a 36 percent pay raise to supplement her low earnings. Now, having lost her job and exhausted her unemployment insurance, Modonna gets nothing from the government at tax time. Despite her dire situation, she can’t get any help with housing costs. So many people are on the waiting list for housing assistance in Chicago that no new applications are being accepted. The only safety net program available to her at present is SNAP, which went from about $160 to $367 a month when her earnings fell to zero. But that difference doesn’t make up for Modonna’s lost wages. Not to mention the fact that SNAP is meant to be used only to purchase food, not to pay the rent, keep the utility company happy, or purchase school supplies. 
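A quick back-of-the-envelope check, sketched here in Python, shows how those figures fit together. The dollar amounts are the ones quoted above; the one assumption added is that the "36 percent pay raise" is measured against Modonna's take-home pay (about $1,325 a month) rather than her gross salary, since that is the only way the quoted numbers line up.

# Back-of-the-envelope check of the figures quoted above (editorial sketch).
# All dollar amounts come from the excerpt; using take-home pay as the
# denominator for the 36 percent figure is an assumption, not the authors' stated method.
take_home = 1_325 * 12        # after-tax pay, about $1,325 a month
tax_credits = 3_800           # refundable credits claimed at tax time
snap_working = 160 * 12       # roughly $160 a month in SNAP while employed

aid_while_working = tax_credits + snap_working
print(aid_while_working)                        # 5720, i.e. "in the range of $5,700 per year"
print(round(aid_while_working / take_home, 2))  # 0.36, the "36 percent pay raise"

snap_jobless = 367 * 12       # SNAP rises to about $367 a month once earnings hit zero
print(snap_jobless)           # 4404, i.e. the "about $4,400" figure that follows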
Thus, as Modonna’s earnings fell from $17,500 to nothing, the annual cash and food stamps she could claim from the government also fell, from $5,700 to $4,400. Welfare pre-1996 style might have provided a lifeline for Modonna as she frantically searched for another job. A welfare check might have kept her and her daughter in their little studio apartment, where they could keep their things, sleep in their own beds, take showers, and prepare meals. It might have made looking for a job easier —paying for a bus pass or a new outfit or hairdo that could help her compete with the many others applying for the same job. But welfare is dead. They just aren’t giving it out anymore. Who killed welfare? You might say that it all started with a charismatic presidential candidate hailing from a state far from Washington, D.C., running during a time of immense change for the country. There was no doubt he had a way with people. It was in the smoothness of his voice and the way he could lock on to someone, even over the TV. Still, he needed an issue that would capture people’s attention. He needed something with curb appeal. In 1976, Ronald Reagan was trying to oust a sitting president in his own party, a none-too-easy task. As he refined his stump speech, he tested out a theme that had worked well when he ran for governor of California and found that it resonated with audiences all across the country: It was time to reform welfare. Over the years, America had expanded its hodgepodge system of programs for the poor again and again. In Reagan’s time, the system was built around Aid to Families with Dependent Children (AFDC), the cash assistance program that was first authorized in 1935, during the depths of the Great Depression. This program offered cash to those who could prove their economic need and demanded little in return. It had no time limits and no mandate that recipients get a job or prove that they were unable to work. As its caseload grew over the years, AFDC came to be viewed by many as a program that rewarded indolence. And by supporting single mothers, it seemed to condone nonmarital childbearing. Perhaps the real question is not why welfare died, but why a program at such odds with American values had lasted as long as it did. In fact, welfare’s birth was a bit of a historical accident. After the Civil War, which had produced a generation of young widowed mothers, many states stepped in with “mother’s aid” programs, which helped widows care for their children in their own homes rather than placing them in orphanages. But during the Great Depression, state coffers ran dry. Aid to Dependent Children (ADC), as the program was first called, was the federal government’s solution to the crisis. Like the earlier state programs, it was based on the assumption that it was best for a widowed mother to raise her children at home. In the grand scheme of things, ADC was a minor footnote in America’s big bang of social welfare legislation in 1935 that created Social Security for the elderly, unemployment insurance for those who lost their jobs through no fault of their own, and other programs to support the needy aged and blind. Its architects saw ADC as a stopgap measure, believing that once male breadwinners began paying in to Social Security, their widows would later be able to claim their deceased husbands’ benefits. Yet ADC didn’t shrink over the years; it grew. 
The federal government slowly began to loosen eligibility restrictions, and a caseload of a few hundred thousand recipients in the late 1930s had expanded to 3.6 million by 1962. Widowed mothers did move on to Social Security. But other single mothers —divorcées and women who had never been married —began to use the program at greater rates. There was wide variation in the amount of support offered across the states. In those with large black populations, such as Mississippi and Alabama, single mothers got nickels and dimes on the dollar of what was provided in largely white states, such as Massachusetts and Minnesota. And since the American public deemed divorced or never-married mothers less deserving than widows, many states initiated practices intended to keep them off the rolls. Poverty rose to the top of the public agenda in the 1960s, in part spurred by the publication of Michael Harrington’s The Other America: Poverty in the United States. Harrington’s 1962 book made a claim that shocked the nation at a time when it was experiencing a period of unprecedented affluence: based on the best available evidence, between 40 million and 50 million Americans—20 to 25 percent of the nation’s population—still lived in poverty, suffering from “inadequate housing, medicine, food, and opportunity.” Shedding light on the lives of the poor from New York to Appalachia to the Deep South, Harrington’s book asked how it was possible that so much poverty existed in a land of such prosperity. It challenged the country to ask what it was prepared to do about it. Prompted in part by the strong public reaction to The Other America, and just weeks after President John F. Kennedy’s assassination, President Lyndon Johnson declared an “unconditional war on poverty in America.” In his 1964 State of the Union address, Johnson lamented that “many Americans live on the outskirts of hope —some because of their poverty, and some because of their color, and all too many because of both.” He charged the country with a new task: to uplift the poor, “to help replace their despair with opportunity.” This at a time when the federal government didn’t yet have an official way to measure whether someone was poor. In his efforts to raise awareness about poverty in America, Johnson launched a series of “poverty tours” via Air Force One, heading to places such as Martin County, Kentucky, where he visited with struggling families and highlighted the plight of the Appalachian poor, whose jobs in the coal mines were rapidly disappearing. A few years later, as Robert F. Kennedy contemplated a run for the presidency, he toured California’s San Joaquin Valley, the Mississippi Delta, and Appalachia to see whether the initial rollout of the War on Poverty programs had made any difference in the human suffering felt there. RFK’s tours were organized in part by his Harvard-educated aide Peter Edelman. (Edelman met his future wife, Marian Wright —later founder of the Children’s Defense Fund—on the Mississippi Delta tour. “She was really smart, and really good-looking,” he later wrote of the event.) Dressed in a dark suit and wearing thick, black-framed glasses, Edelman worked with others on Kennedy’s staff and local officials to schedule visits with families and organize community hearings. In eastern Kentucky, RFK held meetings in such small towns as Whitesburg and Fleming-Neon. 
Neither Edelman nor anyone else involved anticipated the keen interest in the eastern Kentucky trip among members of the press, who were waiting to hear whether Kennedy would run for president. Since the organizers had not secured a bus for the press pool, reporters covering the trip were forced to rent their own vehicles and formed a caravan that spanned thirty or forty cars. Edelman remembers that “by the end of the first day we were three hours behind schedule.” Kennedy’s poverty activism was cut short by his assassination in June 1968. But Johnson’s call to action had fueled an explosion in policy making. More programs targeting poor families were passed as part of Johnson’s Great Society and its War on Poverty than at any other time in American history. Congress made the fledgling Food Stamp Program permanent (although the program grew dramatically during the 1970s under President Richard Nixon) and increased federal funds for school breakfasts and lunches, making them free to children from poor families. Social Security was expanded to better serve the poorest of its claimants, Head Start was born, and new health insurance programs for the poor (Medicaid) and elderly (Medicare) were created. What the War on Poverty did not do was target the cash welfare system (by then renamed Aid to Families with Dependent Children, or AFDC) for expansion. Yet the late 1960s and early 1970s marked the greatest period of caseload growth in the program’s history. Between 1964 and 1976, the number of Americans getting cash assistance through AFDC nearly tripled, from 4.2 million to 11.3 million. This dramatic rise was driven in part by the efforts of the National Welfare Rights Organization (NWRO). A group led by welfare recipients and radical social workers, the NWRO brought poor families to welfare offices to demand aid and put pressure on program administrators to treat applicants fairly. The NWRO was also the impetus behind a series of court decisions in the late 1960s and the 1970s that struck down discriminatory practices that had kept some families over the prior decades off the welfare rolls, particularly those headed by blacks, as well as divorced and never-married mothers. Through “man in the house” rules, state caseworkers had engaged in midnight raids to ensure that recipients had no adult males living in the home. In addition, “suitable home” requirements had enabled caseworkers to exclude applicants if a home visit revealed “disorder.” Some instituted “white glove tests” to ensure “good housekeeping.” An applicant could be denied if the caseworker’s white glove revealed dust on a windowsill or the fireplace mantel. When these practices were struck down, the caseloads grew bigger, and with rising caseloads came rising expenditures. No longer was cash welfare an inconsequential footnote among government programs. It was now a significant commitment of the federal and state governments in its own right. As costs increased, AFDC’s unpopularity only grew. The largest, most representative survey of American attitudes, the General Social Survey, has consistently shown that between 60 and 70 percent of the American public believes that the government is “spending too little on assistance for the poor.” However, if Americans are asked about programs labeled “welfare” in particular, their support for assistance drops considerably. Even President Franklin D. 
Roosevelt claimed that “welfare is a narcotic, a subtle destroyer of the human spirit.” Although there is little evidence to support such a claim, welfare is widely believed to engender dependency. Providing more aid to poor single mothers during the 1960s and 1970s likely reduced their work effort somewhat. But it didn’t lead to the mass exodus from the workforce that the rhetoric of the time often suggested. Sometimes evidence, however, doesn’t stand a chance against a compelling narrative. Americans were suspicious of welfare because they feared that it sapped the able-bodied of their desire to raise themselves up by their own bootstraps. By the mid-1970s, with the country grappling with what seemed like a fundamental societal shift, another reason for wariness toward welfare arose. In 1960, only about 5 percent of births were to unmarried women, consistent with the two previous decades. But then the percentage began to rise at an astonishing pace, doubling by the early 1970s and nearly doubling again over the next decade. A cascade of criticism blamed welfare for this trend. According to this narrative, supporting unwed mothers with public dollars made them more likely to trade in a husband for the dole. Once again, no credible social scientist has ever found evidence that the sharp rise in nonmarital childbearing was driven by welfare. While welfare may have led to a small decrease in the rate of marriage among the poor during those years, it could not begin to explain the skyrocketing numbers of births to unwed women. Yet Americans were primed to buy the story that AFDC, a system that went so against the grain of the self-sufficiency they believed in, was the main culprit in causing the spread of single motherhood. And so it was that Ronald Reagan, preparing his run for the presidency during a period when discontent with this stepchild of the welfare state was particularly high, found an issue with broad appeal and seized on it as a way to differentiate himself from his more moderate opponent. His stump speech soon began to feature the “welfare queen”—a villain who was duping the government in a grand style. Unlike the average American, she wasn’t expected to work or marry. The father or fathers of her offspring were given a pass on the responsibility of caring for the children they sired. The campaign even found a woman who became the symbol of all that was wrong with welfare. In a speech in January 1976, Reagan announced that she “[has] used 80 names, 30 addresses, 15 telephone numbers to collect food stamps, Social Security, veterans benefits for four nonexistent, deceased veteran husbands, as well as welfare. Her tax-free cash income alone has been running $150,000 a year.” As he punctuated the dollar value with just the right intonation, audible gasps could be heard from the crowd. Reagan’s claims were loosely based on a real person. Hailing from Chicago, Linda Taylor was a character as worthy of the big screen as Reagan himself. In a profile in Slate, Josh Levin wrote that in the 1970s alone, “Taylor was investigated for homicide, kidnapping, and baby trafficking.” She was implicated in multiple counts of insurance fraud and had numerous husbands, whom she used and discarded. Without a doubt, she was a real villain. But she was very far from a typical welfare recipient. Although negative racial stereotypes had plagued welfare throughout its existence, the emphasis on race was more widespread and virulent after Reagan turned his focus to the system. 
His welfare queen soon became deeply ingrained in American culture. She was black, decked out in furs, and driving her Cadillac to the welfare office to pick up her check. None of these stereotypes even came close to reflecting reality, particularly in regard to race. It was true that as of the late 1960s and beyond, a disproportionate percentage of blacks participated in AFDC. But there was never a point at which blacks accounted for a majority of recipients. The typical AFDC recipient, even in Reagan’s day, was white. Reagan lost the Republican primary to Ford in 1976 but defeated President Jimmy Carter in 1980. As president, Reagan took a somewhat softer tone, rhetorically portraying the welfare recipient as more of a victim of bad public policy than a villain. Like FDR, President Reagan viewed the poor as caught up in a system that acted like a narcotic. He was buoyed by the work of the libertarian social scientist Charles Murray, whose influential 1984 book Losing Ground argued that social welfare policies had increased long-term poverty. Murray’s logic was simple: Pay women to stay single and have babies, and more of them will do so. Pay them not to work, and you have a double disaster on your hands. Murray laid the blame for continuing high rates of poverty squarely at the feet of the welfare system. By discouraging both work and marriage, the system was ensuring that millions of American women and children remained poor. In his second inaugural address, Reagan argued for Murray’s thesis; his call was to help the poor “escape the spider’s web of dependency.” Despite this grand narrative and call to action, the changes Reagan was able to make to the welfare system were not extensive. The most notable legislative accomplishment of the 1980s was the Family Support Act, a bipartisan effort by conservatives and New Democrats who sought to distance themselves from the tax-and-spend image that was losing them seats in Congress. Arkansas governor Bill Clinton was a leader among the latter group. The act was the most significant attempt to date to put teeth into a work requirement for the welfare poor and to enhance child support enforcement. Those with new requirements imposed upon them were supposed to work at least part-time or to participate in a training program, but there were numerous exemptions. In the end, the program amounted to little more than an unfunded mandate. There was a jobs program with a catchy acronym (JOBS, standing for “job opportunities and basic skills”), but few states took their part seriously, and life changed for only a small fraction of welfare recipients. President Reagan famously quipped that “we waged a war on poverty, and poverty won.” Judged by the size of the welfare rolls, Reagan’s campaign against welfare was at least as futile. By 1988, there were 10.9 million recipients on AFDC, about the same number as when he took office. Four years later, when Reagan’s successor, George H. W. Bush, left office, the welfare caseloads reached 13.8 million —4.5 million adults and their 9.3 million dependent children. How was it that welfare, an immensely unpopular program, could withstand such an offensive? If welfare’s chief nemesis, Ronald Reagan, had failed, who possibly stood a chance? Excerpted from "$2 a Day: Living on Almost Nothing in America" by Kathryn J. Edin and H. Luke Shaefer. Published by Houghton Mifflin Harcourt. Copyright 2015 by Kathryn J. Edin and H. Luke Shaefer. Reprinted with permission of the publisher. All rights reserved.

Published on September 27, 2015 08:59

Flying domestically just got that much more miserable for people in these 4 states

AlterNet Thanks to provisions in the little-known Real ID Act – passed in 2005 – residents of four states will soon be unable to use a regular driver's license to fly, even within the continental United States.

The Department of Homeland Security has named New York, Louisiana, Minnesota, and New Hampshire, along with the territory of American Samoa, as locations whose residents will be required to present an alternative form of identification to fly on commercial airplanes.

Although no official reason has been given for why these states and territories were singled out, it is most likely because their driver's licenses – the traditional form of identification used at airports – don't comply with the federal "Real ID" standards now being enforced. According to Travel and Leisure:

"The new rules will go into effect sometime in 2016 (the exact date has not been announced), and there will be a three-month forgiveness period, during which people with these licenses will be warned that their IDs are no longer valid for flights.

Here’s the breakdown: if you're from one of these states, “acceptable” IDs include passports and passport cards, as well as permanent resident cards, U.S. military ID, and DHS trusted traveler cards such as Global Entry and NEXUS. The TSA will also accept Enhanced Driver’s Licenses, the kind that are currently used to replace passports for travel to and from Canada, Mexico, and the Caribbean. Of the noncompliant states, only New York and Minnesota issue enhanced licenses.

The new DHS enforcement is rooted in the REAL ID Act, passed in 2005 on the recommendation of the 9/11 Commission that the government should “set standards for the issuance of sources of identification, such as driver's licenses,” according to the Department of Homeland Security's brief.

Published on September 27, 2015 08:00

Ben Carson’s great betrayal: How he ignores history in favor of the Republican Party

The Black Freedom Struggle began in America when the first Africans were brought to Florida in 1581. It continued onward through emancipation and reconstruction as black Americans “built a nation under their feet”, resisting chattel slavery, self-manumitting, taking up arms, and then building political and social institutions across the South and the rest of the United States. The Black Freedom Struggle would reach its peak with the Civil Rights Movement and be seared into American public memory with the Great March on Washington for Jobs and Freedom, and iconic speeches by Dr. King and others. The Civil Rights Movement continues today with Black Lives Matter and the centuries-long fight by black and brown folks against police thuggery, for a more equitable society, dignity, and full human rights for all peoples on both sides of the color line. The Black Freedom Struggle inspired other groups—women, gays and lesbians, the differently-abled—in the United States to resist and fight Power. It has also been a source of inspiration for people’s movements around the world. Of course, the individuals who led (and lead) the Black Freedom Struggle are not perfect. They, like all of us, are flawed. Black resistance to white supremacy occasionally (both necessarily and understandably) involved moments of fleeting flirtation with racial chauvinism. And one cannot overlook how political stagecraft and cruel realpolitik tried to erase the leadership role played by gays and lesbians in the Civil Rights Movement--this is a shameful blemish on the radically humanistic and transformative vision of American life offered by that glorious struggle. But in all, the Black Freedom Struggle has been a source of inspiration; black Americans are the moral conscience of a nation. Black America has earned that title even as much as it has been unfairly forced upon it. In that idealized role, black Americans are called to defend the weak against the strong, speak truth to power, and force America to live up to the promise of its democratic creed and vision. This obligation can give strength, clarity of purpose and energy to Black Americans and others who honor that legacy. Being part of a community that is “the miner’s canary” and “moral conscience of a nation” can exact a heavy burden. As such, some black folks have decided that the burden and obligation are too great to carry. Their shoulders are too narrow and weak. Ben Carson, black conservative and 2016 Republican presidential primary candidate, is one such person. Last week, Ben Carson surrendered to xenophobia, nativism, and intolerance when he suggested that Muslims are inherently incapable of being President of the United States because their faith is incompatible with the Constitution. As reported by CNN, in a conversation on Wednesday of this week Carson then suggested:
"I find black Republicans are treated extremely well in the Republican Party. In fact, I don't hear much about being a black Republican," he said Wednesday at an event in Michigan. "I think the Republicans have done a far superior job of getting over racism."
Carson was a Democrat for years, but said he's found the Republican Party to be more welcoming. "When you look at the philosophies of the two parties now, what I have noticed as a black Republican is that Republicans tend to look more at the character of people. And Democrats tend to look more at the color of their skin," he said Wednesday. Ben Carson’s comments are delusional, hypocritical, and vexing. Carson, like many movement conservatives, is a Christian theocrat who wants to weaken the boundaries between church and state in the United States. Carson, like other contemporary American conservatives, fetishizes the Constitution except when he wants to radically alter it: His suggestion that there should be a religious litmus test for office actually violates Article VI. Black Americans are not lockstep or uniform in their political beliefs. Spirited disagreement is central to black American political life. But for Carson to suggest that the Republican Party, with its Birtherism, Southern Strategy of overt and covert racism, and clear examples of “old fashioned” anti-black animus in the Age of Obama, is somehow a force for racial “progress” is an analysis that can only be offered by a person who is possessed of some sort of Stockholm Syndrome or willfully blind to empirical reality. Ben Carson’s pandering to Islamophobia is a violation of the Black Freedom Struggle’s spirit that black folks as unique victims of Power in America have a moral obligation to stand with the weak against the strong. Ultimately, he has rejected the legacy and burden of the Black Freedom Struggle. These are not meritorious acts of radical autonomy or individuality. Rather, they are acts of cowardice and betrayal. But if one rejects the Black Freedom Struggle, what does one replace it with? Black conservatives such as Ben Carson receive head-patting approval from white conservatives. The primary role of black conservatives in the post civil rights era, as I have suggested many times both here at Salon and elsewhere, is to serve as human chaff and a defense shield against claims that white racism exists—and that today’s Republican Party is an organization whose “name brand” is based on mining white racial resentment, rage, and animus. Ben Carson, like Herman Cain before him, Supreme Court Justice Clarence Thomas, and the panoply of black conservatives trotted out on Fox News and elsewhere to excuse-make for white racism, are professional “black best friends” for the Republican Party. Ben Carson’s rejection of the Black Freedom Struggle and public embrace of Islamophobia is also very lucrative. Black conservatives, like women who reject feminism, gays and lesbians who oppose marriage equality, and Hispanics and Latinos who publicly bloviate against “illegal immigrants,” occupy a very lucrative niche in the right-wing media and entertainment apparatus. In the mid- to long-term, Carson’s black conservative hustle will earn him money on the lecture circuit. In the short-term, Carson’s Islamophobia has garnered at least $1 million in donations to his campaign. Betraying the Black Freedom Struggle is both ego-gratifying for black conservatives—they are deemed by the White Right as the “special” or “good” black who is not like the “other ones”—and financially lucrative. 
How do Black conservatives such as Ben Carson and Clarence Thomas, among others, reconcile their rejection of the Black Freedom Struggle with the fact that they, as members of the black elite and professional classes, are direct beneficiaries and products of it? They can imagine themselves as the true holders of the flame who are defending Black America’s “real interests” from trickery and deception by Democrats who want to keep black folks on a “plantation”. This is specious and insulting, of course, as such claims assume that black Americans are stupid, dumb, and unlike white folks, have no ability to make rational political calculations about their own collective self-interest. Contemporary black conservatives could also choose to rewrite the last 70 years or so of history--Republicans are the saviors of black Americans from time immemorial; Democrats are permanent enslavers and Klansmen. In this imagined world, the Civil Rights Movement, and its won-in-blood-and-death victories -- such as the Voting Rights Act -- is somehow no longer needed. Moreover, protections for Black Americans which acknowledge the unique and continuing threat to their right to vote and full citizenship are somehow condescending and infantilizing. This is the logic of Clarence Thomas in his neutering of the Voting Rights and Civil Rights Acts. This betrayal of one of the core tenets of the Black Freedom Struggle is also tacitly and actively endorsed by black conservatives who are members of the Republican Party, because the latter’s strategy and goal for maintaining electoral power in the present and future is to limit the ability of non-whites to vote. My claims here are not at all based on some type of inexorable race essentialism or related fictions of “biological race.” The mantle of the Black Freedom Struggle, the miner’s canary, and the calling to be the moral conscience of a nation, are a function of history, values, political socialization, linked fate, the “blues sensibility”, and the “love principle” that have driven black American freedom and resistance in the United States and elsewhere. Black conservatives in the post-civil-rights era are of that legacy while still having chosen to turn their backs on it. And others like Ben Carson, men and women influenced by radical Christian fundamentalism and cultivated ignorance on the historical and contemporary realities of the color line and American politics, are black conservative Don Quixotes, stuck in a fantasy world, fighting windmills, chimeras, and other enemies that do not exist. In their made-up world, lies and fantasies are more comforting than hard realities and truths. Ben Carson and other black conservatives may have turned their backs on the Black Freedom Struggle — but it still claims them nonetheless.
The Civil Rights Movement continues today with Black Lives Matter and the centuries-long fight by black and brown folks against police thuggery, for a more equitable society, dignity, and full human rights for all peoples on both sides of the color line. The Black Freedom Struggle inspired other groups—women, gays and lesbians, the differently-abled—in the United States to resist and fight Power. It has also been a source of inspiration for people’s movements around the world. Of course, the individuals who led (and lead) the Black Freedom Struggle are not perfect. They, like all of us, are flawed. Black resistance to white supremacy occasionally (both necessarily and understandably) involved moments of fleeting flirtation with racial chauvinism. And one cannot overlook how political stagecraft and cruel realpolitik tried to erase the leadership role played by gays and lesbians in the Civil Rights Movement--this is a shameful blemish on the radically humanistic and transformative vision of American life offered by that glorious struggle. But in all, the Black Freedom Struggle has been a source of inspiration; black Americans are the moral conscience of a nation. Black America has earned that title even as much as it has been unfairly forced upon it. In that idealized role, black Americans are called to defend the weak against the strong, speak truth to power, and force America to live up to the promise of its democratic creed and vision. This obligation can give strength, clarity of purpose and energy to Black Americans and others who honor that legacy. Being part of a community that is “the miner’s canary” and “moral conscience of a nation” can exact a heavy burden. As such, some black folks have decided that the burden and obligation are too great to carry. Their shoulders are too narrow and weak. Ben Carson, black conservative and 2016 Republican presidential primary candidate, is one such person. Last week, Ben Carson surrendered to xenophobia, nativism, and intolerance when he suggested that Muslims are inherently incapable of being President of the United States because their faith is incompatible with the Constitution. As reported by CNN, in a conversation on Wednesday of this week Carson then suggested:
"I find black Republicans are treated extremely well in the Republican Party. In fact, I don't hear much about being a black Republican," he said Wednesday at an event in Michigan. "I think the Republicans have done a far superior job of getting over racism."
Carson was a Democrat for years, but said he's found the Republican Party to be more welcoming. "When you look at the philosophies of the two parties now, what I have noticed as a black Republican is that Republicans tend to look more at the character of people. And Democrats tend to look more at the color of their skin," he said Wednesday.

Ben Carson’s comments are delusional, hypocritical, and vexing. Carson, like many movement conservatives, is a Christian theocrat who wants to weaken the boundaries between church and state in the United States. Carson, like other contemporary American conservatives, fetishizes the Constitution except when he wants to radically alter it: his suggestion that there should be a religious litmus test for office actually violates Article VI. Black Americans are not lockstep or uniform in their political beliefs. Spirited disagreement is central to black American political life. But Carson’s suggestion that the Republican Party, with its Birtherism, its Southern Strategy of overt and covert racism, and its clear examples of “old fashioned” anti-black animus in the Age of Obama, is somehow a force for racial “progress” is an analysis that can only be offered by a person who is either possessed of some sort of Stockholm Syndrome or willfully blind to empirical reality. Ben Carson’s pandering to Islamophobia is a violation of the Black Freedom Struggle’s spirit: black folks, as unique victims of Power in America, have a moral obligation to stand with the weak against the strong. Ultimately, he has rejected the legacy and burden of the Black Freedom Struggle. These are not meritorious acts of radical autonomy or individuality. Rather, they are acts of cowardice and betrayal.

But if one rejects the Black Freedom Struggle, what does one replace it with? Black conservatives such as Ben Carson receive head-patting approval from white conservatives. The primary role of black conservatives in the post-civil-rights era, as I have suggested many times both here at Salon and elsewhere, is to serve as human chaff and a defense shield against claims that white racism exists—and that today’s Republican Party is an organization whose “name brand” is based on mining white racial resentment, rage, and animus. Ben Carson, like Herman Cain before him, Supreme Court Justice Clarence Thomas, and the panoply of black conservatives trotted out on Fox News and elsewhere to excuse-make for white racism, is a professional “black best friend” for the Republican Party.

Ben Carson’s rejection of the Black Freedom Struggle and public embrace of Islamophobia is also very lucrative. Black conservatives, like women who reject feminism, gays and lesbians who oppose marriage equality, and Hispanics and Latinos who publicly bloviate against “illegal immigrants,” occupy a privileged niche in the right-wing media and entertainment apparatus. In the mid- to long-term, Carson’s black conservative hustle will earn him money on the lecture circuit. In the short-term, his Islamophobia has already garnered at least $1 million in donations to his campaign. Betraying the Black Freedom Struggle is both ego-gratifying for black conservatives—they are deemed by the White Right the “special” or “good” black who is not like the “other ones”—and financially lucrative.
How do Black conservatives such as Ben Carson and Clarence Thomas, among others, reconcile their rejection of the Black Freedom Struggle with the fact that they, as members of the black elite and professional classes, are direct beneficiaries and products of it? They can imagine themselves as the true holders of the flame, defending Black America’s “real interests” from trickery and deception by Democrats who want to keep black folks on a “plantation.” This is specious and insulting, of course, as such claims assume that black Americans are stupid and, unlike white folks, have no ability to make rational political calculations about their own collective self-interest. Contemporary black conservatives could also choose to rewrite the last 70 years or so of history--Republicans as the saviors of black Americans since time immemorial; Democrats as permanent enslavers and Klansmen. In this imagined world, the Civil Rights Movement and its won-in-blood-and-death victories -- such as the Voting Rights Act -- are somehow no longer needed. Moreover, protections for Black Americans which acknowledge the unique and continuing threat to their right to vote and full citizenship are somehow condescending and infantilizing. This is the logic of Clarence Thomas in his neutering of the Voting Rights and Civil Rights Acts. This betrayal of one of the core tenets of the Black Freedom Struggle is also tacitly and actively endorsed by black conservatives who are members of the Republican Party, because the latter’s strategy and goal for maintaining electoral power in the present and future is to limit the ability of non-whites to vote.

My claims here are not at all based on some type of inexorable race essentialism or related fictions of “biological race.” The mantle of the Black Freedom Struggle, the miner’s canary, and the calling to be the moral conscience of a nation are a function of the history, values, political socialization, linked fate, the “blues sensibility”, and the “love principle” that have driven black American freedom and resistance in the United States and elsewhere. Black conservatives in the post-civil-rights era are of that legacy even as they have chosen to turn their backs on it. And others like Ben Carson, men and women influenced by radical Christian fundamentalism and cultivated ignorance of the historical and contemporary realities of the color line and American politics, are black conservative Don Quixotes, stuck in a fantasy world, fighting windmills, chimeras, and other enemies that do not exist. In their made-up world, lies and fantasies are more comforting than hard realities and truths. Ben Carson and other black conservatives may have turned their backs on the Black Freedom Struggle — but it claims them nonetheless.

Published on September 27, 2015 07:30

September 26, 2015

The day I said goodbye to a country I could no longer call home

My journey to become a citizen in the United States of America was mercifully short and uneventful. After nearly a decade as a permanent resident I finally filed an application to become a U.S. citizen this spring. I had two reasons.

I was getting tired of the harassment, money and stress involved in applying for visas, documents I needed to go essentially anywhere outside of the United States. 

Friends in countries I wanted to visit always faced the same intricate appeal from me: “Could you please send me your most personal details: copies of your passport, your bank statement, your utility bill —so I can prove to your immigration authorities that I actually know you, that I won’t linger in the country as a freeloader, and although I am only coming for two days and have the return ticket to prove it, I still need your testimony that I really do plan to leave.”

The "supporting documents" for my visa applications to certain European nations were often over 30 pages. 

A visa application is embedded in a framework where the applicant is forever a suspect, someone out to steal money, benefits and rights from lawful citizens of that country unless proven harmless.  Temporarily.  Until your next visa application.

My second reason was perhaps more complicated. It was also the reason I had put off applying for U.S. citizenship—the golden passport that grants you unrestricted, visa-free entry to most nations on Earth—for so long.

I did not want to think too deeply about not being an Indian citizen. 

I spent most of my childhood in India among people who fought various injustices of the Indian State. The tricolor flag only evoked for me histories of dispossession of the Kashmiri people. The nation and its symbols had proven their true worth to my generation on a winter’s day in 1992, when a gang of Hindu militants tore down a 16th-century mosque and riots against Muslims flamed across the country in its aftermath. 

So it was not the nation-state that made me hesitate.

But like many immigrants with complicated relationships to people and places "back home," India, for me, was never only a nation-state. 

It was a place of childhood rains, loves lost, and homes never to be stepped in; like the times they occupied, they were all gone. Taking on American citizenship seemed in a way to sever off, yet again, ties to such dark waters, to such luminous, delicate webs of stories.  So I hesitated.

Then last year Narendra Modi was elected to head the Indian nation-state. My hesitation to adopt the symbols of a nation-state that sent drones to the Middle East ran up against the reality of remaining a citizen of a nation-state now headed by a man who had, at many levels of government, allowed and incited horrific acts of violence against Muslims. 

Unhappily, falteringly, I decided to apply for U.S. citizenship. 

At least I could now stop bothering my friends to vouch for me when I traveled. My family could pass through immigration together without me being plucked off like an exotic but troublesome weed.

My application process was tedious but not too long.  I filed in March and I was called in for an interview in August.  It was a short interview conducted by a young woman who had a pennant from her university’s football team on her wall. 

I passed the interview.  The next step was to be a "loyalty oath ceremony" where I’d be sworn in as a U.S. citizen and get the all-powerful Naturalization Certificate.

These ceremonies are scheduled to maximize the number of people who are in that stage of the process—which is to say that they happen infrequently. So although you are allowed to reschedule, it is not wise to do so.  People’s jobs, access to housing and healthcare, depend on that Naturalization Certificate and no one can afford to dally with the American state. 

The ceremonies happen in one location at a time in any given state and people have to drive to that location no matter where it is that they actually reside. Sometimes these drives are two to three  hours long.  The ceremonies almost always take place on work days, so immigrants have to negotiate with their workplaces to  attend them, organize childcare where needed and pray for good weather, a benign boss and a good neighbor who can pick up the kid from school in their stead. 

 The day of my loyalty ceremony dawned bright and hot.  I was lucky that it was scheduled to be in my own town—about a mile from my home. 

Big electronic signs had been installed on the roads, as during sports events, with the words "Naturalization Ceremony This Way." 

My first clue to what awaited ought to have been that the numerous ushers who were helping out in the parking lot were volunteers from the American Legion.  They were all veterans.

The ceremony was being held in a school gymnasium, temporarily set up with room for a judge, a desk for immigration controls, and decked out with American flags and buntings. 

The ceremony was to start at 2:00 p.m.  Our letters said so. 

One hundred people, their families and friends had come, keeping that time in mind.  Nearly all of us got there by 1:30. 

Two Burmese women sat on either side of me. They spoke very little English, were housecleaners by profession and had driven three hours to be there. They were getting the day off from their company, but it was a day off without pay. They were both paid $7.50 an hour.

We waited. The clock ticked on. It got hotter—it was 90 degrees outside.  The children, the bravest among us, began to cry, complain and voice what we all felt but didn’t dare say: “When can we go home?”

Finally the judge arrived at 3:00 p.m.—a full hour after the scheduled time. He was closely followed by a group of lovely young women, all white as far as I could tell, who, we were told, were a local vocalist group and  were going to provide the entertainment for the event. 

The judge had decided to push the ceremony back for an hour, without informing any of us, simply to fit the schedule of the choir.

He smiled at us—all 100 of us who had driven for hours to get there on time, who had thoughts of our children left behind, who had now waited for nearly two hours for the ceremony to start.

The ceremony began.

Local dignitaries, the mayor, state senators, a representative of the Bar Association all gave speeches. 

All began by congratulating us on this hard journey that we had undertaken to reach this important day, and this promised land of opportunities. They all ended by reminding us of the responsibilities we now had as citizens: to protect and defend the United States and to be a model to our own communities. 

We were told that we were wonderful models already. We, as one speaker elegantly pointed out, “were all dressed nicely and none needed to pull up our pants.”

We were reminded how lucky we were to be there, as the American flag represented “freedom and democracy” all over the world. I did not have the opportunity to ask the man who said that which parts of the world he had traveled to, though I would have been curious to hear his answer.

The vocalists started to sing. In dulcet tones and with beautiful smiles (they all  smiled uniformly throughout the performance) they told us:

From the halls of Montezuma, to the shores of Tripoli
We fight our country's battles, in the air, on land, and sea
First to fight for right and freedom, and to keep our honor clean

Several of us waiting to be made citizens were from Mexico and from North African countries.  The words of the song were a kind reminder as to whose histories mattered today. It certainly was not ours.

The Burmese women sitting on both sides of me fidgeted a bit. They had now sat through nearly 50 minutes of constant talking in English, a language they didn’t speak.  But one of them pulled out a bag of candies and before offering it to her own friend or taking one herself, she offered it to me, the stranger she didn’t know. 

The ceremony ended with the judge telling us, for nearly 20 minutes, how America cared for children and tried its best to provide for every child. My limited knowledge of current affairs told me that more than 16 million children in this country live in poverty. Maybe those figures did not reach the judge over the music of the smiling choir. And so we became citizens.

Did it have to be this way?

Could not the authority figures have entrusted this group of immigrants with the responsibility that James Baldwin had once entrusted to his nephew—“to make America what it must become”? Apparently, contrary to Baldwin’s wish and project, we all had to assimilate to the "burning house". 

I came away from the ceremony with my Naturalization Certificate and a piece of Burmese candy.  One I knew to be useful.  The other was valuable.

Published on September 26, 2015 16:30

Confronted by my own bullsh*t: I wanted to be the voice of nonviolence for my church after George Zimmerman’s acquittal, but all I could do was cry for all my inconsistencies

"Well, friend, want to go to the shooting range with me?” Clayton said, his light brown eyes lighting up mischievously. We were stretching before the CrossFit class Clayton coaches when I mentioned that I’d recently realized he was my “token conservative friend,” the way some people have a “token black friend.” His response was to invite me to, of all things, a gun range. I reached for my toes as my liberal gun-control-advocate self immediately and gleefully replied, “Seriously? Of course I do.” Because politics should never, if at all possible, get in the way of fun. Little did I know that this would be one of several experiences, during what turned out to be the week of the George Zimmerman acquittal, that would make it virtually impossible for me to claim the liberal outrage/moral high ground I would later wish I could maintain, since life and its ambiguities sometimes throw our ideals into crisis. * A few days after his offer, I saw Clayton’s short, muscular frame walking up to my front door carrying a heavy black bag. He was there for a quick gun-safety lesson before we headed to the range, since I had never in my life actually held a handgun. Clayton is a Texan, a Republican, and a Second Amendment enthusiast. But since Clayton has a degree from Texas A&M and has lived part of his life in Saudi Arabia, where his dad was an oil man, he describes himself as a “well-educated and well-traveled redneck.” “There are four things you need to know,” Clayton said, beginning my very first gun-safety lesson. “One, always assume every gun you pick up is loaded. Two, never aim a gun at something you do not intend to destroy. Three, keep your finger off the trigger until you are ready to fire, and four, know your target and what’s beyond it. A gun is basically a paperweight. In and of themselves,” he claimed, “they are only dangerous if people do not follow these rules.” I’m not sure what the statistics are on gun-shaped paperweight deaths, I thought, but I’ll be sure to look that up. “Okay, ready?” Clayton asked. “I have no idea,” I replied. He placed a matte black handgun and a box of ammunition on our kitchen table, and it felt as illicit as if he had just placed a kilo of cocaine or a stack of Hustler magazines on the very surface where we pray and eat our dinners as a family. I tried to ask some intelligent questions. “What kind of gun is this?” “It’s a 40.” Like I had any idea what the hell that meant. “What’s a 9mm? I’ve heard a lot about those.” “This is.” And he lifted his shirt up to show his concealed handgun. “Man, you don’t carry that thing around all the time, do you?” He smiled. “If I’m not in gym shorts or pajamas, yes.” Later, in the firing-range parking lot, filled almost exclusively with pickup trucks, I made the astute observation, “No Obama bumper stickers.” “Weird, huh?” he joked. When I’m someplace cool, say an old cathedral or a hipster ice cream shop, I am sure to check in on Facebook. But not here. Partly because it was Monday morning and Clayton had penciled in our fun shooting date as a “work meeting,” but also because I didn’t want a rash of shit from my friends or parishioners— almost all of whom are liberal— asking if I’d lost my mind or simply been abducted by rednecks. 
As we stepped onto the casing-littered, black rubber-matted floor of the indoor firing range, I was aware of several important points: one, our guns were loaded and intended to destroy the paper target in front of us; two, I should put my finger on the trigger only when I intended to shoot; three, a wall of rubber and concrete was behind my target; and four, I sweat. I knew from hearing gunshots in my neighborhood that guns were loud. And I knew from the movies that there was a kickback when a gun was fired. But, holy shit, was I unprepared for how loud and jolting firing a handgun would be. Or how fun. We shot for about an hour, and after we were done, Clayton told me that I did pretty well, for a first-timer. (Except for when a hot shell casing went down my shirt and I jerked around so mindlessly that he had to reach over and turn the loaded gun in my hand back toward the target, making me feel like a total dumbass. A really, really dangerous dumbass.) But I loved it. I loved it like I love roller coasters and riding a motorcycle: not something I want in my life all the time, but an activity that is fun to do once in a while, that makes me feel like I’m alive and a little bit lethal. “Can we shoot skeet next time?” I asked eagerly as we made our way back to the camo-covered front desk to retrieve our IDs. The whole shop looked like a duck blind. As though if something dangerous or tasty came through the front door, all the young, acne-ridden guys who work there could take it down without danger of being spotted. On the way back to my house, I suggested we stop for pupusas (stuffed Salvadorian corn cakes) so that we both could have a novel experience on a Monday. Sitting at one of the five stools by the window at Tacos Acapulco— looking out on the check-cashing joints and Mexican panaderias that dot East Colfax Avenue— I took the opportunity to ask a burning question: “So why in the world do you want to carry a gun all the time?” I’d never knowingly been this close to a gun-carrier before, and it felt like my chance to ask something I’d always wanted to know. I could only hope my question didn’t feel to him like it does to our black friend Shayla when people ask to touch her afro. As he tried managing with his fork the melted cheese that refused to detach itself from the pupusa, he said, “Self-defense, and pride of country. We have this right, so we should exercise it. Also if someone tried to hurt us while we were sitting here, I could take them down.” It was a foreign worldview to me, that people could go through life so aware of the possibility that someone might try to hurt them, and that, as a response, they would strap a gun on their body as they made their way through Denver. I didn’t understand or even approve. But Clayton is my token conservative friend, and I love him, and he went through the trouble of taking me to the shooting range, so I left it there. * The week I went shooting with Clayton was also the week of my mother’s seventieth and my sister’s fiftieth birthday party. It was a murder mystery dinner, so, five nights after blasting paper targets with Clayton at the shooting range, I sat on the back patio of my parents’ suburban Denver home and pretended to be a hippie winemaker for the sake of a contrived drama. 
Normally, my natural misanthropy would prevent me from participating in such awkward nonsense, but I soon remembered how many times I had voluntarily dressed myself up and played a role in other contrived dramas that didn’t involve a four-course meal or civil company (like the year I tried to be a Deadhead), so I submitted to the murder mystery dinner for the sake of two women I love. My role called for a flowing skirt, peasant blouse, and flowers in my hair— none of which I own or could possibly endure wearing, so a nightgown and lots of beads had to do the trick. Throughout the mostly pleasant evening, I would see Mom talking to my brother out of the side of her mouth, just like she did when we were kids and she wanted to tell Dad something she didn’t want us to know. I watched my mom, unaware that an unscripted drama was unfolding around the edges of the fictional one that called for flowers in my fauxhawk. As I snuck into the kitchen to check my phone for messages, my dad followed me to fill me in on what was happening. It turns out my mom’s side-of-the-mouth whispers were about something serious. My mom had been receiving threats from an unbalanced (and reportedly armed) woman who was blaming my mom for a loss she had experienced. My mom had nothing to do with this loss, but that didn’t stop this woman from fixating on her as the one to blame. And she knew where my mom went to church on Sundays. “It’s made being at church pretty tense for us,” my father told me. My older brother Gary, who is a law enforcement officer in a federal prison and who, along with his wife and three kids, attends the same church as my parents, walked by Dad and me in the kitchen and said, “Horrible, right? The past three weeks I’ve carried a concealed weapon to church in case she shows up and tries anything.” I immediately thought of Clayton and his heretofore foreign worldview, weighing it against how I now felt instinctually glad that my brother would be able to react if a crazy person tried to hurt our mother. And how, at the same time, it felt like madness that I would be glad someone was carrying a gun to church. But that’s the thing about my values—they tend to bump up against reality, and when that happens, I may need to throw them out the window. That, or I ignore reality. For me, more often than not, it’s the values that go. My gut reaction to my brother’s gun-carrying disturbed me, but not as much in the moment as it would the next morning. * On the night of the party, I missed the breaking news that George Zimmerman, who had shot and killed unarmed teen Trayvon Martin, had been found not guilty on all counts. For more than a year, the case had ignited fierce debate over racism and Florida’s “Stand Your Ground” law, which allows the use of violent force if someone believes their life is being threatened. My Facebook feed was lit up with protests, outrage, and rants. I wanted to join in and act as a voice for nonviolence that week, but when I heard on NPR that George Zimmerman’s brother was saying he rejected the idea that Trayvon Martin was unarmed, Martin’s weapon being the sidewalk on which he broke George’s nose, well, my first reaction was not nonviolence but an overwhelming urge to reach through the radio and give that man a fully armed punch in the throat. Even more, that very week, a federal law enforcement officer was carrying a concealed weapon into my mom’s church every Sunday. 
Which is insane and something I would normally want to post a rant about on my Facebook wall for all the liberals like me to “like.” Except in this case, that particular law enforcement officer (a) was my brother, and (b) carried that weapon to protect his (my) family, his (my) mother, from a crazy woman who wanted her dead. When I heard that my brother was armed to protect my own mom, I wasn’t alarmed like any good gun-control supporting pastor would be. I was relieved. And now what the hell do I post on Facebook? What do I do with that? I also had to deal with the fact that I simply could not express the level of antiracist outrage I wanted to, knowing something that no one else would know unless I said it out loud: despite my politics and liberalism, when a group of young black men in my neighborhood walk by, my gut reaction is to brace myself in a different way than I would if those men were white. I hate this about myself, but if I said that there is not residual racism in me, racism that— after forty-four years of being reinforced by messages in the media and culture around me— I simply do not know how to escape, I would be lying. Even if I do own an “eracism” bumper sticker. The morning after the George Zimmerman verdict, as I was reflecting on what to say to my church about it, I wanted to be a voice for nonviolence, antiracism, and gun-control as I felt I should (or as I saw people on Twitter demanding: “If your pastor doesn’t preach about gun control and racism this week, find a new church”) — but all I could do was stand in my kitchen and cry. Cry for all my inconsistencies. For my parishioner and mother of two, Andrea Gutierrez, who said to me that mothers of kids with brown and black skin now feel like their children can legally be target practice on the streets of suburbia. For a nation divided — both sides hating the other. For all the ways I silently perpetuate the things I criticize. For the death threats toward my family and the death threats toward the Zimmerman family. For Tracy Martin and Sybrina Fulton, whose child, Trayvon, was shot dead, and who were told that it was more his fault than the fault of the shooter. Moments after hearing about the acquittal, I walked my dog and called Duffy, a particularly thoughtful parishioner. “I’m really screwed up about all of this,” I said, proceeding to detail all the reasons that, even though I feel so strongly about these issues, I could not with any integrity “stand my own ground” against violence and racism — not because I no longer believe in standing against those things ( I do), but because my own life and my own heart contain too much ambiguity. There is both violence and nonviolence in me, and yet I don’t believe in them both. She suggested that maybe others felt the same way and that maybe what they needed from their pastor wasn’t the moral outrage and rants they were already seeing on Facebook; maybe they just needed me to confess my own crippling inconsistencies as a way for them to acknowledge their own. That felt like a horrible idea, but I knew she was right. So often in the church, being a pastor or a “spiritual leader” means being the example of “godly living.” A pastor is supposed to be the person who is really good at this Christianity stuff — the person others can look to as an example of righteousness. But as much as being the person who is the best Christian, who “follows Jesus” the most closely can feel a little seductive, it’s simply never been who I am or who my parishioners need me to be. 
I’m not running after Jesus. Jesus is running my ass down. Yeah, I am a leader, but I’m leading them onto the street to get hit by the speeding bus of confession and absolution, sin and sainthood, death and resurrection— that is, the gospel of Jesus Christ. I’m a leader, but only by saying, “Oh, screw it. I’ll go first.” I stood the next day in the copper light of sundown in the parish hall where House for All Sinners and Saints meets and confessed all of this to my congregation. I told them there had been a million reasons for me to want to be the prophetic voice for change, but every time I tried, I was confronted by my own bullshit. I told them I was unqualified to be an example of anything but needing Jesus. That evening I admitted to my congregation that I had to look at how my outrage feels good for a while, but only like eating candy corn feels good for a while— I know it’s nothing more than empty calories. My outrage feels empty because what I am desperate for is to speak the truth of my burden of sin and have Jesus take it from me, yet ranting about the system or about other people will always be my go-to instead. Because maybe if I show the right level of outrage, it’ll make up for the fact that every single day of my life I have benefitted from the very same system that acquitted George Zimmerman. My opinions feel good until I crash from the self-righteous sugar high, then realize I’m still sick and hungry for a taste of mercy. * The first time I was asked to give a lecture on preaching at the Festival of Homiletics, a national conference for preachers, they wanted me to give a talk on what preaching is like at House for All. I wasn’t sure what to say, so I asked my congregation. There was passion in their replies, and none of it had to do with how much they appreciate their preacher being such an amazing role model for them. Not one of them said they love all the real-life applications they receive in the sermons for how to have a more victorious marriage. Almost all of them said they love that their preacher is so obviously preaching to herself and just allowing them to overhear it. My friend Tullian put it this way: “Those most qualified to speak the gospel are those who truly know how unqualified they are to speak the gospel.” Never once did Jesus scan the room for the best example of holy living and send that person out to tell others about him. He always sent stumblers and sinners. I find that comforting. Reprinted from "ACCIDENTAL SAINTS: FINDING GOD IN ALL THE WRONG PEOPLE." Copyright © 2015 by Nadia Bolz-Weber. Published by Convergent Books, an imprint of Penguin Random House LLC.

Published on September 26, 2015 15:00

Praying at the church of rock and roll: How John Lennon made me a skeptic, Morrissey made me a believer and “Exile on Main Street” never let me down

“We believe in nothing, Lebowski!” say the nihilists who harass The Dude in his tub. I always equated nihilism with punk rock — there’s that song “88 Lines About 44 Women,” by The Nails, where the lead singer runs down the list of the bad relationships he’s had -- Jackie the “rich punk rocker,” Sarah the “modern dancer,” Suzy the Ohioan Scientologist -- and then he points out that Terry “didn’t give a shit, was just a nihilist.” Whereas Suzy probably believed in quite a lot, some of it hard to wrap one’s head around, I always favored Terry. That single hit the radio in 1982, right around the time many Reagan-supported Christian fundamentalists were gaining power, profit and influence, especially via their Sunday morning cable and network TV exposure. In the Middle East, fundamentalist Muslims had succeeded in ousting the Shah via a passionate revolution. Scientology was also expanding. I think they annexed Tom Cruise during this era. There was even the Church of the SubGenius, led by pipe-smoking Bob Dobbs, which was appealingly satirical, though I wasn’t sure if it was a joke or not. Most of all, all that belief made me shrink. I envied those who believed... in anything at all. I was born in the late 1960s and raised “culturally Jewish," in that I loved baseball, jazz, Woody Allen and that mushroom and barley mixture they served us on holidays. I loved Anne Frank. I still have a certificate on my bulletin board which reminds me that I have a couple of trees planted in Israel, but I’ve never visited them and do not intend to now. I was Bar Mitzvahed on Coney Island (not in Luna Park or on the boardwalk in front of the corn and clams hut, but in an actual temple) but had no idea what I was saying during the manhood ritual. I read my Haftorah phonetically and, let’s face it, went through the whole thing so that I could have a dance party (“Tainted Love” was the big hit, as was anything off "Dare" and Yaz’s “Situation,” and I got the message Soft Cell and Phil Oakey were laying down easily). Nobody in my family pushed me toward any religion. I don't blame them. I actually felt lucky at the time. So many of my friends had no choice. If anything, religion seemed like a lot of rules and study to me, and like Joey Ramone, another secular Jew (I suspect), I didn't wanna be learned or tamed. I had a hard enough time mastering trigonometry and all the muffled lyrics to the new R.E.M. EP, much less the ins and outs of the holy Torah. This is a roundabout way of saying I, too, believed in nothing. And now, over 30 years later, I find that I am a middle-aged man who still believes in zip. The difference is that as I find myself on the wrong side of 45, I have only just started to wonder why. Autumn's kickoff this week brings, as it does every year, a new birthday (for me and for Gwen Stefani, who is only a day younger than I am), as well as the baseball playoffs and the hint of Christmas and Chanukah and Kwanzaa (seven weeks away — prepare yourselves, it's coming). Once again, I find myself surrounded on all sides by devout New Yorkers, and happy children on my TV. With regard to sports, I know people pray for the Yankees (or the Red Sox, or this year the Mets) to succeed while I can only grit my teeth. If there’s a hurricane, as we are in hurricane season, the believers pray for safety, while I stock cans of Heinz beans and try to remember where I placed my knife with the compass on the handle. They are sure of themselves and grateful for the gift of faith. I am ashamed of my lack of it.
I don't even have the option to be a proud atheist like Bill Maher or the late, great Christopher Hitchens. Atheism, to me, is just another form of belief in something greater than oneself, and the only thing that's greater than myself, the only way I excel, is by realizing just how much my head is filled with rock and roll facts, figures, theories and lyrics. I’m the Rain Man of pop, with no room for the spirit. It may try to enter me, but it will come up against a wall of British indie. I don’t blame my parents. I blame the Stone Roses for releasing a perfect debut album. I blame all four Beatles, and Bob Marley and Eric B. & Rakim and even They Might Be Giants. They've squatted in my soul where faith and belonging might have found a place to blossom. There's even room for terrible music there, but not for God. And even if I had a vacancy, I am pretty sure I would evict it after a while. For this, and I know this is somewhat of a psychopath's cliché, I blame John Lennon (that said, my favorite Salinger book is "Franny and Zooey"). Lennon made me the skeptic I am today. The cynic. The guy who loiters in the used car dealership of soulfulness without ever taking the keys. My father left the family in 1980 to become the ramblin’ gamblin' man that he remains today (I think he's still alive; I haven't seen him in 10 years). The old man was not cut out for a domestic life and a job that kept him from the track. Later that year, my mother woke me up one morning in early December, in tears. I thought she was going to tell me that my dad had died. A loan shark shot him in the belly, maybe? Or a horse broke from the paddock and trampled him. Maybe he was shot in a poker match like Stagger Lee. Instead, she told me that John Lennon, my hands-down favorite Beatle, was murdered steps from his doorway in Manhattan, about 45 minutes from my house. I’d never heard any solo Beatles songs. I was 11 and still just into a phase where I was collecting each Beatles record, and once you have your own vinyl copy of the White Album, complete with the poster… well, hell, you can spend days just looking at the collage. It takes months for an 11-year-old, even an already pop-savvy one, to fully absorb the double album. But after that, I began to collect Lennon's solo stuff and draw his name on my white Hanes tees. Soon, I happened to hear the song “God,” from his solo debut, 1970's "Plastic Ono Band." There are incredible songs all over that record: “Mother,” “Isolation,” “Well Well Well,” even “My Mummy’s Dead” (amazing nobody’s ever covered that one). But “God,” with its climactic list of things that John no longer believed in, seemed to fortify all my suspicions and faithlessness in a matter of seconds. Things John did not believe in: Magic. I-Ching. Bible. Tarot. Hitler (a relief). Jesus. Kennedy. Mantra. Gita. Yoga. Kings. Elvis. Zimmerman (Bob Dylan) and the Beatles. He believed in himself and Yoko, and that was it. That was “reality,” John promised. God was, according to the lyrics, nothing but a “concept by which we measure our pain.” He said it again for emphasis, but he didn’t need to on my account. Here was the same guy who, in his 20s, sang about love and peace. Now he was cleaning house, and it felt righteous to me. It helped me mourn him. But it also left me one suspicious little duck as I came of age. “It’s cool not to believe in anything,” I told myself. I not only renounced my faith (no press release), but I renounced Ringo, Paul and George. John was my guy, perfect, dead, a kind of saint.
Bob Dylan, who I was discovering at the time as well, was my guy too. Bob sang, “Don’t follow leaders," when he was a young man. Yeah, fuck those leaders. And watch the parking meters. I defined myself by how much shit I refused to take, from anyone, anywhere, anyhow that I chose. But John didn't even buy into Bob. Lennon was truly a love-and-peace kind of guy, for all his perfect vitriol and skepticism — remember, he dispensed with the Maharishi in a devastating three minutes in the form of “Sexy Sadie,” and did the same to Dylan by answering his Christian-period hit “Gotta Serve Somebody” with the venomous “Serve Yourself.” (“You tell me you found Jesus Christ, well that’s great, and he’s the only one. You say you just found Buddha and he’s sitting on his ass in the sun.”) Joni Mitchell was even better. She didn't even buy into John. "They won't give peace a chance," she sings in "California," on "Blue," "that was just a dream some of us had." Go Joni, raise your eyebrow for me. Ringo was still flashing that insipid peace sign every time he saw a camera and Paul was singing about "Pipes of Peace," and we all know George was following his path, so they had to make way for Joni, and Leonard Cohen, another perfect cynic ("Everybody knows that the dice are loaded..." he sang. Quelle surprise, Lenny). As I got a little bit older, by say 1984, I was whatever you call a white suburban kid who dresses in all black, with a black raincoat and spiky hair. A punk? A Goth? An American iteration of a rain-soaked British indie youth? I was not a skinhead or a Rude Boy or a hippie, I know that. But I could not go full-tilt into punk rock or follow the Dead or even reinvent myself as a B-Boy, because that would require allegiance to some kind of mini cultural ethos. Gang of Four loved a man in uniform, but I didn’t trust people in culture drag, whether they were cops or B-Boys. Yes, my raincoat was in its own way Mancunian culture drag, but it felt plain enough that I gave myself a pass. My sister owned tie-dyed t-shirts. Other kids I knew dressed like Michael Stipe. I dressed like a postman from Salford. I was a fan of no one. TV shows would let me down, authors were never as good as people said they were, and Echo and the Bunnymen, U2, Depeche Mode and Siouxsie started to suck the more popular they got. It took a lot for a band (or a writer or a filmmaker, or, you know… a young woman) to penetrate my reinforced, barbed-wire-covered wall of perfect doubt. Only one band got in, really. One in hundreds. If there was a wrench in my perfect "I am a rock, I am an island" status, it came from my beloved land of raincoats. In 1984 and 1985, The Smiths (Morrissey, Johnny Marr, Mike Joyce and Andy Rourke) seemed to pass right through my wall of bullshit-proofing like vapor, and swirled around me until I was dizzy and swore my first allegiance to a band in the better part of a decade. The Smiths confused me because I couldn't tell if they believed in anything at all or were just as skeptical as I was. Morrissey was contradictory. "My faith in love is still devout," he sang, but earlier, he'd dismissed the same emotion as a "miserable lie." It didn't matter — part of the reason I couldn't take my eye off them (in addition to the fact that they were amazing to just ... observe) was to figure out where they truly stood: on my side of philosophy or with the believers? “If it’s not love, then it’s the bomb that will bring us together,” he promised. Did that mean that love was necessary? Or was the bomb just as acceptable?
("Come, come, nuclear war," he pleaded years later as a solo artist.) By college, I had every chance to be changed. I was happy there (a small liberal arts school in Vermont with a modicum of notoriety, thanks to a certain book by a certain writer) and I was doing enough acid that I could have probably been indoctrinated into the Manson family and died for Charlie, but I was also listening to a lot of indie rock and most of the singers of this indie rock employed  this thing called irony ("does anyone remember irony") as they sang their mostly deadpan lyrics (Camper Van Beethoven and the like). It was no call to arms, this stuff; nothing to take literally. It was safe.  I didn't really have to take any skinheads bowling. I could stay in my perfect bubble and quietly rue anyone who believed in anything at all. Vegans. War protestors (come the Gulf). All causes, good or bad, were bound to be counted out as bullshit before too long. Any mode of spiritual salvation was a money grab, if you asked me then. Three-card monte. Squeegee men. Prophets. They were all the same. By graduation, when I was on the cusp of becoming a published writer/artist myself, Trent Reznor was busy screaming “God is dead and no one cares, if there is a hell, I’ll see you there.” I liked that. No one cared. I didn't care. And yet I was still too much of a wuss to join the Atheists. Trust no one, ran the "X-Files" catch phrase, which I amended to, "not even those who trust no one."   One day in my 40s I asked my shrink what I should do about this problem, because nobody wants to start feeling their mortality without at least some kind of insurance policy for the soul. He suggested that I meditate. He gave me a mantra (a Sanskrit word with literally no meaning, he said) and I attempted it dutifully, but as I sat there, cross-legged on my floor, breathing in and out and reciting this word, my mind began to wander and I started to think about balance, and how it’s dangerous to have too much faith and dangerous to have too little and that nobody really has no faith at all and then I started thinking about "Led Zeppelin 2" and how it's such a great record to play in the fall ("leaves are falling all around...")   After almost 20 years of working in rock and roll it began to occur to me: maybe that alone was what I believed in, and I was facing a real paradox — if you believe in nothing because of rock and roll, doesn’t that really translate into an utterly devout and worshipful relationship with rock and roll itself, one as devoted and unbreakable as any faith bond that's more widely accepted?    After all, no matter where you go or how base you are, it's always there. Bob Seger sang “Rock n’ Roll Never Forgets,” and even more convincingly KISS (via Argent) sang “God Gave Rock 'n’ Roll to You."  Yes, those songs kind of blow (well, Seger has better ones anyway) but the point held water. So I could not be a Rastafarian, a Satanist, a Buddhist. I would be a Rock 'n’ Roll-ist. I search for meaning in the grooves of "Imperial Bedroom" or "Bookends" or "There's A Riot Going On" because I believe there is meaning there. Surely Elvis Costello would frown upon me deifying him, and Tom Waits, too; and if he were alive, Kurt Cobain would be horrified, as would Nick Drake (although it probably didn't take much to horrify that guy), but that's who I am and as I push 50, it's getting high time to just accept it. 
Rock and roll made a skeptic out of me, but it also gave me my own private temple where the B-52s are welcome, as are Matthew Sweet and even The Eagles (actually just Eagles, if you ask them). I never stop trying, of course, to get with the acceptable faiths or bring others around to mine. After we broke up, my ex-girlfriend, another rock writer of some note, got heavily into yoga and the culture of yoga that goes along with it, and it felt like I lost her twice. I started dating another woman who was obsessed with her gluten intake, and when I said "That's all bullshit," I lost her too. It became clear that at this rate, I was destined to end up with nothing but my records. So I began to wonder: is that all bad? Little Richard gave up rock and roll for God, but when I hear his music I hear and feel something that I can only identify as joyful and powerful and holy. And so I played her Little Richard. A lot of it. And some rocksteady. Some Madness. The Selecter. And to her it was just music. Good music. It wasn't sacred. I should have gone with Springsteen first. If I have a point here, it's this: who is to say that the full-body warmth and tingle that takes over when I hear "Led Zeppelin II" or "Astral Weeks" isn't a kind of small rapture, a little strain of the great surrender that the devout feel? I certainly give myself over to it. It has its way with my mood and my body. Maybe I’m the most spiritual person I know; I just turn to "Revolver" instead of the holy texts. "Revolver" is a holy text. Only one question remains: What is the leap of faith required when you have tried and failed to revere anything but rock and roll on a spiritual level? I guess it’s that much of rock and roll is flawed and derivative, but then, you could say that about all faiths. Atheism, nihilism, any other -ism that rejects these things outright is certainly something to be respected, but if you’re looking and wondering, eventually, you gotta make the leap and hope your entire belief system isn’t shattered by bad EDM or another inscrutable Radiohead album — which could send me right back into the Temple proper. I know it’s there for me as an option, but so is my copy of "Exile on Main Street," which has yet to let me down.

Published on September 26, 2015 14:00

Can a thinking person still have faith? My skeptical, honest quest for religious answers

"I do not believe in God and I am not an atheist." -- Camus When I searched for God I found a man who looked like Will Ferrell. This search I’d kept secret. God, these days? Are you nuts? “Think what times these are,” Saul Bellow wrote—a generation ago; it’s worse today. How, against a contemporary background, do you contemplate the almighty? Who believes there’s an oasis in 2015’s scattered metaphysical sand? But I had a reason to search. A few years ago my 3-year-old son couldn’t rise from bed unaided, his walk was a limp. Our pediatrician frowned when breaking the news: Rheumatoid arthritis. And in that shiver of change I glimpsed, for my son, for me, a long line of vanishing possibilities. (The disease was most likely permanent.) And it was the same old dumb supplicant’s story. Gulped tears, inhibition, begging made ritual. Please, please, God—the tonality of fraudulence. Because how could I (secular, religiously illiterate I) talk to God? What could be said that wouldn’t embarrass my intelligence, or His (presuming I actually believed)? This was a new feeling. I’ve been a writer my entire adult life, but I didn’t have any language. At the moment it really mattered, whatever fluency I'd had came pretty much to zilch -- to smoke and cobwebs. And yet now my son was in bad shape. And so I asked God to help him.  And my son got well, in an almost miraculous fashion. It happened in an unlikely way: A stranger’s advice, skeptical doctors, a full recovery. (My wife wrote about it for the Times.)   This ecstatic surprise struck from me a real desire to thank, and to understand, and to believe more fully, in God. I prayed—self-invented, unschooled prayers—and I did nothing more than that. I did nothing, that is, to further my understanding, or even to deepen my thoughts about God, or belief in Him. Two years later, my wife found herself — a lapsed Presbyterian — diagnosed with an invasive melanoma. Time, again, to ask for mercy. I began going to temple. Or, I tried to. I'd visit synagogues and hear rabbis and I found all of it off-putting. But I was thirsting after guidance.  That's what it felt like, a need, centered in the throat. This is a universal deal, of course. Trouble = need. As it is with most of us, it is with me. I’m better suited to searching for answers than to the mundane facts of participatory worship. But next to my wife in bed, I felt like an ant colony: So many little black questions in me, hopes and doubts, crawling every which way. And then, somehow, a second time, my prayer was answered. My wife after a succinct surgery was given a clean bill of health. This set in train a search. I needed to find out more about God. Can you have faith today and remain a thinking person? I approached religious leaders: Riverside Church’s first Minister of Social Justice; the president of a Jesuit university; the head rabbi of Brooklyn’s leading reform congregations; a renowned Buddhist, etc.  I wanted only to have a back-and-forth about how to believe — a heavily secular Jew and a number of influential faith-thinkers, hashing out the how of it, discussing whether religion can work in a digital age, to be published in an epistolary way. What follows is the first of these exchanges—my interaction with Erik Kolbell, who looks like a bearded (handsomer) Will Ferrell, and is just as likable. He’s also a brilliant guy—a writer, psychotherapist, Yale Divinity School graduate, and ordained minister, and the first Minister of Social Justice at Riverside Church in New York City. 
(I became aware of him when I saw him on Charlie Rose talking about his inspired book "What Jesus Meant: The Beatitudes and a Meaningful Life.") Kolbell sat waiting for me in the chosen restaurant, a hand-written sign, scrawled in rickety ink, propped on his glass. “ERIK KOLBELL.” When he smiled there was a certain Muppetish softness to the mouth —ah! Here you are, sit, sit— a smile almost comic in its munificence, in its knowledge that munificence is, nowadays, a pretty rare article of trade. What follows is a transcript of our written correspondence. The questions are mine, the answers are his.

One of the things I was struck by when I went to services—Reform Jewish, Presbyterian and Catholic—was how much time was spent, in elegant ways, asking for things. Generally, these were inoffensive and general things. "Please protect us, oh Lord." But that gave me pause just the same. How can we—a comfortable New York flock, for example—ask for anything, when we know there are so many who have it worse than we do?

I share your quandary on this issue, Darin, not only because it invites selfishness at the expense of compassion, but because it runs the risk of promoting an unhealthy understanding of the relationship between the individual and God. Even asking for innocent things, such as good health, suggests a kind of magical thinking that I don't necessarily subscribe to. If I can pray to God for health, then why not wealth as well? And if so, is God then simply being defined as a dispenser of goods and blessings? A generous grandfather, pockets stuffed with candy, eager to dole it out if only we ask nicely? Rather, by way of prayer, I am fond of the first three words of Psalm 119:36, which reads simply "Incline my heart…" For my money, this is what we can justly ask of God: that our consciousness of God is such that our hearts are inclined toward those things that promote personal integration, deepened faith, human justice and universal mercy. In answer, then, to the question you pose below, with the Bellow quote, I believe prayer is the endeavor on the part of the individual (or the community) to dispose ourselves to a deepening consciousness of what God might want of us in order that these things — integration, faith, justice and mercy — are made manifest in our lives.

So, then, you believe the Almighty wants something from each of us? Which leads to a bigger question: To what degree do you think God is involved in every person's life?

I am hesitant to impute any human characteristics to God, so even the term "want" makes me a little jittery. As one theologian put it, "To say that God is love is really to say 'I experience what I call love from what I call God.'" It is the subjective experience and articulation of the objective reality of God; the infinite heavily filtered through layers of finitude. With this in mind, I would argue that our highest calling is to ascertain what it means to live a full and compassionate life, and then to aspire to live it. We could say that this "pleases God" or we could say that it puts us in company with the manifestation of the ultimate good (or Good). As to your second point, I do not believe in a Grand Manipulator who sees to it that this or that team wins the big game, this or that town is spared the wrath of the tornado, this child lives and the other dies as a result of the same illness.
I do believe that we can effect both good and ill on earth, and, as pertains to the question of inexplicable and arbitrary suffering, while we cannot explain it (to do so is to demean it) we can redeem it. And redemption is a holy task.

Much of the rest of the services were spent finding different ways to praise the Almighty—and the Almighty asking us for praise. Why would a divinity, by definition majestic, and the author of all the splendor and complex genius of creation, be hungry for so much praise? That seems to underestimate God—cheapen Him, make Him smaller with insecurity. As does the subject of my first question. I mean, it seems organized religion too often sees God as a praise-hungry personal assistant: someone who spends all His time doing chores for people and wanting credit for it.

The flip side of asking God for things is praising Him for the riches we enjoy. If the praise is at God's bidding, then we are indeed cheapening (not to mention anthropomorphizing) God. But if some hymns, prayers, etc., are simply meant to be communal expressions of appreciation for the gift that is life itself, then they serve to keep us apprised of the difference between gratitude and entitlement. To the point of loving God for His sake, I think I mentioned to you my problem with heaven and hell. If we see them as rewards and punishments, then they serve to arrest our moral development. Any time a reward is attached to a moral gesture, the gesture becomes cheapened. Charity — caritas — is reduced to a quid pro quo. There is nothing inherently moral about feeding the poor and visiting the sick if my motivation is ultimately selfish. Better that I do it simply because it must be done. As La Rochefoucauld put it, "We would frequently be ashamed of our good deeds if people saw all of the motives that produced them." And compare it to Tolstoy: "It is much better to do good in a way that no one knows anything about it."

I want to ask you about a Wright Morris quote: “The purpose of religion, quite simply, is to dispense with the problem of death.” You mentioned your doubt about heaven and hell. Do you think there's a life after death -- or, more to the point, is it imperative that religion have an answer?

I think there are many believers who are agnostic about the question of an "afterlife," just as there are more than a few atheists who are similarly agnostic. We just don't know, now, do we? I am reminded of an old story of a rabbi talking to an unborn baby and telling the baby there is a magnificent world awaiting her. The baby responds: "Here's what I know. I know that I have every need met exactly where I am. I know that I have food, shelter, and comfort. What I don't know is what, if anything, lies at the other end of that tunnel. I don't know, because I have no empirical experience of it." This is all to say that I think it a naïve hope to believe in a chunk of celestial real estate with winged angels and lilting harps. (Notice that nowhere in the Bible is there any real description of what the afterlife looks like.) But I also think it betrays a kind of arrogance to argue that the only reality that exists is the reality of our own perceptual experience, that perception gleaned and sorted out by the 10 percent of the brain we actually make full use of.
I am wide open to the possibility of a "beyond," which, again, we freight when we use language like "life after death," simply because it can suggest that whatever might lie beyond in some way resembles what we understand when we use the word "life." Think Timothy Leary, at the very least.

Published on September 26, 2015 12:30