Helen H. Moore's Blog, page 843

March 6, 2016

Farewell to my face: I’m middle-aged and I look it — but don’t ask me to like it

A couple of years ago, my teenage daughter took my photograph on our front stoop, and the pictures showed a diagonal gash running down and out along either side of my nose. “The light’s bad out here,” I said, so we moved to the patio. She snapped more pictures. The gashes were still there. I rotated 90 degrees. Still there. I think you know where this is going. I’d heard of this happening to other people. “That’s around when she started losing her looks,” I remember an old boyfriend saying about Mary Tyler Moore while watching a rerun from the third season of her show.

Heeding my mother’s advice, from the time I was 30 onward I purchased and used eye cream—my one concession to the sketchy age-defying-cosmetics industry—but that felt like superfluous insurance given the coup of genetics that seemed to be my birthright. I’d been telling everyone for years—literally decades—that I would age splendidly by virtue of the fact that my paternal grandmother had aged splendidly, and didn’t everyone say that I look just like my dad? “Is that your daughter or your granddaughter?” a stranger in line with us at the grocery store asked Teta when she was long a Miami retiree. And we’d all heard (because she told us) that Teta had been challenged at some AARP event because one look at her told the gatekeepers that she wasn’t old enough—except she was.

It never occurred to me that looking my age would be my fate. That’s because it never occurred to me that my father looks like his father. How could this—this pole vault from one side of middle age to the other, right there on my patio—be happening when I felt so young? When someone asks me how old I am, my first thought is always “Twenty-six,” and then I have to pause and recalibrate until I get to the correct number: 48, currently. True, I dye my hair, but I’m not yet menopausal—“Still very much menstrual, thank you,” as "Ab Fab’s" Edina Monsoon would tell you—so I wasn’t ready to look middle-aged.

But now that I think about it, there’s evidence that I saw it coming. You should get a load of the pictures that a photographer friend took to promote my book three years ago. In nearly every one of several dozen shots, I’m making the face that I’ve seen women and men of a certain age make—a variation on Blue Steel, without quite so much eyebrow: a slight pucker in order to grow instant cheekbones and to suggest the precipice of smiling without actually smiling, because as everyone knows, smiling means wrinkles around the eyes. (You knew that, right?) I’ve since caught others assuming this face—a realtor neighbor who emails me her newsletter straps it on. I think I even caught the formidable Katrina vanden Heuvel doing it in the pages of the Nation.

Unlike Katrina vanden Heuvel, I was never truly beautiful; what I had going for and against me was outsize features. (Having ancestors from the Middle East will do that to you.) My features are all a reasonable number of millimeters away from one another—random geometric luck, as I remember the model Paulina Porizkova describing it with semi-heroic humility. I do have a dud feature—a big, right-triangular nose—but I can view it only in profile, which requires some maneuvering of bathroom mirrors; I’m neither a masochist nor especially dexterous, so I have done this rarely. Hence I grew up with a false sense of dud-feature-lessness, which probably led to an overestimation of my physical attractiveness, which worked for me, if finding myself consistently paired with good-looking guys is any gauge.
None of this should matter now: I’ve been married for 15 years, so I’m no longer fishing; who cares if I have shitty bait? Self-servingly, I think Joan Didion is correct when she says that to our lifelong mates, our faces are frozen in time as they were when we met. (A few years ago I described my husband to someone as having black hair, only to have it pointed out that, actually, it’s gray.)

So why am I devastated by the loss of my former face? Because it hurts to know that I’ve permanently lost something from my arsenal: it’s no longer rational to imagine myself on the receiving end of the movie cliché in which someone lays eyes on someone for the first time and is undone. I have the feeling of having been robbed of a superpower: I have lost the power to turn a stranger’s head. It happened in slow motion right before my eyes a few months ago. I had been corresponding with a 30-ish filmmaker by email, and when my husband and I met him in person a few days later, I saw something in the guy’s face, by which I mean I saw exactly nothing in the guy’s face: an utter lack of registration of anything other than “human animal” when he saw me. (Yes, I’m pretty sure he’s straight.) This didn’t use to happen. And there’s nothing I can do about it. Even the sudden onset of stupendous wealth couldn’t save the day: the best that even Marlo Thomas can achieve is a face that—forgive me, dear Marlo—looks pinched and pulled, like a taffy mask. That is not necessarily better than looking one’s age.

I can usually find a gender issue in a haystack made up entirely of non-gendered issues, but for me, mourning my former face is not a feminist issue: to me, men my age are looking just as knackered as women my age are. And I notice that when my head swivels in admiration, it’s always in the direction of a guy who looks 30 or younger. I had a scare once: some years back, at a reading given by the New Yorker’s Bob Mankoff, who told us he was nearly 70, I found myself deeply attracted to him. I was seized with despair at the wrongness of this—would a straight middle-aged guy’s eyes turn into pinwheels at the sight of a pushing-70 female New Yorker editor?—until I realized that, tall and skinny, with glasses and center-parted flappy gray hair, Mankoff looks exactly like my husband. (Meaning my husband in 20 years.) So it’s not a feminist issue: It’s a shallowness issue. That’s a relief.

For a year or so now I’ve been doing a Will Ferrell Watch: He’s almost exactly a month older than I am, and I figure that when he’s looking long in the tooth, I must be too. (Surely one of the reasons agents have their clients photographed as often as possible is to minimize the chance of shocking the public with what would seem to be an aged-overnight face.) And I’ve grown dogged—fiendish, even—about sizing up famous people who are approximately my age. Julia Roberts—I have two months on her. How’s she holding up? That middle-aged talking head I saw on "Charlie Rose" the other night—I’ve since forgotten his name; that’s not the point—looked uncontroversially older than I am. But I Wikipedia’d him, and guess what? I have two years on him. I just saw a recent photo of the dapper actor Robert Sean Leonard, and I would have bet a kidney that he had a year on me: in that photo, he looks just like a dad. (Is there a look that screams “middle age” any louder?) But no: I’m a year and a half older. And get this: for the entire run of "The Brady Bunch," the grandmotherly Ann B. Davis, who played Alice the housekeeper, was younger than I am now. That makes me potentially too old for Sam the butcher. Revelations like this require some uplift in the form of thoughts about actors a bit older than I am whose faces persist at their job of looking good: Robin Wright, Michael K. Williams, Cynthia Nixon, Benicio del Toro. That means there’s hope for me. Right?

You may be pleased to learn that Teta, who did eventually lose her looks, lived to be 99, and toward the end of her life she was known to say, “Every day is a day against me.” She was referring to her health—her hearing was shot, her agility wasn’t what it once was—but her slogan is true for one’s looks as well. Maybe good-looking people just seem to age better than others do, just as I’ve always suspected that “photogenic” people are just people who are better looking than the rest of us, including when they’re not being photographed. Aging seems to be life’s one truly democratically administered cruelty. What feels like unfairness to me right now may in fact be the ultimate in fairness: time, that bastard, ticks away at the same rate for each of us.

If you’re expecting me to end this essay on an uplifting note—“I’ve come to appreciate my inevitably middle-aged face, which shows proof of hard-won wisdom and a well-lived life”—you can forget it. I will never not blanch at photographs showing the accents grave and aigu on either side of my nose, not to mention my multi-circumflexed forehead. That’s decrepitude, not character. Fuck that. Being chagrined to be visibly aging may not be a feminist issue for me, but what I do with my fretfulness could turn it into one, so I’m being careful. Here’s what I’m going to do: If I’ve learned nothing else from Heather Locklear (and who hasn’t learned something from Heather Locklear?), I’ve learned that one should sleep on one’s back to minimize facial slumpiness over time. I’ll continue with the eye cream because otherwise I’d have to admit that it probably hasn’t worked and that I’m hundreds of dollars poorer with nothing to show for it. I could do as Candice Bergen suggests and gain some weight, which would seem to have the effect of filling in facial crevices, but I don’t like food all that much. I could grow bangs to cover the parallel bars in my forehead, but I’ve tried bangs before, and I always know exactly what I’ll be doing the day after I get them: growing them out.

Do you know what the best psychic balm is for me? Watching old movies. Did you know that Claire Trevor was younger than I am now when she played Natalie Wood’s mother in "Marjorie Morningstar"? That’s depressing, but at least I have the advantage of not being dead, and I’m sure you agree that dead isn’t a good look.

Published on March 06, 2016 15:30

A “final girl” who gets to get off: “The Witch” proves nothing’s scarier than an unapologetically liberated young woman

Note: The following contains many spoilers for "The Witch," including a description of the film's ending.

“Wouldst thou like to live deliciously? Wouldst thou like to see the world?” Posed by anyone other than the Prince of Darkness, these questions would seem to answer themselves, especially when intoned into the ears of an emboldened young woman. Toward the end of “The Witch,” Robert Eggers' lyric flirtation with horror, the welcome response is a wide-eyed “yes,” seguing into a finale at once awesome and unsettling. Everyone knows that the virgin makes it out alive, but seldom does survival come with an orgasm. Here, that’s exactly what happens: The “final girl” finally gets to get off.

Critics have already praised the film as a feminist fable about the combustibility of female sexuality, a “sinister, smart” approach to genre, and even “the year's scariest movie.” Scariest movie of the year? No scarier, I'd say, than a day in the life of a teenage girl in the throes of burgeoning womanhood and its attendant threat of sudden ostracism. Most of these superlative verdicts seem to be delivered by spooked-out men drawn to (and repulsed by) the film's loaded images, and less attentive to Eggers’ subversion of a classic horror trope: the “final girl” laced into a Puritan bodice. As Erik Piepenberg put it last year in the New York Times, “the final girl ... is the feisty character who’s left to face the killer in a horror movie. To cheers from the audience, she usually wins the climactic combat with weapons and wit, providing a cathartic end to the gore and gloom.” The difference in “The Witch” is that climactic combat concludes in midair climax, and that not everyone is cheering at such literal comeuppance.

Dividing audiences since its national release two weeks ago, the Sundance favorite and self-described “folk tale” initially seems to confirm a fairly tired vision of Wanton Womanness—the “witch” in the woods shifting predictably between Snow White siren, Red Riding Hood vamp, and has-been Fredericks-of-Hollywood hag. But the real rebel—Thomasin, played by Anya Taylor-Joy—anchors the film from its opening shot, a medium close-up of her attentive face as her family awaits banishment. “What went we out into this wilderness to find?” demands her father, William (Ralph Ineson), before a chilly congregation. We will soon learn that this question is not rhetorical.

Though she is mostly a figure of virtue and daughterly obeisance, it gradually becomes clear that Thomasin’s sharp mouth and modest cleavage are too much for the homestead to handle. Following orders, finishing chores and telling the truth don’t score her many points, and it is often the last of these for which she is most scorned. “You’ve cursed this family,” spits her mother, Katherine (Kate Dickie), following the death of her first son (Harvey Scrimshaw), while her other two children, Arbusonian twins, scheme creepily offscreen. “The devil hath betwixt your tongue,” her father cries when Thomasin calls him a hypocrite, a liar and a poor shot in a Calvinist war of words. Before being dragged across the yard and into the home for further interrogation, she pleads, “Why have you turned against me?”—a response not only to accusations of witchcraft, but perhaps more poignantly to her parents’ ongoing plot to shop her out to another family, her sexuality too dangerous and her presence too strong to keep around the farm, marriage or servitude her only shot at survival (and that’s if she’s acquitted of wickedness).
In a long take a few scenes later, Thomasin lies on the ground beneath her mother’s nightgowned body, the two conjoined as though in sleep. After 25 seconds of brutal silence, Katherine is shoved off like the dead weight she has become. Our heroine rises, blood on her blouse, filicide averted.

Sound terrifying? We need not look back to the 17th century, nor to the 300 years that followed, to show that independent women might prove a fright. Anita Hill, Monica Lewinsky, Hillary Clinton—America loves its witches, so long as we’ve adequate kindling nearby. (Is it any coincidence that the one thing Thomasin’s father can provide is a towering pile of firewood?) What’s remarkable about “witchiness” is how perfectly it conflates women with little in common but a sense of self-worth, a modicum of ambition, and the audacity to express the two sans permission from a man. As Rebecca Traister points out in her latest book “All the Single Ladies: Unmarried Women and the Rise of an Independent Nation,” these women continue to disrupt our country’s status quo, reshaping it in their wake. In a recent interview on "Fresh Air," Traister discusses how, whether by choice or fate, life without men has historically been seen as “scandalous” in the “best case scenario,” a violation of the law of coverture, wherein a woman’s selfhood was legally subsumed by the identity of either her father or husband. But the last two decades have witnessed a seismic shift in this dynamic, Traister says, with the “creation of an entirely new population: adult women who are no longer economically, socially, sexually or reproductively dependent on or defined by the men they marry.”

With the rise of this “new nation,” might a new breed of final girl follow suit? The trope seems to be having a moment, if not in the most traditional vein. In Guillermo del Toro’s “Crimson Peak,” the righteous Edith outlasts her demonic hubby (Tom Hiddleston) and his malevolent sister-lover (a raven-haired Jessica Chastain). Though technically not a virgin by the end, in proper final girl fashion, she kills them both in the pure-white snow, her flaxen hair billowing in the wind (not unlike Thomasin’s after knifing her mom). A more contemporary incarnation of the virtuous final girl can be found in the Ma character from “Room,” in an Oscar-winning performance by Brie Larson. While the film is not billed as horror, its veritable monster, “Old Nick,” is a lot scarier than Satan, enslaving a teenage (and presumably virginal) Joy and then impregnating her with the child who narrates much of the story. A good girl to the core (she is initially tricked and abducted via sympathy for a stray dog), Joy escapes Room with her innocent son, Jack (Jacob Tremblay), in proper final girl fashion. Still, the fact that her father—played by a surprisingly unlikable William H. Macy—won’t acknowledge his grandson because of his origins in rape throws into relief paternal fears over sex gone wrong, the virgin/whore dichotomy rearing its ugly head.

For better or for worse (and many would say for the better), “The Witch” is as much a genre piece as it is feminist psychodrama, and doesn’t shy from a certain bald (occasionally ribald) tone. In the film’s final segment, as they are trapped in a pen with “Black Phillip,” the family’s beloved but bedeviled goat, Thomasin asks the befuddled twins if either is a witch.
In a kind of gallows humor (two brothers have already gruesomely perished, and the three surviving siblings are “grounded” in a manner that makes missing prom seem enviable), one twin counters, “Are you?” But the film is also scary at times; it is bloodied apples, branded brothers, and newborns churned into body wash. It’s the threat of starvation, rotted crops, a cracked egg oozing a half-formed chick. But it also doesn’t have to be taken any more seriously than its endorsement from the Satanic Temple (a clever move for a group attempting to rebrand “Satan” as a route to individualism). And despite the strong reception to Eggers’ debut, the director announced this week that, flouting genre expectations, there will be no sequel. The film’s last shot is all we get.

“What do you want?” a man asks a newly orphaned Thomasin at the film’s conclusion, imploring her to “sign the book.” It is the sole time in the entire film that her desires are validated, that the very concept of female desire is not immediately dismissed. Who can blame her, then, for selling her soul and deciding to join what one critic calls “a furious Black Sabbath nightmare”—a nightmare, I might add, that resembles a womyn’s music festival as much as anything else? Stripped of her shift in a scene less lurid than luminous (no full frontal here, a refreshingly prudent decision, as was the choice to shoot Taylor-Joy in medium close-ups for most of the film), Thomasin challenges the slasher cliché that good girls might outlive their friends (and family and livestock) but never get to have fun. In the end, she floats among the pines, arms outstretched and head thrown back, as the others howl above the flames. They’re crazy and scary and having a blast. She is ravished, laughing up at the stars, her face softly glowing. “Single female life is not prescription,” claims Traister, “but its opposite: liberation.”

Published on March 06, 2016 14:30

Employers are using credit checks against otherwise qualified workers. Here’s what we can do about it.

Over the last decade, an increasing number of cities and states have passed laws limiting the use of credit checks in hiring, promotion, and firing. These laws have been motivated by the reality that personal credit history is not relevant to employment, that employment credit checks prevent otherwise qualified workers with flawed credit from finding jobs, and that unemployed workers and historically disadvantaged groups, including people of color, are disproportionately harmed by credit checks. Our new report, "Bad Credit Shouldn’t Block Employment," shows that credit check laws work, but also finds that there’s far more to be done. Further, New York City’s recent, comprehensive law offers the best legislation on the books for other states to follow – and improve on.

Laws banning credit checks work

New research by economists Robert Clifford and Daniel Shoag suggests that credit check laws are effective at increasing employment among those with low credit scores. The authors find that credit check laws lead to a 7 to 11% reduction in the use of employment credit checks, a significant reduction, though it’s clear that many employers still use credit checks even with the laws in place.

Demos’ research suggests three reasons why the use of credit checks persists. First, state bans include many exemptions that allow employers to continue checking credit as part of the hiring process, even though these exemptions are not justified by any empirical research showing that credit is relevant. Second, the laws are insufficiently publicized, so employers and employees may not realize that employment credit checks are prohibited. Third, a lack of resources has rendered enforcement of workplace protections very weak.

Despite these shortcomings, Clifford and Shoag find that laws banning credit checks succeed in increasing employment in low-credit census tracts by between 2.3% and 3.3%. They also find that the largest impacts were in the public sector, likely because this is where compliance with the law was highest. The authors find that companies shifted away from credit information to other indicators such as college degrees and prior work experience. Given that these factors actually contain meaningful information about job performance (while credit checks do not), this is preferable to the previous status quo. Credit check laws are not a panacea for discriminatory practices, and other regulation may be needed.

Laws banning credit checks could actually work even better

In our new Demos report, "Bad Credit Shouldn’t Block Employment: How to Make State Bans on Employment Credit Checks More Effective," we show that while most credit check laws are a key step forward, they can be strengthened in a few ways:

- Reduce the number of unnecessary exemptions
- Create robust enforcement mechanisms
- Increase public awareness of the laws

Unnecessary exemptions plague employment credit check bills. In Vermont, Helen Head, chair of the Committee on General, Housing & Military Affairs, tells Demos, “We are concerned that the large number of exceptions may make it more difficult to limit the practice of employer credit checks.” Her concern is well-placed. Legal scholars James Phillips and David Schein note that the exemptions, because they are often so broad, have “virtually gutted” the restrictions.
For instance, seven states allow credit checks if such information meets the vague standard of being “substantially job related,” and six allow credit checks for positions that involve access to money. These exemptions are unwarranted. TransUnion, a major credit reporting company, recently admitted, “we don’t have any research to show any statistical correlation between what’s in somebody’s credit report and their job performance or their likelihood to commit fraud.”

In many cases, states were doing very little to enforce the laws, usually because they are only required by law to respond to complaints, and they receive very few, if any. In Connecticut, after four years there had been only two complaints, and neither was found to have merit. The two complaints Maryland received were both resolved informally (without citations or fines). Oregon provided Demos with data on all cases filed since its law’s passage: eight in total, one of which had been settled privately, another withdrawn to court, and a final one ending in negotiated conciliation. Elizabeth Funk of Colorado’s Department of Labor tells Demos that the department had received between 10 and 20 complaints, about half of which had led to investigations. As of yet, no fines had been levied, but the department reported being in the middle of investigations.

This is not to fault the government agencies tasked with enforcement - given the wide range of activities they oversee, credit checks are likely to be de-prioritized because there simply aren’t many complaints. And given the deep austerity afflicting state governments, resources are scarce. In addition, credit check laws don’t give the agencies the power to initiate investigations, which means that public awareness is key. But beyond brief press coverage at the passage of the laws and some information on state government websites, there has been very little effort to publicize them.

New York City shows how to get credit-check laws right

In 2015, New York City enacted the most robust employment credit check law on the books. The law was won by a diverse coalition including labor, community organizations, civil rights groups, students and consumer groups. While the law still contains exemptions, these are narrower than those in many other states’ laws; they were the result of local political compromises and should not be considered a model for future legislation. The law also includes strong penalties - up to $250,000. In addition, New York City undertook an extensive public awareness campaign, including ads on subways and buses, as well as a social media campaign built around the hashtag #CreditCheckLawNYC. The NYC Commission on Human Rights created a webpage clearly explaining the implications of the law.

Conclusion

Credit checks are a racially discriminatory and unnecessary qualification for employment. States should implement laws that restrict the practice without unjustified exemptions. New laws should include robust enforcement mechanisms and should be broadly publicized so that workers, job-seekers and employers alike are aware of these important protections.

Read the full report: "Bad Credit Shouldn’t Block Employment: How to Make State Bans on Employment Credit Checks More Effective."

Published on March 06, 2016 12:30

Hillary’s “House of Cards”: What Claire and Frank Underwood tell us about marriage, gender and the White House

We’ve no lack of power couples to examine in the English-speaking world today: Prince William and the princess formerly called Kate Middleton; Kim Kardashian and Kanye West; Beyoncé and Jay Z; President Barack Obama and first lady Michelle. Marriages that are political, professional and personal, uniting two people in ambition, sex and maybe some other stuff, too. But the 2016 election has brought one of the most controversial and fraught power couples in American history back to light, and that is the relationship between President Bill Clinton and Secretary Hillary Clinton, who have been partners in the public eye ever since 1976, just a year after they were married. The couple has straddled the line between American politics’ dated expectations of marriage and the starry-eyed ideals of ‘70s liberalism—both externally and internally; just as the media has struggled with the Clintons’ non-traditional gender roles, infidelity and otherwise seamless professional partnership, the marriage appears to have suffered some bad days as a result of the couple’s various scandals.

I’ve written before about how television has pondered and anticipated the rise of Hillary Clinton’s presidency by telling the narratives of overlooked or betrayed political wives and theorizing how each might find her own separate peace. But beyond just imagining a female president, as I argued then, much of Hollywood’s storytelling about Hillary Clinton is in fact about plumbing the depths of that presumably fraught and very public relationship between President Bill Clinton and the former secretary of state. Much of the speculation stems from the mystery of not-knowing. In the same way that William Shakespeare speculated on how Antony and Cleopatra felt about each other, and Hilary Mantel wrote about Henry VIII’s inner workings, Hollywood’s politically oriented writers speculate on the private lives of Bill and Hillary Clinton. For example, in Shonda Rhimes’ “Scandal,” Fitzgerald and Melanie Grant (Tony Goldwyn and Bellamy Young) are plagued by the same thwarted ambition and infidelity that appears to have characterized Bill Clinton’s White House years; in “The Good Wife,” Alicia Florrick (Julianna Margulies) struggles to reconcile her feminism, ambition and intelligence with her role as her husband’s wife-appendage, the woman who stands behind him during speeches.

But there is no rendition of the Clinton presidency that is more pointedly about Bill and Hillary than Netflix’s “House of Cards,” the fourth season of which debuted on Friday. Unlike the characters in nearly every other political drama, the leads of “House of Cards” never really change, or grow, or evolve. It’s politics; the Underwoods have just one drive, and that is to amass power. The Netflix show—its first original scripted drama, and a successful one, judging by the audience conversation around it—has never exactly been good, but its cynical, chilly, dispassionate view of American politics and human relationships feels knowing, at the very least, about the truly craven depths of the human heart.

It didn’t quite start out that way. “House of Cards,” like its British forebear, styled its murderous and ambitious leads Frank (Kevin Spacey) and Claire Underwood (Robin Wright) on “Macbeth,” in which Lady Macbeth’s queenly, surrogate ambition for her husband is the only political ambition available to her.
In the original novels, Elizabeth Urquhart (Diane Fletcher, in the miniseries) isn’t a major player in her husband’s schemes until the second and third books, but in the award-winning BBC series, Elizabeth is very involved with her husband’s plots. The first season of Netflix’s “House of Cards” maps closely onto the British version, and in both, it’s the husband’s search for power that is the couple’s priority. (In the U.S. version, though she does in fact run her clean water initiative, Claire is relegated largely to the role of a wife with a pet charity, one that ends up subordinate to her husband’s quest for a cabinet position.)

As the American show has found its own voice, though, it’s begun to focus on different things. “House of Cards” executive producer David Fincher demonstrated how cynically he views marriage with 2014’s “Gone Girl,” his cinematic adaptation of Gillian Flynn’s bestselling novel. With Fincher’s influence, showrunner Beau Willimon incorporates a deep ambiguity about partnership and loyalty with the careful analysis and then breakdown of the Underwood marriage, which becomes the main source of intrigue and conflict throughout seasons two and three. It’s smart; one of the reasons the show is so exhausting to watch is that the Underwoods always win everything, because they are just that good. Pitting them against each other—in small ways at first, but blowing up into big ways by the end of season three—injects a real sense of suspense into the otherwise grim and inevitable proceedings.

The third season ended with Claire walking out on Frank, after a thrilling scene midseason where she told him, at the end of an argument, that she should never have made him president. The fourth season opens with her and Frank maneuvering against each other during the already fraught primary process, as he struggles to stay president after stepping into the office at the end of season two. It is still deeply cynical, but the six episodes I watched have wonderful momentum, a propulsion that the show has lacked since season one.

What comes between the unstoppable Underwoods is simply that Claire is tired of playing second fiddle to Frank. She resents his power, which the world grants him without recognizing her contribution; she resents that she is relegated to the role of first lady, pushed to manage just place settings and flower arrangements, when what she wants is to broker deals and manipulate power. It’s perfectly understandable. A president cannot be elected, in this era of sexual panic, without a spouse, but the spouse is just an appendage, just a seal of approval. Claire has worked with Frank every step of the way, but in the eyes of the world, she has earned no particular power. So when even Frank won’t allow her to run for office, she becomes his worst enemy, the queen on the chessboard entrapping the king, and running for office anyway in the meantime.

Season four uses more fantasy and dream sequences than the show has previously, and those images are very much in the tradition of Fincher’s earlier work. In one recurring motif, Frank and Claire fight with each other as if possessed. Claire pulls out a knife and rams it into his gut. Frank wraps his hands around her neck and slams her into a mirror, shattering glass. “Gone Girl” explored how marriage made someone the perfect enemy and the perfect lover; “House of Cards” season four does similar work, making the first several episodes of the season a tango of attempted annihilation.
Claire’s quest for power exposes so many little hypocrisies about not just the role of women in the world, but more specifically the expected role of the political wife: how she is held responsible for her husband’s policies yet given no credit for them; how her independence is interpreted as his weakness; how she can only strengthen him, while he can only detract from her. Claire is no feminist hero—she is Nietzschean will-to-power made flesh, with no interest in anyone’s cause except her own—but her ascendancy to the heights of power looks like feminism, because her hurdles are so much more blatantly sexist.

It’s impossible not to think of the Clintons when watching the Underwoods. Both couples are seamlessly professional in public; Frank, like Bill, comes from a working-class Southern background, while Claire, like Hillary, had a more patrician upbringing. Both are partners in work and in life, often when they should, by all rights, completely hate each other. And both are couples where male privilege meets female resentment; the men rocketed to success, while the women have had to carefully pick their way through political minefields. At least in one area, Hillary Clinton is entirely unique: She is, so far, the only first lady to go from living in the White House to running for the White House. Her marriage was a source of public amusement and scorn for years, but she decided to stay in it anyway, and to direct it back toward political office. The third season of “House of Cards” displays some of that same unconventional political decision-making; Frank appoints Claire his ambassador to Russia. In this upcoming season, Claire’s going to get even more ambitious in demanding her fair share of the Underwood empire.

It is captivating. I’m not willing to say that “House of Cards" season four is the best yet, but watching Claire and Frank Underwood fight with each other is what that show, and Spacey and Wright in the main roles, were made to do. There’s an element of lifting the curtain to glimpse the worlds of other couples in power, which storytellers have loved to do from Shakespeare to “Scandal.” But it’s also a horrific kind of captivating. The private lives of these public figures who have worked so hard to attain so much power have an enormous effect on the rest of us. Henry VIII broke with the Catholic Church to marry Anne Boleyn, which changed the belief system of an entire country overnight. Antony betrayed Rome to ally with his lover Cleopatra, and both asked scores of soldiers to die on their behalf. It’s not just marital discord when nations and billions are on the line. The marriages are just like ours, but the stakes are anything but. And though America doesn’t have quite as many salacious stories of royal couples as older countries with monarchies do, “House of Cards” is an unsettling reminder that we can still have decades-long power struggles between the king and queen, with all the rest of us used as pawns.

Published on March 06, 2016 11:00

“It’s shameless financial strip-mining”: Les Leopold explains how the 1 percent killed the middle class

While the fate of the presidential campaign that talks about the issue more than any other remains uncertain, this much is clear: Despite the general public's mounting anxiety and awareness, the economic inequality that's done so much to change American society over the past 40 years has not abated. It may, in fact, be getting worse. For this reason alone, "Runaway Inequality: An Activist's Guide to Economic Justice," the new book from Labor Institute executive director and president Les Leopold, would be worth reading. Thankfully, however, the book has many virtues besides its timeliness. And more than most of the other high-profile books on inequality in recent years, "Runaway Inequality" doesn't just explain where the U.S. economy went wrong; it also explains how American citizens can organize to get it back on track. Recently, Salon spoke with Leopold over the phone about the book and economic inequality in general. Our conversation has been edited for clarity and length.

There are a lot of books about inequality out there now, especially in the past five or so years. What does your book bring to the conversation that was otherwise lacking?

I think there were three things that I thought would differ from the ongoing conversation. The first one was that runaway inequality was accelerating. It isn't just there, it's growing. The fact that 95 percent of all the new income in the current so-called recovery is going to the top 1 percent is indicative of what's happening. I don't think that's ever happened before in American economic history, as far as I can find. There's no recovery at the bottom, it just keeps going to the top.

The second one, which I think is even more important, was that I saw runaway inequality as a core issue that linked so many diverse issues. I think it's kind of funny when someone says, "Well, Bernie Sanders is just interested in inequality or Wall Street. It's just one issue." I see it quite differently. I see it as the issue that connects so many other issues, and that leads to the third reason. I thought that connective tissue could be the basis for building an analysis that could help foster a broad-based progressive-populist movement. If people could see that their issue silos were actually connected to inequality, it could build bridges amongst various progressive groups that have gotten siloed. Much of the last generation's worth of progressive action has been within an issue category, be it identity politics or education or housing or environment or so on. There has been a fracturing of what could be a more coherent movement, and I thought "Runaway Inequality," with its focus on Wall Street and the financialization of the economy, could provide that connective tissue. I didn't see that anywhere else.

How does your analysis differ from some of the other recent work on inequality?

The slant of "Runaway Inequality" is different from Piketty and others. There tends to be a story that goes something like this: "American workers kind of got lost in the global shuffle. They don't have the skills that the more elite people have and we don't need the manual labor, et cetera. It's kind of a skill problem, a mismatch between skills and jobs." I just don't think that's true. I think, in fact, the deregulation of the financial system is the driving force of runaway inequality. And I think the way to build a coherent, broad-based populist movement is to focus on runaway inequality and Wall Street. That's what I'm hoping to contribute to.
Why is it that so much of the recovery has gone to those at the top?

That's the question that takes us to the core analysis. In the late '70s, roughly, a new economic philosophy really caught hold in both political parties. It originally came from the right, from Milton Friedman and the free marketeers. Academics call it neoliberalism; in the book, we call it the "Better Business Climate." It was basically a simple model: cut taxes, cut regulations, cut back social spending so people will be more eager to find work and less dependent on the government, and undermine the power of labor unions so the economy would run more on market principles, with fewer inefficiencies. There would be more investment and profits, and therefore all boats would rise. It would lead to a kind of boom economy. That was the theory. I was in graduate school when that was going on, and it was pretty strong; even more liberal economists were sort of giving up on Keynesianism and going in this direction.

What they didn't teach us and what they never discussed is that it's one thing to deregulate trucking or airlines or telecommunications, but it's quite another thing to deregulate the financial sector. When they started deregulating the financial sector, it put in motion something that we refer to as "financial strip mining." It's an incredible, insidious process. It started with a lot of corporate raids - we now call them hedge funds, takeovers, private equity companies - financiers who use a little bit of their own money, borrow a huge amount of money, and start buying up companies. In the deregulated atmosphere they bought up thousands of them over time. The debt that was accumulated to do that was basically put on the company. It's a little bit like buying a car with a loan, except instead of you paying back the loan, the car pays back the loan. That's what they were doing.

How did this practice change the way those companies were run?

They changed the way the CEOs were paid, so that the CEO acted on behalf of the Wall Street investors. This was really powerful. In 1980, 95 percent of CEOs' pay was salary and bonuses, and five percent was stock incentives. Today, it's virtually reversed: about 85 to 95 percent is stock incentives, and only five percent is salaries and bonuses. So the price of the stock is all that matters to the CEO, and of course that's all that matters to the investors - the hedge funds, the private equity companies. They want to see the stock go up. It's a huge change in corporate culture.

Now the CEO cares only about raising the stock. What's the best way to do that? In workshops, we ask working people and community activists this question, and they start talking about, "Well, you've got to create a better product, you want to get more market share" - all of the things you would think would lead in that direction. In fact, they did something else. There was a rule change in 1982, under Reagan. A guy who was the former head of E.F. Hutton became head of the Securities and Exchange Commission, and he changed the rule about companies buying back their own shares. Before 1982, it was virtually illegal to do that, because it was considered stock manipulation. When a company buys back its own shares, it reduces the number of shares outstanding, and therefore every remaining share is worth a little bit more. If you do this, all things being equal, you're going to boost the share price by manipulating it. The free market's not doing it; you're doing it.
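The arithmetic behind that manipulation is simple enough to sketch. Here is a minimal, hypothetical illustration - the company, share counts and dollar figures below are invented for the example, not taken from the interview - of how a buyback lifts per-share earnings without any change in the underlying business:

```python
# Hypothetical illustration: a buyback raises earnings per share (EPS)
# without any change in the company's actual profits. All figures are
# invented round numbers.

profits = 1_000_000_000        # annual profit, unchanged throughout ($1B)
shares_outstanding = 500_000_000
share_price = 20.00            # pre-buyback market price

eps_before = profits / shares_outstanding   # $2.00 per share

# The company spends $2B of its own cash (or borrowed money)
# repurchasing shares at the market price.
buyback_budget = 2_000_000_000
shares_retired = buyback_budget / share_price      # 100M shares retired
shares_after = shares_outstanding - shares_retired # 400M remain

eps_after = profits / shares_after          # $2.50 per share

print(f"EPS before buyback: ${eps_before:.2f}")
print(f"EPS after buyback:  ${eps_after:.2f}")
# Same company, same profits -- EPS jumps 25 percent, and if the market
# keeps valuing the stock at the same price-to-earnings multiple, the
# share price (and executives' stock-based pay) rises with it.
```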
This guy thought, "Well, that's very efficient. Anyway, competition will even all of that out." CEOs and their corporate-raider Wall Street partners are thinking, "Oh, this is fantastic. Let's use the company's money to raise the price of the share, and then we can cash in on our stock incentives. The outside investors can cash in and leave - 'pump and dump.' This is great."

How prevalent have stock buybacks become, and what are the implications of that?

In 1980, about two percent of a company's profits were used for stock buybacks. By 2007, 75 percent of all corporate profits were used to buy back shares. Forget about R&D, forget about workers' wages, forget about all that kind of stuff. All that matters to a CEO today is raising the price of the shares through stock buybacks.

Yesterday, I was at a United Steelworkers meeting and they were very concerned about Carrier moving to Mexico. They're negotiating and they've been making concessions and they still can't get a deal. It's a really bad situation. Donald Trump has actually been talking about it as well. The differences in the negotiations are on the order of $10 million, $20 million, $30 million. So I quickly go to Google and look up United Technologies, which owns Carrier. In October, United Technologies bought back $9 billion of its own shares. So they're strip mining the company, and they're using the worker contracts and the move to Mexico as a way to generate more cash flow so that they can buy back their own shares. This financial strip mining is phenomenal.

The net result is that in 1970 the ratio between a top-100 CEO's pay and an average worker's was 45-to-1. Which is a lot, if you think about it this way: if an average worker could afford one car, the CEO could afford 45 cars. Or one home versus 45 homes, or one home that's 45 times the size of an average worker's home. We just crunched the numbers again for 2014: it's 844-to-1. You can't even conceive of how big that gap is, and it's a direct result of financial strip mining. That's what leads to that acceleration. There's nothing to stop it now. This is just what they do. When they run out of cash flow to buy back their own shares, they go deeper into debt. They'll go to the debt market, try to get more money and then turn around and buy back their own shares. This has an incredible effect on virtually every other issue.

How so?

I can give you one example that really galls me, but it says a lot and shows how many different issues are connected. The Obama administration bailed out the auto industry, and it's great that they did. The industry was going under due to the Wall Street crash - there was no other reason at the time. It was a financial crunch that was taking General Motors under. One of the key negotiators of that deal for the Obama administration left and went to a hedge fund. GM, because it's doing better now, built up a cash cushion. I think all of the American people, at the very least, hoped that when GM built up its cash reserves it would do what needed to be done, which is build the best, highest-quality, most efficient cars it possibly could for future generations. This is what we all needed. I think that was the hope. Well, this guy goes to a hedge fund, takes a position, buys a bunch of shares of GM. And what does the fund do? It demands that instead of that cash going to R&D, it goes to the investors through stock buybacks. And about three weeks ago, GM also announced a $9 billion stock buyback plan.
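To make the scale of those ratios concrete, here is a back-of-the-envelope sketch. Only the 45-to-1 and 844-to-1 ratios come from the interview; the worker wage is an invented placeholder:

```python
# Back-of-the-envelope sketch of the CEO-to-worker pay ratios quoted
# above. The worker wage is a hypothetical placeholder; only the ratios
# (45-to-1 in 1970, 844-to-1 in 2014) come from the interview.

worker_pay = 40_000          # hypothetical average annual wage

ratio_1970 = 45
ratio_2014 = 844

ceo_pay_1970 = worker_pay * ratio_1970   # $1.8M at 1970's ratio
ceo_pay_2014 = worker_pay * ratio_2014   # $33.76M at 2014's ratio

print(f"CEO pay at 45-to-1:  ${ceo_pay_1970:,}")
print(f"CEO pay at 844-to-1: ${ceo_pay_2014:,}")

# Another way to feel the gap: how many working days (of 250 per year)
# the CEO needs to match the worker's whole year.
print(f"Days to earn a worker's year, 1970: {250 / ratio_1970:.1f}")  # ~5.6 days
print(f"Days to earn a worker's year, 2014: {250 / ratio_2014:.1f}")  # ~0.3 days
```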
It's shameless financial strip-mining. It does nothing whatsoever for society, but it undermines other goals. What the book then does is show how this process has a huge impact on the public sector. This whole Better Business Climate has a direct connection to the rise of the prison population. So we show how issue after issue is deeply connected to this process of financial strip mining and runaway inequality.

Capitalism has never been particularly warm and fuzzy toward workers, but there was a time before this Better Business Climate concept when businesses were seen as part of communities and were perceived as having obligations to society. They weren't just doing financial strip mining - even if it made economic sense, hypothetically.

That's a very good point. Our story kind of starts there. I personally think this is a transformation of capitalism. Capitalism was still capitalism from World War II to the late 1970s, but productivity - output per worker hour - has risen every year except five from 1947 to today. The line just goes up at a 45-degree angle. Average worker wages, adjusted for inflation, also rose from 1947 to around 1977 - every single year. When we were in grad school, they taught us this was the iron law: that corporations needed to do this. In other words, being more community-minded was part of what made a corporation a corporation, and supply and demand led them, if they wanted to keep that productivity going, to pay their workers reasonably well. The philosophy at that point was basically "retain and reinvest." CEOs viewed their stakeholders as labor, the community and their shareholders. It wasn't that shareholders were somehow in there for the share price over everything else.

Once this Better Business Climate model hit, you look at these same two lines and they just split apart. Worker wages actually go down in terms of real buying power. The gap between the two lines today is so large that if worker wages had stayed on that productivity line - the iron law they taught us, which of course was repealed as soon as I graduated - the average weekly wage would be double what it is today. That's how big the gap has become. Something really big changed.

Where else can we see evidence of that change?

You can see the change most clearly if you look at financial sector wages and non-financial sector wages. From 1947 to 1980, the two lines go up together. There was no premium for working on Wall Street. You could work for Chase Manhattan Bank or Manufacturers Hanover or whatever, or General Motors or Ford, and given your general level of skill, education, experience and so on, you'd earn about the same. There was no premium. And then the lines split apart again. Financial sector wages go through the roof after deregulation takes hold, so you get a different kind of capitalism. Piketty, I think, doesn't really emphasize that. Very few people have really stressed what a huge change that is - to have the financial sector draw so much money to itself. By 2006, 40 percent of all corporate profits were going to the financial sector. They only have five percent of the workforce, but they've got 40 percent of the profits. The wealth being strip-mined was collecting there, flowing from everybody else to them. I think that's different from the Robber Baron era, when the industrialists were getting a lot of money but there was a rising standard of living for everybody else. That has stopped.
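The "two lines" claim earlier in that answer - that wages tracking productivity would be roughly double today's - is a compounding effect, and a small sketch shows its shape. The growth rate and time span here are invented round numbers chosen to illustrate the arithmetic, not measured data from the book:

```python
# Hypothetical illustration of the "two lines" claim: if real wages had
# kept tracking productivity after the late 1970s, today's average wage
# would be roughly double. The growth rate and span are assumed round
# numbers for the sketch, not measured data.

productivity_growth = 0.02   # assumed ~2% per year from 1977 onward
years = 36                   # roughly 1977 to the early 2010s

wage_1977 = 1.0              # index the 1977 real wage to 1.0

# Actual path (stylized): real wages stagnate.
wage_actual = wage_1977

# Counterfactual path: wages keep rising with productivity.
wage_tracking = wage_1977 * (1 + productivity_growth) ** years

print(f"Stagnant real wage index:         {wage_actual:.2f}")
print(f"Productivity-tracking wage index: {wage_tracking:.2f}")
print(f"Ratio (counterfactual / actual):  {wage_tracking / wage_actual:.2f}")
# ~2.04: a steady 2 percent annual gap compounds to a doubling in about
# 36 years, which is the shape of the claim that the average weekly
# wage "would be double what it is today."
```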
The average American worker knows about inequality, but they may also wonder why it should matter to them so long as their own standard of living is improving. How do you reach people who think inequality is more of a problem in the abstract than in their daily lives?

Basically there are two competing narratives: "Who cares about inequality if everybody is doing better in America?" and the narrative that's in the book, which is, "It's happening at your expense." We have a couple of chapters that do nothing else but compare the United States to other countries around the world, indicator after indicator after indicator, and just show how far we've fallen. We do lead the world in inequality, military spending, and the number and percentage of prisoners. Of all the developed countries, we are second-to-last in child poverty; Romania is the only country behind us. There was a nice study ranking seven countries by upward mobility - the odds that you would end up in an income bracket higher than your parents' - and it turned out that in the United States the odds were about 50-50. We were at the bottom of the list. Number one was Denmark, where the odds were seven-to-one that you would do better than your parents. So even upward mobility, our most cherished dream, is slipping. It goes on and on. Education: in terms of what we pay our teachers, it's low. In the amount of money we invest in 3- and 4-year-old education, we're near the bottom of the list. On an indicator like longevity, we've actually shown a decline in comparison to other countries. So there's something really going on. Of course, if you're in the top one percent, this is the greatest country on earth, I'm sure, because you live in your own world. You have your gated community, your private schools, virtually a private healthcare system. You pay very little tax because your money is offshore, and so on. So there's this breaking apart of America. American exceptionalism, the American dream, is sort of collapsing.

How does this financial strip mining impact the public sector?

Two things are very important to realize about this financialization of corporations. The first is that interest payments on debt are tax deductible, just like the interest on the mortgage on someone's home. There was virtually no corporate debt up until about 1980; corporations used their own resources to fund what they needed. Then, through this corporate raiding process, debt became the rage, and now there's something like $12 trillion in corporate debt. And as corporate debt goes up, tax payments go down. For example, corporate tax contributions to state and local government have fallen virtually in half since 1980, which puts more pressure on individual tax receipts. The second is that the wealthy have moved so much money offshore that their taxes have also gone down. So we now have, when it comes to state and local government, a perfectly regressive tax system. The top one percent pay about half the tax rate of the bottom 20 percent. As you go down the brackets, the tax rate actually goes up and up - the actual percentage that people pay gets higher and higher. And that decline in overall revenue leads to deficits and calls to cut safety net programs. That leads to a fiscal crisis. You have this great squeeze on the public sector, because you've got workers who haven't had a raise in a generation in terms of real buying power, and you've got the wealthy not paying their share.
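The debt-and-taxes mechanism in that answer is also easy to sketch. In this hypothetical example, the income figures, debt load, interest rate, and flat tax rate are all invented for illustration - the only point taken from the interview is that deductible interest shrinks taxable income:

```python
# Hypothetical illustration of why corporate debt lowers tax payments:
# interest is deductible, so a leveraged company reports less taxable
# income than an identical debt-free one. All figures, including the
# flat tax rate, are invented for the example.

operating_income = 100_000_000   # same underlying business ($100M)
tax_rate = 0.35                  # assumed flat corporate rate

# Debt-free company: the full operating income is taxable.
tax_unlevered = operating_income * tax_rate

# Leveraged company: $1B of LBO debt at 6% interest, deducted pre-tax.
debt = 1_000_000_000
interest = debt * 0.06                        # $60M interest expense
taxable_income = operating_income - interest  # $40M left to tax
tax_levered = taxable_income * tax_rate

print(f"Tax with no debt:  ${tax_unlevered:,.0f}")  # $35,000,000
print(f"Tax with LBO debt: ${tax_levered:,.0f}")    # $14,000,000
# Same business, 60 percent less tax -- the mechanism behind "as
# corporate debt goes up, tax payments go down."
```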
We estimate there's something like $21 trillion offshore, and we're losing at least $150 billion in tax revenues on money that should be taxed but isn't. Now you have corporate inversions, which are putting more downward pressure on corporate taxes. So the people in the middle who are paying most of the taxes are tapped out. They're easy prey for a politician who says, "We want to cut the public sector. Teachers are getting paid too much; we can't afford their pensions."

Why are programs like education such an easy target for spending cuts?

We've got a new philosophy that comes with the Better Business Climate, and it's total individual responsibility - the idea of a Great Society or a New Deal taking care of people goes out the window. If you want a job, go find it. If you're poor, it's your fault. If you want to go to school, take out a loan. The idea of the government providing anything is greatly diminished. Ask yourself when was the last time a politician said, "I want to create more public sector jobs so we can create jobs for inner-city youth, where unemployment is high." I don't think there's a politician in the country who would directly say that anymore, yet that was common in the '60s and early '70s, because we knew there was a crisis of employment in rural places and inner cities. There were lots of experiments to figure out how to get employment into those areas, and the public sector grew. Public employment was the track of upward mobility for African Americans, especially African American women - their ticket into the middle class. Since then, the number of government jobs as a percentage of the U.S. population has gone down as the Better Business Climate model took hold. As runaway inequality accelerated, it put more and more pressure on the public sector, and we basically gave up on the idea of a poverty program.

Published on March 06, 2016 10:02

These are the 20 hardest working cities in America

AlterNet - Americans work a lot. Despite a certain failed GOP presidential contender's suggestion that we should be putting in more hours at the office, the numbers show we're already logging plenty. A CNN Money roundup of stats on work habits shows Americans work more than people in other rich countries; that the average working week is actually 47 hours, not 40; and that nearly 40 percent of workers rack up more than 50 working hours each week. Yet our culturally ingrained notions allow Americans to be constantly asked to work more, without complaint - a system that seems to be holding up nicely. An ABCNews.com poll from 2015 found just 26 percent of Americans say they work too hard.

So who's toiling the hardest among U.S. denizens? The GOP would likely state that those in the heartland are pulling all the weight while coastal elites read the New Yorker, listen to NPR and sip lattes. But WalletHub's look at the labor forces in the 116 largest American cities finds the hardest workers are scattered around the country. The assessments are based on a number of factors, including, but not limited to, average workweek hours, commute time and the number of workers holding multiple jobs. I suspect the nature of the work being done, a factor that doesn't seem to be included, might skew these findings. However subjective work intensity might be, it seems objectively true that eight hours a day of coal-mining or fruit-picking is probably more taxing, in certain critical ways, than eight hours of restaurant reviewing. Not to take potshots or anything.

Have a look for yourself at how the top 20 breaks down. To see the full list of 116, visit WalletHub.

1. Anchorage, AK
2. Virginia Beach, VA
3. Plano, TX
4. Sioux Falls, SD
5. Irving, TX
6. Scottsdale, AZ
7. San Francisco, CA
8. Cheyenne, WY
9. Washington, DC
10. Charlotte, NC
11. Gilbert, AZ
12. Corpus Christi, TX
13. Denver, CO
14. Billings, MT
15. Chandler, AZ
16. Jersey City, NJ
17. Chesapeake, VA
18. Garland, TX
19. Oklahoma City, OK
20. Houston, TX

(h/t WalletHub)

Published on March 06, 2016 10:00

Former first lady Nancy Reagan dies at 94

Former first lady Nancy Reagan has died at 94 in Bel-Air, California. Assistant Allison Borio says Mrs. Reagan died Sunday at her home of congestive heart failure. Her marriage to Ronald Reagan lasted 52 years until his death in 2004. A former actress, she was Reagan's closest adviser and fierce protector on his journey from actor to governor of California to president of the United States. She rushed to his side after he was shot in 1981 by a would-be assassin, and later endured his nearly decade-long battle with Alzheimer's disease. In recent years she broke with fellow Republicans in backing stem cell research as a way to possibly find a cure for Alzheimer's.

Published on March 06, 2016 09:14

March 5, 2016

Cruz, Trump win two states each on Super Saturday; Sanders triumphs in Nebraska and Kansas to stay alive against Clinton

In a split decision, Ted Cruz and Donald Trump each captured two victories in Saturday's four-state round of voting, fresh evidence that there's no quick end in sight to the fractious GOP race for president. On the Democratic side, Bernie Sanders notched wins in Nebraska and Kansas, while front-runner Hillary Clinton snagged Louisiana, another divided verdict from the American people.

Cruz claimed Kansas and Maine, and declared it "a manifestation of a real shift in momentum." Trump, still the front-runner in the hunt for delegates, bagged Louisiana and Kentucky. Despite strong support from the GOP establishment, Florida Sen. Marco Rubio had another disappointing night, raising serious questions about his viability in the race.

Cruz, a tea party favorite, said the results should send a loud message that the GOP contest for the nomination is far from over, and that the status quo is in trouble.

"The scream you hear, the howl that comes from Washington D.C., is utter terror at what we the people are doing together," he declared during a rally in Idaho, which votes in three days.

With the GOP race in chaos, establishment figures frantically are looking for any way to derail Trump, perhaps at a contested convention if no candidate can get enough delegates to lock up the nomination in advance. Party leaders - including 2012 nominee Mitt Romney and 2008 nominee Sen. John McCain - are fearful a Trump victory would lead to a disastrous November election, with losses up and down the GOP ticket.

"Everyone's trying to figure out how to stop Trump," the billionaire marveled at an afternoon rally in Orlando, Florida, where he had supporters raise their hands and swear to vote for him.

Trump prevailed in the home state of Senate Majority Leader Mitch McConnell.

Rubio, who finished no better than third anywhere and has only one win so far, insisted the upcoming schedule of primaries is "better for us," and renewed his vow to win his home state of Florida, claiming all 99 delegates there on March 15.

But Cruz suggested it was time for Rubio and Ohio Gov. John Kasich to go.

"As long as the field remains divided, it gives Donald an advantage," he said.

Campaigning in Detroit, Clinton said she was thrilled to add to her delegate count and expected to do well in Michigan's primary on Tuesday.

"No matter who wins this Democratic nomination," she said, "I have not the slightest doubt that on our worst day we will be infinitely better than the Republicans on their best day."

Tara Evans, a 52-year-old quilt maker from Bellevue, Nebraska, said she was caucusing for Clinton, and happy to know that the former first lady could bring her husband back to the White House.

"I like Bernie, but I think Hillary had the best chance of winning," she said.

Sanders won by solid margins in Nebraska and Kansas, giving him seven victories so far in the nominating season, compared to 11 for Clinton, who still maintains a commanding lead in competition for delegates.

Sanders, in an interview with The Associated Press, pointed to his wide margins of victory and called it evidence that his political revolution is coming to pass.

Stressing the importance of voter turnout, he said, "when large numbers of people come - working people, young people who have not been involved in the political process - we will do well and I think that is bearing out tonight."

With Republican front-runner Trump yet to win states by the margins he'll need in order to secure the nomination before the GOP convention, every one of the 155 GOP delegates at stake on Saturday was worth fighting for.

Count Wichita's Barb Berry among those who propelled Cruz to victory in Kansas, where GOP officials reported extremely high turnout. Overall, Cruz has won seven states so far, to 12 for Trump.

"I believe that he is a true fighter for conservatives," said Berry, a 67-year-old retired AT&T manager. As for Trump, Berry said, "he is a little too narcissistic."

Like Rubio, Kasich has pinned his hopes on the winner-take-all contest March 15 in his home state.

Clinton picked up at least 51 delegates to Sanders' 45 in Saturday's contests, with delegates yet to be allocated.

Overall, Clinton had at least 1,117 delegates to Sanders' 477, including superdelegates - members of Congress, governors and party officials who can support the candidate of their choice. It takes 2,383 delegates to win the Democratic nomination.

Cruz will collect at least 36 delegates for winning the Republican caucuses in Kansas and Maine, Trump at least 18, Rubio at least six and Kasich three.

In the overall race for GOP delegates, Trump led with at least 347 and Cruz had at least 267. Rubio had 116 delegates and Kasich had 28.

It takes 1,237 delegates to win the Republican nomination for president.

In a split decision, Ted Cruz and Donald Trump each captured two victories in Saturday's four-state round of voting, fresh evidence that there's no quick end in sight to the fractious GOP race for president. On the Democratic side, Bernie Sanders notched wins in Nebraska and Kansas, while front-runner Hillary Clinton snagged Louisiana, another divided verdict from the American people.

Cruz claimed Kansas and Maine, and declared it "a manifestation of a real shift in momentum." Trump, still the front-runner in the hunt for delegates, bagged Louisiana and Kentucky. Despite strong support from the GOP establishment, Florida Sen. Marco Rubio had another disappointing night, raising serious questions about his viability in the race.

Cruz, a tea party favorite, said the results should send a loud message that the GOP contest for the nomination is far from over, and that the status quo is in trouble.

"The scream you hear, the howl that comes from Washington D.C., is utter terror at what we the people are doing together," he declared during a rally in Idaho, which votes in three days.

With the GOP race in chaos, establishment figures frantically are looking for any way to derail Trump, perhaps at a contested convention if no candidate can get enough delegates to lock up the nomination in advance. Party leaders - including 2012 nominee Mitt Romney and 2008 nominee Sen. John McCain - are fearful a Trump victory would lead to a disastrous November election, with losses up and down the GOP ticket.

"Everyone's trying to figure out how to stop Trump," the billionaire marveled at an afternoon rally in Orlando, Florida, where he had supporters raise their hands and swear to vote for him.

Trump prevailed in the home state of Senate Majority Leader Mitch McConnell.

Rubio, who finished no better than third anywhere and has only one win so far, insisted the upcoming schedule of primaries is "better for us," and renewed his vow to win his home state of Florida, claiming all 99 delegates there on March 15.

But Cruz suggested it was time for Rubio and Ohio Gov. John Kasich to go.

"As long as the field remains divided, it gives Donald an advantage," he said.

Campaigning in Detroit, Clinton said she was thrilled to add to her delegate count and expected to do well in Michigan's primary on Tuesday.

"No matter who wins this Democratic nomination," she said, "I have not the slightest doubt that on our worst day we will be infinitely better than the Republicans on their best day."

Tara Evans, a 52-year-old quilt maker from Bellevue, Nebraska, said she was caucusing for Clinton, and happy to know that the former first lady could bring her husband back to the White House.

"I like Bernie, but I think Hillary had the best chance of winning," she said.

Sanders won by solid margins in Nebraska and Kansas, giving him seven victories so far in the nominating season, compared to 11 for Clinton, who still maintains a commanding lead in competition for delegates.

Sanders, in an interview with The Associated Press, pointed to his wide margins of victory and called it evidence that his political revolution is coming to pass.

Stressing the important of voter turnout, he said, "when large numbers of people come - working people, young people who have not been involved in the political process - we will do well and I think that is bearing out tonight."

With Republican front-runner Trump yet to win states by the margins he'll need in order to secure the nomination before the GOP convention, every one of the 155 GOP delegates at stake on Saturday was worth fighting for.

Count Wichita's Barb Berry among those who propelled Cruz to victory in Kansas, where GOP officials reported extremely high turnout. Overall, Cruz has won seven states so far, to 12 for Trump.

"I believe that he is a true fighter for conservatives," said Berry, a 67-year-old retired AT&T manager. As for Trump, Berry said, "he is a little too narcissistic."

Like Rubio, Kasich has pinned his hopes on the winner-take-all contest March 15 in his home state.

Clinton picked up at least 51 delegates to Sanders' 45 in Saturday's contests, with delegates yet to be allocated.

Overall, Clinton had at least 1,117 delegates to Sanders' 477, including superdelegates - members of Congress, governors and party officials who can support the candidate of their choice. It takes 2,383 delegates to win the Democratic nomination.

Cruz will collect at least 36 delegates for winning the Republican caucuses in Kansas and Maine, Trump at least 18 and Rubio at least six and Kasich three.

In the overall race for GOP delegates, Trump led with at least 347 and Cruz had at least 267. Rubio had 116 delegates and Kasich had 28.

It takes 1,237 delegates to win the Republican nomination for president.

In a split decision, Ted Cruz and Donald Trump each captured two victories in Saturday's four-state round of voting, fresh evidence that there's no quick end in sight to the fractious GOP race for president. On the Democratic side, Bernie Sanders notched wins in Nebraska and Kansas, while front-runner Hillary Clinton snagged Louisiana, another divided verdict from the American people.

Cruz claimed Kansas and Maine, and declared it "a manifestation of a real shift in momentum." Trump, still the front-runner in the hunt for delegates, bagged Louisiana and Kentucky. Despite strong support from the GOP establishment, Florida Sen. Marco Rubio had another disappointing night, raising serious questions about his viability in the race.

Cruz, a tea party favorite, said the results should send a loud message that the GOP contest for the nomination is far from over, and that the status quo is in trouble.

"The scream you hear, the howl that comes from Washington D.C., is utter terror at what we the people are doing together," he declared during a rally in Idaho, which votes in three days.

With the GOP race in chaos, establishment figures frantically are looking for any way to derail Trump, perhaps at a contested convention if no candidate can get enough delegates to lock up the nomination in advance. Party leaders - including 2012 nominee Mitt Romney and 2008 nominee Sen. John McCain - are fearful a Trump victory would lead to a disastrous November election, with losses up and down the GOP ticket.

"Everyone's trying to figure out how to stop Trump," the billionaire marveled at an afternoon rally in Orlando, Florida, where he had supporters raise their hands and swear to vote for him.

Trump prevailed in the home state of Senate Majority Leader Mitch McConnell.

Rubio, who finished no better than third anywhere and has only one win so far, insisted the upcoming schedule of primaries is "better for us," and renewed his vow to win his home state of Florida, claiming all 99 delegates there on March 15.

But Cruz suggested it was time for Rubio and Ohio Gov. John Kasich to go.

"As long as the field remains divided, it gives Donald an advantage," he said.

Campaigning in Detroit, Clinton said she was thrilled to add to her delegate count and expected to do well in Michigan's primary on Tuesday.

"No matter who wins this Democratic nomination," she said, "I have not the slightest doubt that on our worst day we will be infinitely better than the Republicans on their best day."

Tara Evans, a 52-year-old quilt maker from Bellevue, Nebraska, said she was caucusing for Clinton, and happy to know that the former first lady could bring her husband back to the White House.

"I like Bernie, but I think Hillary had the best chance of winning," she said.

Sanders won by solid margins in Nebraska and Kansas, giving him seven victories so far in the nominating season, compared to 11 for Clinton, who still maintains a commanding lead in competition for delegates.

Sanders, in an interview with The Associated Press, pointed to his wide margins of victory and called it evidence that his political revolution is coming to pass.

Stressing the important of voter turnout, he said, "when large numbers of people come - working people, young people who have not been involved in the political process - we will do well and I think that is bearing out tonight."

With Republican front-runner Trump yet to win states by the margins he'll need in order to secure the nomination before the GOP convention, every one of the 155 GOP delegates at stake on Saturday was worth fighting for.

Count Wichita's Barb Berry among those who propelled Cruz to victory in Kansas, where GOP officials reported extremely high turnout. Overall, Cruz has won seven states so far, to 12 for Trump.

"I believe that he is a true fighter for conservatives," said Berry, a 67-year-old retired AT&T manager. As for Trump, Berry said, "he is a little too narcissistic."

Like Rubio, Kasich has pinned his hopes on the winner-take-all contest March 15 in his home state.

Clinton picked up at least 51 delegates to Sanders' 45 in Saturday's contests, with delegates yet to be allocated.

Overall, Clinton had at least 1,117 delegates to Sanders' 477, including superdelegates - members of Congress, governors and party officials who can support the candidate of their choice. It takes 2,383 delegates to win the Democratic nomination.

Cruz will collect at least 36 delegates for winning the Republican caucuses in Kansas and Maine, Trump at least 18 and Rubio at least six and Kasich three.

In the overall race for GOP delegates, Trump led with at least 347 and Cruz had at least 267. Rubio had 116 delegates and Kasich had 28.

It takes 1,237 delegates to win the Republican nomination for president.

In a split decision, Ted Cruz and Donald Trump each captured two victories in Saturday's four-state round of voting, fresh evidence that there's no quick end in sight to the fractious GOP race for president. On the Democratic side, Bernie Sanders notched wins in Nebraska and Kansas, while front-runner Hillary Clinton snagged Louisiana, another divided verdict from the American people.

Cruz claimed Kansas and Maine, and declared it "a manifestation of a real shift in momentum." Trump, still the front-runner in the hunt for delegates, bagged Louisiana and Kentucky. Despite strong support from the GOP establishment, Florida Sen. Marco Rubio had another disappointing night, raising serious questions about his viability in the race.

Cruz, a tea party favorite, said the results should send a loud message that the GOP contest for the nomination is far from over, and that the status quo is in trouble.

"The scream you hear, the howl that comes from Washington D.C., is utter terror at what we the people are doing together," he declared during a rally in Idaho, which votes in three days.

With the GOP race in chaos, establishment figures are frantically looking for any way to derail Trump, perhaps at a contested convention if no candidate can get enough delegates to lock up the nomination in advance. Party leaders - including 2012 nominee Mitt Romney and 2008 nominee Sen. John McCain - fear a Trump victory would lead to a disastrous November election, with losses up and down the GOP ticket.

"Everyone's trying to figure out how to stop Trump," the billionaire marveled at an afternoon rally in Orlando, Florida, where he had supporters raise their hands and swear to vote for him.

Trump prevailed in the home state of Senate Majority Leader Mitch McConnell.

Rubio, who finished no better than third anywhere and has only one win so far, insisted the upcoming schedule of primaries is "better for us," and renewed his vow to win his home state of Florida and claim all 99 delegates there on March 15.

But Cruz suggested it was time for Rubio and Ohio Gov. John Kasich to go.

"As long as the field remains divided, it gives Donald an advantage," he said.

Campaigning in Detroit, Clinton said she was thrilled to add to her delegate count and expected to do well in Michigan's primary on Tuesday.

"No matter who wins this Democratic nomination," she said, "I have not the slightest doubt that on our worst day we will be infinitely better than the Republicans on their best day."

Tara Evans, a 52-year-old quilt maker from Bellevue, Nebraska, said she was caucusing for Clinton, and happy to know that the former first lady could bring her husband back to the White House.

"I like Bernie, but I think Hillary had the best chance of winning," she said.

Sanders won by solid margins in Nebraska and Kansas, giving him seven victories so far in the nominating season, compared to 11 for Clinton, who still maintains a commanding lead in competition for delegates.

Sanders, in an interview with The Associated Press, pointed to his wide margins of victory and called them evidence that his political revolution is coming to pass.

Stressing the importance of voter turnout, he said, "when large numbers of people come - working people, young people who have not been involved in the political process - we will do well and I think that is bearing out tonight."

With Republican front-runner Trump yet to win states by the margins he'll need in order to secure the nomination before the GOP convention, every one of the 155 GOP delegates at stake on Saturday was worth fighting for.

Count Wichita's Barb Berry among those who propelled Cruz to victory in Kansas, where GOP officials reported extremely high turnout. Overall, Cruz has won seven states so far, to 12 for Trump.

"I believe that he is a true fighter for conservatives," said Berry, a 67-year-old retired AT&T manager. As for Trump, Berry said, "he is a little too narcissistic."

Like Rubio, Kasich has pinned his hopes on the winner-take-all contest March 15 in his home state.

Clinton picked up at least 51 delegates to Sanders' 45 in Saturday's contests, with some delegates yet to be allocated.

Overall, Clinton had at least 1,117 delegates to Sanders' 477, including superdelegates - members of Congress, governors and party officials who can support the candidate of their choice. It takes 2,383 delegates to win the Democratic nomination.

Cruz will collect at least 36 delegates for winning the Republican caucuses in Kansas and Maine; Trump will take at least 18, Rubio at least six and Kasich three.

In the overall race for GOP delegates, Trump led with at least 347 and Cruz had at least 267. Rubio had 116 delegates and Kasich had 28.

It takes 1,237 delegates to win the Republican nomination for president.

Published on March 05, 2016 20:01

We’re not meant to do this alone: American individualism is destroying our families

Like many Americans, I'm raising my children far from family. My father is only an hour away, but the rest of us are spread wide across America: my mother in Texas; my sister in Oklahoma; my brother in New York City; my in-laws in Buffalo. Babysitters and after-school programs and summer camps are the village that helps me get the business of life done, and while having more family around would certainly help with the grind, what I miss most is simply time spent with them. The spontaneity of coffee with my mom, how fun it would be for the kids to see a movie with their cousins, enjoying a family barbecue on the weekend.

Nine years ago my husband got a job offer in San Francisco, and without a second thought we left New York City. We loaded the U-Haul covered wagon and did what Americans have been doing since Europeans came to this continent: we said goodbye to loved ones and headed west into the great unknown, forging a future for ourselves alone. Today, it is an American rite of passage to leave your family for college, leave your college for a job, and so on and so on, until opportunities abound but you need Sprint's Unlimited Plan to feel connected to blood. America's modern Manifest Destiny is no longer about physically expanding the boundaries of the continent; it is about self-expansionism.

If you remember high school history at all, Manifest Destiny was the mid-19th-century American belief that settlers were destined to expand throughout the continent, a belief characterized by the virtues of the American people, their mission to remake the West and their destiny to fulfill this duty.

What I find so astounding is that Manifest Destiny is not history at all. It is alive and well, a continued belief pervasive in families everywhere. If John Winthrop's "City Upon a Hill" sermon in 1630 called for this young nation to be an example to the Old World, then it only makes sense that almost 400 years later we look to ourselves to be an example to our parents, to take what they gave us and be even better, to remake our past and achieve individual success. We believe it to be our destiny, just as that job offer in San Francisco was my husband's destiny. It was an essential duty we needed to accomplish for our family, but after almost a decade, one house, two kids, starting a new company, and facing a glowing future, I wonder if there wasn't a way we could have done it differently. To somehow have circled the family wagons, keeping out the savage solitude of this brave new suburban frontier together.

Ironically, it was just this circle of family togetherness from which I was trying to escape as a young adult. Savage solitude was exactly what I was looking for, especially if it would help me write poetry like Sylvia Plath or Anne Sexton. Fleeing what I felt to be the suffocation of Texas and a pattern of the expected -- staying in-state with familiar faces and family close by -- I went to college in upstate New York. I wanted to prove to myself that I could succeed in a wild survival experiment of rigorous academics, mountains of snow and thousands of students. I did not choose a small, intimate, familial institution. I chose a university with almost 20,000 undergrad and grad students. Usually, students create a new “family” in college, a broad social safety net of really good friends, but I did not. Family was overrated. My parents had recently combusted in a hellish fireball of divorce; my sister escaped to college, and my brother was left to survive middle school with a shitload of shrapnel. I had a couple of great friends, but mainly I turned to bad therapists and carbs.

Four years and a diploma later (individual success!), I was ready to continue my self-expansion and conquer New York City. If there’s one place in today’s America that represents the wild, untamed West of the 19th century, this is it -- not necessarily cowboys and Indians, but rather the naked cowboy in Times Square and an Indian cab driver. It was the next step in my survival experiment -- not just to live and work in New York City, but like Sinatra sang, to really make it there. No way in hell would I suffer a forced relocation in a trail of tears (and credit card debt) back to Texas and my family.

Because at this point, I still couldn’t comprehend the full value of family. My work in a big publishing house gave me a paycheck, but it was also fun and vibrant and socially fulfilling. Outside of work, I valued my solitude and sought out connection when I needed it. I called my mom every Sunday, I loved visiting family, but did I miss them? Not really.

No. I couldn’t comprehend the value of family until I had my own, eight years later and 2,905 miles away in San Francisco. I thought I was so prepared to have a child, but raising a baby is perhaps the greatest exploration of all boundaries. The true frontier -- and my destiny, as I saw it. When my husband’s paternity leave ended after two weeks, I sat there on that couch, feeding and burping the baby and not moving until my husband came home 11 hours later. I lived alone for almost a decade, but I never actually felt alone until I had children. And this, for many, is the stay-at-home mother’s plight. Especially the stay-at-home mother who has no family nearby.

America’s cultural glorification of individualism and freedom does not prepare women for the intense need for family after giving birth. We prepare our babies with the softest swaddling cloths, organic diapers and the perfect nursery, but we are not encouraged to anticipate our own needs, especially that of simple connection with others. I equated my own crushing loneliness, my dependency on my husband and phone calls with my mother -- or any adult who listened kindly, for that matter -- with weakness. Like any good fool with Finnish blood, I stoically buckled under exhaustion, isolation and the anxiety of being a new mother, by myself. I’m ashamed to say that for the first 10 months of my son’s life, I did nothing for myself: no exercise; no dinners out with my husband; no time to myself; no sleep. Why? Because individual success, man! This was the continent I'd timed my ovulation cycle to conquer!

The false assumption that I could parent alone is not just mine. It is societal. In Sebastian Junger’s Vanity Fair piece titled “PTSD: The War Disorder that Goes Far Beyond the Battlefield,” he talks about the extreme isolationism in America and how soldiers suffer when they come home because they have lost their community. I have never been so profoundly affected by, or so able to relate to, a piece of writing. Why do people become firemen? Policemen? Join the Coast Guard? There are a lot of reasons, but the simple answer is that being together makes people happy. Combine that with sacrifice for the survival of the group and you get oxytocin. It’s a brain reward system uniquely connected to our evolution. For the rest of us schmucks following the Simon & Garfunkel “I Am a Rock” philosophy, Junger says, “personal gain almost completely eclipses collective good.”

All I know is that in the trenches of motherhood, I don’t want to battle alone. And what a shock it was for me to discover that.

My boys are now ages 7 and 4, and my loneliness is both more manageable and more painful. The kids are in school, I stay active, I have lots of friends, but we are all spinning in different orbits: different carpools, different extracurriculars, different schools, endless errands, endless driving. With no family and only hard-fought playdates or drinks together, the isolation is profound. I miss my friends, who are right next door or down the street, and with each passing year, I miss my family, the missing limb whose phantom pain only increases.

The vast array of choices that make us and our children more successful, more educated, more athletic, is like being at a Thanksgiving table that’s too long: here I am with a goddamn cornucopia of awesomeness, but I can’t see anyone! Yell once if you’re behind the “Guide to 5,000 Essential Summer Camps For Kids”! When I complain, my mom regales me with stories of living on the University of Petroleum and Minerals compound in Saudi Arabia: nothing to do but get together with her 10 neighbors, who all had infants the same age as me, for playdates, family meals and a nanny-share system that gave each woman some alone time. She says it was one of the happiest times in her life.

In my town, just north of San Francisco, transplants from all over America greatly outnumber the native San Franciscans. Missing our families is a common complaint. On the mother’s club forum, we ponder leaving the great jobs and amazing weather for the comfort of a close-knit family, weighing the pros and cons in lengthy debates. We have achieved personal and professional success; we are exceptional employees and exceptional parents. A few hundred years ago, the heart of American exceptionalism was, of course, that we were different from other nations. We were free from the historical forces that shaped other countries, but today, all we are is exceptionally lonely, the Isolated States of America. We are untethered by historical forces all right, free from mom's hugs, dad's homemade chili, and the pillars of extended family.

Talking to my Hispanic dental hygienist over a garbled spew of spearmint tooth polish about my lack of a blood-related village, I learned that "exceptional" was not the word she used to describe the way white America lives; it was "strange." She explained, "We all live close together and watch each other's kids and cook for each other. I mean, it's crazy but we're together, you know?" Yes, I know. This is true of all my family's international friends, from those in France who live within a 15-mile radius of each other, to those in Saudi Arabia, who live in a family compound by the dozens.

Families are together in the Middle East, Latin America, Europe, Africa and Asia -- all over the world, except in America, where the premium is not placed on proximity. It's as if Americans must always be Lewis and Clark on a brave embarkation, and if we're not, we are provincial, frightened and uneducated. Unlike our ancestors, young people today are not concerned with America's place in the world. Instead, we ask ourselves, "What is my place in the world?"

We grow up with the belief that self-expansionism -- high school, college, career -- means pushing boundaries toward accomplishment and away from family. So off we go and hey, there's always FaceTime!

What Americans fail to recognize about global family patterns is that children, should they have the means and ability, are encouraged to leave the nest, to seek education outside their homeland, perhaps even to find a life partner, and then -- then! -- to return to their extended family. Living a life near family does not mean sacrificing a life of exploration, travel and learning. Whereas Americans have perfected the art of the rocket, projecting themselves on lonely journeys, the rest of the world practices the boomerang, recognizing the value of leaving and returning.

Manifest Destiny was a contested concept in the 1800s because people didn't feel it reflected the national spirit, and I don't think its modern equivalent is doing us any favors either. The gain of personal enfranchisement doesn't seem to justify a detached form of living.

I don't need to study the research on how a lack of community affects the individual. I am that research. Now if only I could figure out how to turn this rocket around.

Published on March 05, 2016 16:30

The devil’s bargain: Washington is full of people who have prospered thanks to 9/11 — and I was one of them

About this time every year I realize that the Sept. 11 attacks changed my life, and the lives of millions of other Americans. For the better. There. I admitted it. Strange to say the words so baldly, but the truth is the truth.

Four years after the attacks, I wrote a novel called "The Faithful Spy," about a CIA officer who had infiltrated al-Qaida but couldn’t stop them. It became a best-seller. I’ve now written 10, all focused on espionage and terrorism. They come out each February. I never know what to feel about this devil’s bargain. I didn’t choose it, but I didn’t walk away either. And though the story sounds like an outlier, whenever I visit Washington, as I did last week to promote my new book, I am reminded that it isn’t – and of its real cost.

The national capital is filled with people who have prospered thanks to Sept. 11, whether they admit that reality to themselves or not. They are coders at the National Security Agency, case officers at the Central Intelligence Agency, war planners in the Pentagon, satellite engineers at the National Reconnaissance Office, and the endless dragoons of consultants at Boeing and a thousand other companies. They do their jobs in heavily guarded office campuses with names like Liberty Crossing. They work hard and mean well, mostly. And I’ve grown convinced they are a large part of the reason that ordinary Americans feel so alienated from the government that is supposed to be theirs.

Almost 1 million people now have top-secret level security clearances. More than 3,000 companies and agencies work on counterterror, homeland security and intelligence programs. But despite Edward Snowden and other leakers, we know little about what all those people actually do, how much money they make, or what liberties they take with ours. The veil of secrecy is too thick, the criminal penalties for talking too severe.

Meanwhile, the people who run the agencies – and the politicians who oversee them – still will not discuss in plain English whom they target, how they keep themselves from making mistakes, and what happens if they do. To say nothing of the broad outlines of their capabilities, or the size of their budgets and their workforces. No one needs to know exactly how many drones the United States has. But shouldn’t the government acknowledge when they strike? How long can we fight this war, and how many people can we kill, without admitting what we’ve done?

At the same time, the perpetual secret war is corroding the relationship between government and governed. The symbolism of the perpetually expanding security core around the White House can’t be ignored. Downtown Washington reminds me more of central Moscow every time I see it. The monuments and memorials are pleasant distractions for tourists. But the buildings where the real work is done might as well have force fields around them.

I am old enough to remember walking through the White House with my parents. We didn’t donate a million dollars. We got in line, walked through a metal detector, and saw the public spaces in what actually seemed to be the people’s house. That term seems worse than ironic now, with Secret Service officers in black uniforms and Kevlar vests keeping anyone even from reaching the sidewalk on Pennsylvania Avenue. Somewhere the Secret Service has calculated the lethal radius of a 10-kiloton nuclear bomb, figured the odds that terrorists could smuggle in nerve gas components. Those fears are real.

But the pendulum has swung much too far in the direction of secrecy and self-preservation. It feels sometimes as though the federal security apparatus, from top to bottom, exists mainly to protect itself and to provide jobs for its members. Even asking the right questions is next to impossible when all the relevant facts are classified. A government that keeps so many secrets is hard to trust, easy to fear. And in this meanest of political seasons, the fear many ordinary Americans feel is palpable.

The attacks happened. We can’t go back. I wouldn’t unwrite my books, even if I could. But all of us – especially the people in the secret world who owe their careers or fortunes to Sept. 11 – need to find a way forward. Before our security strangles us.

Alex Berenson, a former reporter for the New York Times, writes the John Wells series of spy novels. Putnam published the most recent, "The Wolves," earlier this month.

Published on March 05, 2016 16:30