Helen H. Moore's Blog, page 910

December 28, 2015

Behind the Ronald Reagan myth: “No one had ever entered the White House so grossly ill informed”

No one had ever entered the White House so grossly ill informed. At presidential news conferences, especially in his first year, Ronald Reagan embarrassed himself. On one occasion, asked why he advocated putting missiles in vulnerable places, he responded, his face registering bewilderment, “I don’t know but what maybe you haven’t gotten into the area that I’m going to turn over to the secretary of defense.” Frequently, he knew nothing about events that had been headlined in the morning newspaper. In 1984, when asked a question he should have fielded easily, Reagan looked befuddled, and his wife had to step in to rescue him. “Doing everything we can,” she whispered. “Doing everything we can,” the president echoed. To be sure, his detractors sometimes exaggerated his ignorance. The publication of his radio addresses of the 1950s revealed a considerable command of facts, though in a narrow range. But nothing suggested profundity. “You could walk through Ronald Reagan’s deepest thoughts,” a California legislator said, “and not get your ankles wet.” In all fields of public affairs—from diplomacy to the economy—the president stunned Washington policymakers by how little basic information he commanded. His mind, said the well-disposed Peggy Noonan, was “barren terrain.” Speaking of one far-ranging discussion on the MX missile, the Indiana congressman Lee Hamilton, an authority on national defense, reported, “Reagan’s only contribution throughout the entire hour and a half was to interrupt somewhere at midpoint to tell us he’d watched a movie the night before, and he gave us the plot from War Games.” The president “cut ribbons and made speeches. He did these things beautifully,” Congressman Jim Wright of Texas acknowledged. “But he never knew frijoles from pralines about the substantive facts of issues.” Some thought him to be not only ignorant but, in the word of a former CIA director, “stupid.” Clark Clifford called the president an “amiable dunce,” and the usually restrained columnist David Broder wrote, “The task of watering the arid desert between Reagan’s ears is a challenging one for his aides.” No Democratic adversary would ever constitute as great a peril to the president’s political future, his advisers concluded, as Reagan did himself. Therefore, they protected him by severely restricting situations where he might blurt out a fantasy. His staff, one study reported, wrapped him “in excelsior,” while “keeping the press at shouting distance or beyond.” In his first year as president, he held only six news conferences—fewest ever in the modern era. Aides also prepared scores of cue cards, so that he would know how to greet visitors and respond to interviewers. His secretary of the treasury and later chief of staff said of the president: “Every moment of every public appearance was scheduled, every word scripted, every place where Reagan was expected to stand was chalked with toe marks.” Those manipulations, he added, seemed customary to Reagan, for “he had been learning his lines, composing his facial expressions, hitting his toe marks for half a century.” Each night, before turning in, he took comfort in a shooting schedule for the next day’s television-focused events that was laid out for him at his bedside, just as it had been in Hollywood. His White House staff found it difficult, often impossible, to get him to stir himself to follow even this rudimentary routine. When he was expected to read briefing papers, he lazed on a couch watching old movies.
On the day before a summit meeting with world leaders about the future of the economy, he was given a briefing book. The next morning, his chief of staff asked him why he had not even opened it. “Well, Jim,” the president explained, “The Sound of Music was on last night.” “Reagan,” his principal biographer, Lou Cannon, has written, “may have been the one president in the history of the republic who saw his election as a chance to get some rest.” (He spent nearly a full year of his tenure not in the White House but at his Rancho del Cielo in the hills above Santa Barbara.) Cabinet officials had to accommodate themselves to Reagan’s slumbering during discussions of pressing issues, and on a multination European trip, he nodded off so often at meetings with heads of state, among them French president François Mitterrand, that reporters, borrowing the title of a film noir, designated the journey “The Big Sleep.” He even dozed during a televised audience at the Vatican while the pope was speaking to him. A satirist lampooned Reagan by transmuting Dolly Parton’s “Workin’ 9 to 5” into “Workin’ 9 to 10,” and TV’s Johnny Carson quipped, “There are only two reasons you wake President Reagan: World War III and if Hellcats of the Navy is on the Late Show.” Reagan tossed off criticism of his napping on the job with drollery. He told the White House press corps, “I am concerned about what is happening in government—and it’s caused me many a sleepless afternoon,” and he jested that posterity would place a marker on his chair in the Cabinet Room: “Reagan Slept Here.” His team devised ingenious ways to get him to pay attention. Aware that he was obsessed with movies, his national security adviser had the CIA put together a film on world leaders the president was scheduled to encounter. His defense secretary stooped lower. He got Reagan to sign off on production of the MX missile by showing him a cartoon. Once again, the president made a joke of his lack of involvement: “It’s true that hard work never killed anybody, but why take a chance?” Cannon, who had observed him closely for years and with considerable admiration, took his lapses more seriously. “Seen either in military or economic terms,” he concluded, “the nation paid a high price for a president who skimped on preparation, avoided complexities and news conferences and depended far too heavily on anecdotes, charts, graphics and cartoons.” Subordinates also found Reagan to be an exasperatingly disengaged administrator. “Trying to forge policy,” said George Shultz, his longest-serving secretary of state, was “like walking through a swamp.” Donald Regan recalled: “In the four years that I served as secretary of the treasury, I never saw President Reagan alone and never discussed economic philosophy....I had to figure these things out like any other American, by studying his speeches and reading the newspapers. . . . After I accepted the job, he simply hung up and vanished.” One of his national security advisers, General Colin Powell, recalled that “the President’s passive management style placed a tremendous burden on us,” and another national security adviser, Frank Carlucci, observed: “The Great Communicator wasn’t always the greatest communicator in the private sessions; you didn’t always get clean and crisp decisions. You assumed a lot. . . . You had to.” Numbers of observers contended that Reagan conducted himself not as a ruler but as a ceremonial monarch.
In the midst of heated exchanges, a diplomat noted, Reagan behaved like a “remote sort of king . . . just not there.” After taking in the president’s performance during a discussion of the budget in 1981, one of his top aides remarked that Reagan looked like “a king . . . who had assembled his subalterns to listen to what they had to say and to preside, sort of,” and another said, “He made decisions like an ancient king or a Turkish pasha, passively letting his subjects serve him, selecting only those morsels of public policy that were especially tasty. Rarely did he ask searching questions and demand to know why someone had or had not done something.” As a consequence, a Republican senator went so far as to say: “With Ronald Reagan, no one is there. The sad fact is that we don’t have a president.” Instead of designating one person as his top aide, as Eisenhower had with Sherman Adams, Reagan set up a “troika”: James A. Baker III as chief of staff, Edwin Meese as counselor, and Michael Deaver as deputy chief of staff in charge of public relations—an arrangement that, for a time, left other appointees perplexed. The Reagan White House, said his first secretary of state, Alexander M. Haig Jr., was “as mysterious as a ghost ship; you heard the creak of the rigging and the groan of the timbers and sometimes even glimpsed the crew on deck. But which of the crew had the helm? Was it Meese, was it Baker, was it someone else? It was impossible to know for sure.” Similarly, Peggy Noonan ruminated: “Who’s in charge here? I could never understand where power was in that White House; it kept moving. I’d see men in suits huddled in a hall twenty paces from the Oval Office, and I’d think, there it is, that’s where they’re making the decisions. But the next day they were gone and the hall was empty.” The first lady made her own contribution to the diffusion of authority. No one of his appointees, not even his chief of staff, exercised so much power. The New York Times, discussing Nancy Reagan, even wrote of an “Associate Presidency.” She understood her husband’s limitations and did all she could to make sure that he was well served. Their son Michael said, “Dad looks at half a glass of water and says: Look at this! It’s half full! Nancy is always trying to figure out: Who stole the other half from my husband?” She sometimes influenced Reagan’s policies, notably when she pushed for arms control, and she was thought to have been responsible for the removal of two cabinet officials and of the president’s latter-day chief of staff. During his tenure, she dismissed accounts of her impact, but in her memoir, she acknowledged: “For eight years I was sleeping with the president, and if that doesn’t give you special access, I don’t know what does.” Reagan’s staff found especially exasperating the need to clear the president’s schedule with a first lady who placed so much reliance upon a West Coast astrologer, Joan Quigley. That had been true since the beginning in Sacramento when Reagan was inaugurated as governor at midnight because, it was reported, that was the hour this woman set after perusing the zodiac. On a number of occasions, Deaver would spend days working out an intricate itinerary for the president’s travels down to the last detail only to be told that he had to scrap everything because the astrologer had determined that the stars were not properly aligned. Horoscopes fixed the day and hour of such major events as presidential debates and summit meetings with Soviet leaders. 
The president’s most important aide said, “We were paralyzed by this craziness.” In these unpropitious circumstances, the troika managed much better than anticipated. Public administration theorists likened this three-headed makeshift to the mock definition of a camel: a horse put together by a committee. But Baker proved to be a highly effective chief of staff and Deaver a masterful maestro of staged events. Secretary Haig later remarked, “You couldn’t serve in this administration without knowing that Reagan was a cipher and that these men were running the government.” That judgment, however, failed to credit Reagan’s perspicacity. In setting up his team, he succeeded in taking to Washington two men who had served him faithfully in Sacramento—Meese and Deaver—while acknowledging that, since they and he had no experience inside the Beltway, he needed to salt his inner corps with a veteran of the Ford era. In choosing Baker, moreover, Reagan, stereotyped as a rigid ideologue, showed unexpected flexibility. Baker, a moderate, had been floor manager for Ford’s effort to deny Reagan the 1976 presidential nomination, and in 1980 he had run George Bush’s campaign against Governor Reagan. From the start of his political career, commentators, especially liberals, had been underestimating Reagan. When he announced that he was planning to run for governor of California, he encountered ridicule. At a time when Robert Cummings was a prominent film star, the Hollywood mogul Jack Warner responded, “No, Bob Cummings for governor, Ronald Reagan as his best friend.” Yet Reagan easily defeated the former mayor of San Francisco to win the Republican nomination, then stunned Democrats by prevailing over the incumbent governor, Pat Brown, by nearly a million votes. Furthermore, he went on to gain reelection to a second term. Reagan’s performance in Sacramento surprised both adversaries and followers. While continuing to proclaim his undying hostility to government intervention, he stepped up taxes on banks and corporations, increased benefits to welfare recipients, more than doubled funds for higher education, and safeguarded “wild and scenic rivers” from exploitation. A vocal advocate of “the right to life,” he nevertheless signed a bill in 1967 that resulted in a rise in legal abortions in the state from 518 in that year to nearly 100,000 in 1980. He was able to forge agreements with Democrats in the capital because he had the advantage, as a veteran of Screen Actors Guild battles, of being an experienced negotiator. (In later years, he said of his haggling with Mikhail Gorbachev: “It was easier than dealing with Jack Warner.”) His chief Democratic opponent in the legislature, who started out viewing Reagan with contempt, wound up concluding that he had been a pretty good governor, “better than Pat Brown, miles and planets and universes better than Jerry Brown”—the two most conspicuous Democratic leaders of the period. Scrutiny of his record, however, also raised disquieting features. Months after he took office as governor, a reporter asked him about his priorities. Disconcerted, Reagan turned toward an assistant and said, “I could take some coaching from the sidelines, if anyone can recall my legislative program.” Expected to decide between conflicting views on the abortion issue, “Reagan,” Cannon noted, “behaved as if lost at sea.” His aides often found it difficult to get him to concentrate. 
On one occasion, in the midst of a vital discussion about the budget, he wandered off: “Do you know how hard it is to mispronounce ‘psychiatric’ once you know how to pronounce it right? I had to do it in Kings Row and at first I couldn’t do it.” He especially alarmed members of his staff by flying into a rage if the press reported that he had changed his position on an issue, even when he undoubtedly had. All of his disabilities—gross misperceptions and knowledge gaps—he carried into the White House. Yet he was to leave office regarded as a consequential president, and a number of scholars were even to write of an “Age of Reagan.”
Excerpted from "The American President: From Teddy Roosevelt to Bill Clinton" by William E. Leuchtenburg. Published by Oxford University Press. Copyright 2016 by William E. Leuchtenburg. Reprinted with permission of the publisher. All rights reserved.
Republicans Were Obsessed With Reagan In 2015

Published on December 28, 2015 14:15

Bristol Palin’s hateful hypocrisy: She’s now a mother of two, but still the biggest child in the family

It must be very strange to be a former highly compensated abstinence spokeswoman who is now the never-married mother of two children by two different fathers. It must take a fair degree of mental reconciling to be closely aligned with a political party that is so consistently and so vehemently disparaging of single mothers — well, at least when they're not the daughters of their own former governors. But Bristol Palin somehow makes it look easy. On Dec. 24, the 25-year-old "Dancing With the Stars" veteran announced she'd given birth the day before to daughter Sailor Grace, and that "my heart just doubled." Saying that "our family couldn’t be more complete," she also shared a photo of her son Tripp — who turned 7 this week — meeting his baby sister. The whole family seems delighted at the new arrival — Palin's mother, Sarah, took time out of her burgeoning comedy career to share a message declaring Sailor "The best gift ever!" and saying, "Thank you, Bristol, for your strength and good heart and your love of life. The most important people in Bristol’s life were there to witness the miracle of Sailor Grace Palin’s arrival last night. Thank you for sharing the miracle with Piper, Marina, and me, Bri! And we thank Todd for taking care of the rest of the family during this most precious beginning of a great new chapter!" Though Bristol's ex-fiancé wasn't included in Sarah Palin's litany of her daughter's "most important people" or given a last name credit, Dakota Meyer seemed to confirm his paternity on Christmas Eve with a tweet saying, "Best Christmas present ever!! I couldn't be more proud of this little blessing." Bristol Palin is likely going to have her hands full for some time now, raising her two young children. But wouldn't it be terrific if, now that she's got a daughter of her own, she could spend a little time to finally get more realistic about the world our girls and women live in? I know it's a long shot. When she announced, back in June, that she was expecting, Palin knew she was in for a heap of accusations of hypocrisy, and she stated firmly, "I do not want any lectures and I do not want any sympathy." She also admitted, "I know this has been, and will be, a huge disappointment to my family, to my close friends, and to many of you….Tripp, this new baby, and I will all be fine, because God is merciful." It was a plea for compassion — but one undermined when she abruptly turned around and lashed out at "these giddy a$$holes" and insisted, "I have never been a paid 'abstinence spokesperson.'" That's a claim that seems to ignore her gig with the Candie's Foundation and her speaker's bureau listing, which put "abstinence" as her primary topic. Palin then soon after wrote a straight up ignorant, facts-challenged blog post about a Washington state birth control program. In August, she unsurprisingly jumped on the anti-Planned Parenthood juggernaut, with a post about "dead babies." And just earlier this month, she listed "Eleven selfish excuses women give to justify ending their pregnancies." Her Instagram, meanwhile, contains jokey retweets imploring poor people, "Instead of burning the American flag, why don't you burn your welfare checks?" How does she have time to pick so many fights and spew so many half truths while raising a family? On Monday morning, she found time to post a video on Facebook mocking Kwanzaa. Talk about leaning in.
In spite of the obstacles she's faced, in spite of the unjust shaming for her private life, Palin seems resolutely determined to demand she not be judged while judging others. She seems fine with ignoring how unsustainable the abstinence she built a tidy side-career preaching is. She is no longer the teen America first met when her mother was on the campaign trail. She is a woman and a mother of two. Maybe now it's time for her to grow up.
Bristol Palin Announces Birth of Her Second Child

Published on December 28, 2015 13:45

“Rahm failed us”: Calls for Rahm Emanuel to resign reach fever pitch as Chicago cops “accidentally” kill black mother of five

Calls for the resignation of Chicago Mayor Rahm Emanuel have swelled after police in the city "accidentally struck and tragically killed" a mother of five over the weekend. The police were responding to a domestic disturbance at an apartment when they opened fire on Saturday morning. Quintonio LeGrier, 19, was also killed in the shooting, along with 55-year-old Bettie Jones, the mother of five. Jones was an activist involved in encouraging peace in the community. LeGrier's father said the young man was a "whiz kid" who was studying electrical engineering technology at Northern Illinois University. Family members and activists, roughly 100 people in all, held a vigil for the victims on Sunday. Those attending stressed that police should use nonlethal methods of policing like stun guns. One activist held a sign that said "stop killing us," while others chanted "This is not an accident!" LeGrier's mother, Janet Cooksey, said at a news conference on Sunday, "I used to watch the news daily and I would grieve for other mothers, other family members, and now today I'm grieving myself." At the news conference, Cooksey was wearing a shirt emblazoned with Mayor Emanuel's face and the words "Rahm Failed Us." The AP reported that one of Jones' friends at the news conference lamented that Chicago police "shoot first and ask questions later." "It's ridiculous," she added. The Chicago Police Department says the shootings are being investigated. The lawyer for the Jones family said police confiscated the hard drive of a home-security camera from across the street that captured the shootings. Emanuel said in a statement, "Anytime an officer uses force the public deserves answers, and regardless of the circumstances, we all grieve anytime there is a loss of life in our city." The killings come at a time when a federal civil rights investigation is being conducted into the Chicago Police Department's practices. The investigation began in November, when, due to FOIA requests from journalists, police dash cam video from Oct. 20, 2014 was released showing white Chicago police officer Jason Van Dyke shooting black 17-year-old Laquan McDonald 16 times. Police had previously said they shot McDonald in self-defense, claiming he had threatened them. The footage, however, shows that the department lied about the killing, and that the young man was in fact feet away from them, walking away at the time he was killed. When the video was made public, activists accused the local government of orchestrating a citywide cover-up. At the vigil, the Chicago-based Reverend Marshall Hatch remarked, "Something is seriously wrong. How, in the middle of all this scrutiny, [can you] have a trigger-happy policeman?" For years, activists have called for the resignation of Emanuel, who has overseen a police department that is rife with controversy and often accused of overly aggressive policies. The killings have reinvigorated calls for his ouster. The Rev. Al Sharpton has joined in the calls for Emanuel's resignation. "You talk about a crisis on steroids," he said Monday on MSNBC's "Morning Joe." "You are in the middle of a recall vote, they are circulating petitions in Chicago to recall you, the state legislature is going to have to deal with it and you don’t even come back? This is the height of either insensitivity or lack of intelligence or arrogance, or a reasonable combination of all three." The Chicago Police Department has a long history of brutality, and has been embroiled in numerous scandals over the past decades.
After years of organized protests and action, activists pressured the Chicago government earlier this year to create a $5.5 million reparations fund for victims of police brutality. Responding to the pressure from civil rights activists, Emanuel admitted that the practices of former Chicago Police Commander Jon Burge, who served from the 1970s to the '90s and was notorious for racist policing and extreme abuse of black residents, were a "stain that cannot be removed from our city's history." Burge and his detectives systematically tortured suspects. They resorted to electric shocks, beatings, suffocation, and even Russian roulette to force suspects to confess to crimes they didn't commit. Burge, who had served in the military before becoming a Chicago police commander, governed much more like a military power than a civilian one. His department's use of torture to procure confessions led former Illinois Gov. George Ryan in 2000 to declare a moratorium on the death penalty. In January 2011, Burge was sentenced to four-and-a-half years in federal prison. He was released early, however, in October 2014. The reparations fund created this year came after the city government had already paid more than $100 million in legal settlements to victims of police brutality. Activists from the Black Lives Matter civil rights movement — which rose to national prominence in Ferguson, Missouri in 2014 after the police killing of unarmed black teen Michael Brown — see the police killings of Jones, LeGrier, and McDonald, and the subsequent government reactions to these shootings, as symptomatic of a larger pattern of racially charged police brutality and systemic racism.
Families Speak Out After Chicago Police Shootings Leave 2 Dead

Published on December 28, 2015 12:35

December 27, 2015

Consumerism is ruining our kids: from $350 kicks to four-figure strollers, we’re making our children into product zombies

Two weeks before my son’s 16th birthday, after shrugging off my suggestions that we do something to celebrate, he emailed me a wish list that included $350 sneakers. The apparently coveted Y-3 Retro Boost looked, to me, exactly like regular black sneakers with white soles, but also included some complicated loop at the heel.

Perhaps I should have been appalled, but I was not. Or rather, I was appalled but not surprised. The tension over what to buy and what not to buy started the Halloween he was 5. I knew Prefab vs. DIY was not a new or original generational battlefield, but I also thought relenting with a store-bought Flash costume would gain me some leverage (i.e., pick your battles). I was 23 when he was born, too young to understand how little control I would actually have as he grew. My son’s father and I were living in rural India when I’d gotten pregnant and returned there after our baby turned 1. I had not had a thousand-dollar stroller or a designer layette. Instead, I had 12 cloth diapers we washed out at the hand pump. Because we’d spent so much of my son’s early years avoiding American consumerism, I was shocked to see how quickly it surfaced once we were back and how unequipped I was for all the judgment aimed at me for what I bought for my son or didn’t.

Which is to say that as he’s grown, the stakes have gotten higher – more expensive and more emotionally loaded – and I still don’t know what the right thing is. Because just avoiding this pair of $350 sneakers doesn’t  solve the problem of trying to raise a self-reliant kid at a time when we’re all bombarded with the capitalist mantra that products are, in fact, essential to our sense of self and self-worth.

My son is a sensitive, observant boy with a kind heart and a sharp sense of humor. He takes good care of his little sister, is respectful to his teachers, and worked hard this summer at his first job, “landscaping” (or picking up trash) at a local park. But, born in 1999, he is also very much a product of his generation, who, in a sweeping generalization that also feels quite specific, seem to exist on this fulcrum between consumerism and technology – using their phones to trawl for products, to buy the products, to post pictures of the products on their phones.

In addition to my angst over increasing sweatshop labor, corporate profits and landfill waste, I lie awake in the middle of the night most anxious about something that happened when my son was about 10: he explained his certainty that an expensive watch would make anyone happy. “If you had that watch, people would want to be around you,” he said with the logic of a child who was hard at work interpreting the world around him. Which is exactly why I know that he can hardly be blamed for emulating the very principles of the wider American culture in which he’s been raised.

In the spirit of full disclosure, I will admit that as a 6-year-old in 1982, I named my tabby cat Gucci after a classmate’s father traveled to Italy and brought her home the purse our second-grade class agreed changed everything. As one of only a handful of lower-middle-class kids at my private, all-girl Catholic school, I got an inadvertent crash course in expensive preppy. It was also the 1980s, when the whole U.S. stopped going to church and started going to the mall instead. GREAT!

But raised on equal parts Catholic guilt and Protestant work ethic, I also grew up ashamed about wanting the things I did. My grandparents grew up during the Depression and 50 years later, thrift was still promoted in my family not only as practical, but as a display of national, even ethical pride. When my grandma explained that she had to forgo both piano and ballet lessons, unlike her older sisters, I interpreted this to mean that self-denial equaled a kind of moral superiority. And even though the girls in my Catholic school evaluated one another’s ESPRIT quotient on a daily basis (uniforms, by the way, do nothing to curb judgment or meanness), our teachers, all nuns, worked tirelessly to bring the realities of the Central American poor into our classroom. Every week, we donated extra milk money to the street children of San Salvador. The way I learned to pray had very little to do with God and everything to do with recognizing the suffering of others.   

While I am grateful to have been raised with this hazy awareness of privilege, which did keep my own mother’s anxiety over the grocery bill somewhat in perspective, other consequences have included a lifetime of tormented indecision. I have a really hard time spending money on myself, even when it comes to socks and winter coats. Meaning that on top of the already fraught combat zone of teen materialism, in trying to figure out what to buy for my son, I’m also trying to get over my own vexed relationship with buying anything ever.

Our family listens to records on a record player, sits on chairs found on the street, and makes calls on a rotary phone. For reasons beyond my parsimonious childhood, I believe in frugality and things that last, in flea markets over Ikea, in resoling my clogs rather than buying new ones. We don’t own a TV, a microwave, or video games. I say these things not with pride or a misguided confidence in their impact, but as context. As testimony that my son has not been raised with a bottomless allowance or spoon-fed luxury.

But in my deepest moments of self-doubt, I worry that upholding my values has actually fueled my son’s determination to reject them or, like any kid experimenting with self-definition or rebellion, to at least look elsewhere for fulfillment or individuality.

When we were in Paris the summer he was 11, my son complained more than once that everything was “old.” Back in India the summer he was 8, he couldn’t get past how “dirty” and “broken” everything was long enough to see anything other than the poverty. In both cases, I couldn’t really argue – Paris is indeed old; much of India is dirty. And yet, I was still surprised to see how deeply ingrained the American ethos of "new equals better" was.

So while I’m ashamed to admit I would even consider $350 sneakers for anyone, especially a 6’1” growing teenager, I also worry that denying him the shoes makes them that much more alluring. It’s a sort of damned if you do, damned if you don’t scenario – force your kid to stay too far outside the mainstream and you might make him a zealot. But let him embrace it and you create another mindless consumer.

In 2004, the American Psychological Association pinpointed the increase in marketing directed at teens as having a profound effect on how kids forged an identity and grappled with the vulnerability of adolescence. Advocating for more empirical research in order to push for federal oversight of commercial marketing aimed at children, many concerned psychologists voiced an urgent anxiety over the ways in which teenagers were persuaded to establish “brand loyalty” from younger and younger ages. Dr. Susan Linn, of the Harvard Medical School, said at the time that “comparing the marketing of today with the marketing of yesteryear is like comparing a BB gun to a smartbomb.” 

Eleven years later, the smartbomb has become a nuclear meltdown, with radioactive fallout landing in places we never would have imagined. When Kylie Jenner turned 18 this summer and received an $11,000 Birkin bag as a gift, 1.3 million people “liked” her Instagram post of the purse (to say nothing of the Ferrari she also received). While none of that would normally register on my own metric as relevant, I know for a fact that my son, who attends (public) high school in lower Manhattan, has spent many a lunch period loitering outside the Trump Soho hotel, staring hard at every black Escalade that pulls up, hoping for a Kardashian sighting. To this, he also feels entitled. Of this, he also imagines something meaningful.

Dr. Ellen Jacobs, a Manhattan-based psychotherapist who often works with families around issues of consumerism, agrees that the teenage demand for designer products has been steadily increasing over the last decade. Like everyone else, she also underscores the role of technology in drastically changing what kids are exposed to and their desire to acquire it. Celebrities and their wealth are “infiltrating our culture,” Jacobs says, and we’re all buying into it. “Everyone has a Louis Vuitton bag these days, whether they have money or not.” Or at least, that’s how it seems, because if you do have a bag, you’re publicizing it on social media.

As for how to respond to this shift as a parent, however, there is no silver bullet. Jacobs encourages parents to empathize with their children while also setting limits, which, she says, need to start well before adolescence. Giving to charities and volunteering can help counterbalance the impulse to spend, too, she points out, as does opening a bank account, to make real the satisfaction and practicality of saving money.

In "The Opposite of Spoiled," New York Times financial columnist Ron Leiber also advises parents to engage their kids in fiscal literacy from a young age and to speak frankly about costs of living. In a section of the book aimed specifically at teens, Leiber describes the “Materialism Intervention,” based on the brainchild of a nonprofit organization called Share Save Spend, which encourages kids and families to think of their money in those three categories. Most interesting to me is a study Lieber cites, conducted on two groups of teens, one that was guided through the share, save, spend model and one that was not. Not surprisingly, the kids who adopted the program showed “better self-esteem” than before the intervention.

I champion all this advice as logical and necessary and actually plan to make some “share,” “save,” and “spend” jars at home. But I also bristle at the claims that individual families are solely responsible for, or even capable of, undoing all the vapid promises of finding contentment in new shoes, promises heaped on our kids from every direction. Precisely because this message is as pervasive as it is destructive, I want to partake in something far more radical than opening a bank account or curbing what the tooth fairy brings. I want collective refusal. I want political attention paid to the adolescent health crisis of the isolation that comes with spending hours a day on a device. I want celebrities who endorse the opposite of lavish as chic or heroic. During WWII, Rita Hayworth looked beautiful and conscientious in photos promoting national scrap metal recycling. Rationing efforts then were patriotic.

I recognize my responsibility to teach my son fiscal responsibility. And to describe the direct impact the global rise in capitalism has had on the destruction of the environment. I know it’s my obligation to help him recognize that shopping doesn’t bring anything more enduring than the urge to keep shopping. But I also wish that we were having these conversations as part of and against the backdrop of much more public and dramatic action, education and debate.

Recently, my son took a class entitled “Economic Doomsday,” which included an overview of the housing crisis. For the first time in his life, he came home wanting to talk about who had what and why. Fortunately, he had a great teacher and, in the midst of SoHo real estate insanity, an up close and personal look at the multimillion-dollar penthouse getting built across the street from his school. But I also suspect that he grappled with the topic as thoughtfully as he did precisely because the lecture didn’t come from me. Which is why I’m still not convinced that refusing my son’s occasional request for big-ticket presents will result in anything other than resentment.

In the end, I compromised, at least with myself. I didn’t buy the sneakers, but I did get him a gift certificate to Barneys for almost as much money. While this solves exactly nothing, I am introducing him to buying on clearance. Hoping he might begin to ask more complicated questions about value and worth the more he’s in charge of his own money and his own wardrobe, I trust the critical conversations around our dinner table will echo back eventually. In the meantime, he’ll dress for his Kardashian sightings, not picking up when I dial his cell from our rotary phone to ask what he’s doing.

Two weeks before my son’s 16th birthday, after shrugging off my suggestions that we do something to celebrate, he emailed me a wish list that included $350 sneakers. The apparently coveted Y-3 Retro Boost looked, to me, exactly like regular black sneakers with white soles, but also included some complicated loop at the heel.

Perhaps I should have been appalled, but I was not. Or rather, I was appalled but not surprised. The tension over what to buy and what not to buy started the Halloween he was 5. I knew Prefab vs. DIY was not a new or original generational battlefield, but I also thought relenting with a store-bought Flash costume would gain me some leverage (i.e., pick your battles). I was 23 when he was born, too young to understand how little control I would actually have as he grew. My son’s father and I were living in rural India when I’d gotten pregnant and returned there after our baby turned 1. I had not had a thousand-dollar stroller or a designer layette. Instead, I had 12 cloth diapers we washed out at the hand pump. Because we’d spent so much of my son’s early years avoiding American consumerism, I was shocked to see how quickly it surfaced once we were back and how unequipped I was for all the judgment aimed at me for what I bought for my son or didn’t.

Which is to say that as he’s grown, the stakes have gotten higher – more expensive and more emotionally loaded – and I still don’t know what the right thing is. Because just avoiding this pair of $350 sneakers doesn’t solve the problem of trying to raise a self-reliant kid at a time when we’re all bombarded with the capitalist mantra that products are, in fact, essential to our sense of self and self-worth.

My son is a sensitive, observant boy with a kind heart and a sharp sense of humor. He takes good care of his little sister, is respectful to his teachers, and worked hard this summer at his first job, “landscaping” (or picking up trash) at a local park. But, born in 1999, he is also very much a product of his generation, whose members, in a sweeping generalization that also feels quite specific, seem to exist on this fulcrum between consumerism and technology – using their phones to trawl for products, to buy the products, to post pictures of the products on their phones.

In addition to my angst over increasing sweatshop labor, corporate profits and landfill waste, I lie awake in the middle of the night most anxious about something that happened when my son was about 10: he explained his certainty that an expensive watch would make anyone happy. “If you had that watch, people would want to be around you,” he said with the logic of a child who was hard at work interpreting the world around him. Which is exactly why I know that he can hardly be blamed for emulating the very principles of the wider American culture in which he’s been raised.

In the spirit of full disclosure, I will admit that as a 6-year-old in 1982, I named my tabby cat Gucci after a classmate’s father traveled to Italy and brought her home the purse our second-grade class agreed changed everything. As one of only a handful of lower-middle-class kids at my private, all-girl Catholic school, I got an inadvertent crash course in expensive preppy. It was also the 1980s, when the whole U.S. stopped going to church and started going to the mall instead. GREAT!

But raised on equal parts Catholic guilt and Protestant work ethic, I also grew up ashamed of wanting the things I did. My grandparents grew up during the Depression, and 50 years later thrift was still promoted in my family not only as practical, but as a display of national, even ethical pride. When my grandma explained that she had to forgo both piano and ballet lessons, unlike her older sisters, I interpreted this to mean that self-denial equaled a kind of moral superiority. And even though the girls in my Catholic school evaluated one another’s ESPRIT quotient on a daily basis (uniforms, by the way, do nothing to curb judgment or meanness), our teachers, all nuns, worked tirelessly to bring the realities of the Central American poor into our classroom. Every week, we donated extra milk money to the street children of San Salvador. The way I learned to pray had very little to do with God and everything to do with recognizing the suffering of others.

While I am grateful to have been raised with this hazy awareness of privilege, which did keep my own mother’s anxiety over the grocery bill somewhat in perspective, other consequences have included a lifetime of tormented indecision. I have a really hard time spending money on myself, even when it comes to socks and winter coats. Meaning that on top of the already fraught combat zone of teen materialism, in trying to figure out what to buy for my son, I’m also trying to get over my own vexed relationship with buying anything ever.

Our family listens to records on a record player, sits on chairs found on the street, and makes calls on a rotary phone. For reasons beyond my parsimonious childhood, I believe in frugality and things that last, in flea markets over Ikea, in resoling my clogs rather than buying new ones. We don’t own a TV, a microwave, or video games. I say these things not with pride or a misguided confidence in their impact, but as context. As testimony that my son has not been raised with a bottomless allowance or spoon-fed luxury.

But in my deepest moments of self-doubt, I worry that upholding my values has actually fueled my son’s determination to reject them or, like any kid experimenting with self-definition or rebellion, to at least look elsewhere for fulfillment or individuality.

When we were in Paris the summer he was 11, my son complained more than once that everything was “old.” Back in India the summer he was 8, he couldn’t get past how “dirty” and “broken” everything was long enough to see anything other than the poverty. In both cases, I couldn’t really argue – Paris is indeed old; much of India is dirty. And yet, I was still surprised to see how deeply ingrained the American ethos of "new equals better" was.

So while I’m ashamed to admit I would even consider $350 sneakers for anyone, especially a 6’1” growing teenager, I also worry that denying him the shoes makes them that much more alluring. It’s a sort of damned-if-you-do, damned-if-you-don’t scenario – force your kid to stay too far outside the mainstream and you might make him a zealot. But let him embrace it and you create another mindless consumer.

In 2004, the American Psychological Association pinpointed the increase in marketing directed at teens as having a profound effect on how kids forged an identity and grappled with the vulnerability of adolescence. Advocating for more empirical research in order to push for federal oversight of commercial marketing aimed at children, many concerned psychologists voiced an urgent anxiety over the ways in which teenagers were persuaded to establish “brand loyalty” from younger and younger ages. Dr. Susan Linn, of the Harvard Medical School, said at the time that “comparing the marketing of today with the marketing of yesteryear is like comparing a BB gun to a smartbomb.” 

Eleven years later, the smartbomb has become a nuclear meltdown, with radioactive fallout landing in places we never would have imagined. When Kylie Jenner turned 18 this summer and received an $11,000 Birkin bag as a gift, 1.3 million people “liked” her Instagram post of the purse (to say nothing of the Ferrari she also received). While none of that would normally register on my own metric as relevant, I know for a fact that my son, who attends (public) high school in lower Manhattan, has spent many a lunch period loitering outside the Trump Soho hotel, staring hard at every black Escalade that pulls up, hoping for a Kardashian sighting. To this, he also feels entitled. Of this, he also imagines something meaningful.

Dr. Ellen Jacobs, a Manhattan-based psychotherapist who often works with families around issues of consumerism, agrees that the teenage demand for designer products has been steadily increasing over the last decade. Like everyone else, she also underscores the role of technology in drastically changing what kids are exposed to and their desire to acquire it. Celebrities and their wealth are “infiltrating our culture,” Jacobs says, and we’re all buying into it. “Everyone has a Louis Vuitton bag these days, whether they have money or not.” Or at least, that’s how it seems, because if you do have a bag, you’re publicizing it on social media.

As for how to respond to this shift as a parent, however, there is no silver bullet. Jacobs encourages parents to empathize with their children while also setting limits, which, she says, need to start well before adolescence. Giving to charities and volunteering can help counterbalance the impulse to spend, too, she points out, as does opening a bank account, to make real the satisfaction and practicality of saving money.

In "The Opposite of Spoiled," New York Times financial columnist Ron Leiber also advises parents to engage their kids in fiscal literacy from a young age and to speak frankly about costs of living. In a section of the book aimed specifically at teens, Leiber describes the “Materialism Intervention,” based on the brainchild of a nonprofit organization called Share Save Spend, which encourages kids and families to think of their money in those three categories. Most interesting to me is a study Lieber cites, conducted on two groups of teens, one that was guided through the share, save, spend model and one that was not. Not surprisingly, the kids who adopted the program showed “better self-esteem” than before the intervention.

I champion all this advice as logical and necessary and actually plan to make some “share,” “save,” and “spend” jars at home. But I also bristle at the claims that individual families are solely responsible for, or even capable of, undoing all the vapid promises of finding contentment in new shoes, promises heaped on our kids from every direction. Precisely because this message is as pervasive as it is destructive, I want to partake in something far more radical than opening a bank account or curbing what the tooth fairy brings. I want collective refusal. I want political attention paid to the adolescent health crisis of the isolation that comes with spending hours a day on a device. I want celebrities who endorse the opposite of lavish as chic or heroic. During WWII, Rita Hayworth looked beautiful and conscientious in photos promoting national scrap metal recycling. Rationing efforts then were patriotic.

I recognize my responsibility to teach my son fiscal responsibility. And to describe the direct impact the global rise of capitalism has had on the destruction of the environment. I know it’s my obligation to help him recognize that shopping doesn’t bring anything more enduring than the urge to keep shopping. But I also wish that we were having these conversations as part of, and against the backdrop of, much more public and dramatic action, education and debate.

Recently, my son took a class entitled “Economic Doomsday,” which included an overview of the housing crisis. For the first time in his life, he came home wanting to talk about who had what and why. Fortunately, he had a great teacher and, in the midst of SoHo real estate insanity, an up close and personal look at the multimillion-dollar penthouse getting built across the street from his school. But I also suspect that he grappled with the topic as thoughtfully as he did precisely because the lecture didn’t come from me. Which is why I’m still not convinced that refusing my son’s occasional request for big-ticket presents will result in anything other than resentment.

In the end, I compromised, at least with myself. I didn’t buy the sneakers, but I did get him a gift certificate to Barney’s for almost as much money. While this solves exactly nothing, I am introducing him to buying on clearance. I hope that the more he’s in charge of his own money and his own wardrobe, the more he’ll begin to ask complicated questions about value and worth, and that the critical conversations around our dinner table will echo back eventually. In the meantime, he’ll dress for his Kardashian sightings, not picking up when I dial his cell from our rotary phone to ask what he’s doing.

Published on December 27, 2015 15:30

Beyond Mike Brady: Finally, stepfathers are getting their pop culture moment

Are stepfathers finally about to have their cultural moment? If so, it’s richly deserved. Let’s be clear, right now isn’t that moment, but changes in the popular consciousness often happen one movie character, one TV relationship or one viral video at a time. The Will Ferrell comedy “Daddy’s Home” (tagline: “It’s Dad vs. Step-Dad”) arrives in theatres Christmas Day. While I can’t yet call it a victory for the millions of guys raising stepchildren — early reviews have not been kind — the movie is the latest in a series of tentative signs that Hollywood may finally be ready to give these men their due on screen. Stepfathers are nothing new, of course. George Washington was one. Jesus had one. But in our popular stories, they’ve held a tenuous place at best. While stepmothers have to deal with their own wicked stereotypes, stepfathers' typical screen depictions range from moron to molester to maniac. Sure, there are exceptions. Liam Neeson is a loving stepfather/widower in “Love, Actually.” Dwayne Johnson brings both heart and pecs to the role of stepdad in “Journey 2: The Mysterious Island.” And of course, there’s the Godfather of stepfathers, Mike Brady, who brought up his bunch of three biological sons along with stepdaughters Cindy, Jan and Marcia, Marcia, Marcia. Worth noting about these fictional stepdads: by and large, they married widows or their wives’ ex-husbands stayed conveniently off-camera. This contrasts with reality for millions of blended real-life families, but not so many on-screen, at least until extremely recently. Case in point: “Ant-Man,” – yes, “Ant-Man.” For much of the movie, Scott/Ant-Man (Paul Rudd) clashes with his daughter’s stepfather, Paxton (Bobby Cannavale). But at the end, Paxton joins him to protect Scott’s daughter Cassie (Abby Ryder Fortson) from – oh, don’t make me explain – the point is, shortly before the credits, Paxton, Scott, Cassie and Scott’s ex-wife Maggie (Judy Greer) have an amiable dinner together. The two men express genuine respect and appreciation for one another. Then: SCOTT: This is awkward. PAXTON: Yeah. MAGGIE: Yeah. CASSIE: Yeah. Then they all coo over a video of Cassie doing cartwheels. I won’t claim to be a scholar of the stepfather in TV and cinema, but I’ve logged my share of screen time and when has that ever happened – where father and stepfather both, in a sense, help each other win? My family never evolved that far. I didn’t meet my biological father until I was 18. (In my two-father family, I call my biological one by his first name – Jimmy – and my stepfather Dad.) The two of them stood face to face for the first time when I was maybe 20, and it was agony. We smiled so hard it was painful, especially after Jimmy made a joke about being taller than Dad. Years later, at my wedding, Dad had his camera and wound up taking a picture of Jimmy and me. My dad’s a decent photographer, but somehow that picture wound up out of focus – and Jimmy’s head was cut off. That’s why the tension between Jay (Ed O’Neill) and Javier (Benjamin Bratt) on “Modern Family” feels so familiar. Jay and Manny (Rico Rodriguez) have what may be the sweetest stepfather-stepson relationship on TV. Their affection for and exasperation with one another feels rich and genuine, and I love the episodes where Manny gets a visit from his father, Javier. Manny adores his dad, despite the fact that Javier is impetuous and unreliable. And Javier and the much older, stodgier Jay grudgingly accept one another’s presence. 
Javier even says at one point, “Maybe it’s a good thing that (Manny) has the two of us.” In one of the show’s signature segments when a character speaks directly into the camera, Jay characterizes their relationship this way, “I don’t owe this guy anything. He stops by a couple of times a year to see his kid. It used to be a relief, give me a nice break. Now, Manny and I, we got our own thing. I know I’m not his dad. Maybe I don’t like to remind him.” “Daddy’s Home,” plays this dynamic for broader laughs. The ubiquitous trailers (the movie had a bigger TV ad-spend than "Star Wars: The Force Awakens") show good-hearted stepdad Brad (Ferrell) getting decidedly out-butched by his motorcycle-riding ab-crunching rival Dusty (Mark Wahlburg). Brad can’t ride or skateboard. He and his wife (Linda Cardellini) are having trouble conceiving and when he tries to punch Dusty, he only hurts himself. Laura Tropp teaches communications and media at Marymount Manhattan College and is co-editor of the new book “Deconstructing Dads: Changing Images of Fathers in Popular Culture.” Tropp says lack of macho isn’t unique in stepfather stories. “You definitely see this notion of emasculation. There’s still this very fundamental association between masculinity and paternity.” Underneath that, perhaps, is the notion that people see stepfathers as caretakers of another man’s child, and Tropp says, “caretaking is traditionally seen as womanly work.” I’m not claiming “Daddy’s Home” is going to be revolutionary, but I think its appearance this year does mean something, particularly given how rare stepfather characters are on screen. According to the Pew Research Center, 16 percent of American children are living in blended families, and that number’s been steady for more than 20 years. And yet, stepfathers are relatively rare on screen, and the social and academic research on them isn’t terribly robust, either. It’s as if we have a kind of cultural blind spot for the stepfather, even though everybody knows one, was raised by one or is one. In fact, I feel strangely disloyal to my Dad, even now, by referring to him as my stepfather, a word we never used. (He did legally adopt me when I was 5.) It’s as if the term, despite its strict accuracy, is somehow diminishing to him and his role in my life. But why? University of Florida sociologist William Marsiglio has written extensively about stepfathers and says that at a conceptual level, adding the prefix step- suggest some lack of authenticity. “Just like we have negative views about hand-me-down clothes and used cars, stepfathers are (often seen as being) less than the ideal.  It’s consistent in many respects with our desire to want the ‘real’ thing—whatever that thing may be.” Hmm. I’ve always hated the term “real dad” or “real father” – what does that mean, anyway? I’m not saying DNA doesn’t matter. For whatever reason, it does. But how much? Ron Deal is a marriage and family therapist and author of “The Smart Stepdad” and other books. He says the comedic competition Brad and Dusty stage for the children’s love is a terrible idea in real life. “It takes a tremendous toll on the well-being of the children,” he says. Plus, in general, kids will choose their biological dad. 
Still, “stepchildren who have a good relationship with both their father and stepfather have better outcomes than children who don’t.” If Brad and Dusty were real people, Deal would tell Dusty, “Give your children permission to like, love or at least get along with their stepfather.” Convince them they don’t have to automatically reject their stepfather out of loyalty to their dad. And for Brad? “Let him know that you’re not trying to replace him, that you’ll always respect his relationship with his children.”

If Deal’s right, many of us could take a lesson from Ohio’s Todd Bachman and Todd Cendrosky. In a September wedding, Bachman was walking his daughter Brittany Peck down the aisle when he paused and extended a hand to Cendrosky, her stepfather, who joined them. The three walked together arm in arm in arm toward the altar and, apparently, into our hearts. The photographer’s Facebook post featuring their pictures and story was viewed more than 70 million times and earned coverage from the New York Times, USA Today and the Today show, among many others. (“Daddy’s Home” should be so lucky.) Cendrosky told a TV reporter, “He grabbed my hand and said, ‘You’ve worked just as hard for this as I have. You deserve this just as much as I do.’” The shot of the two men holding hands, tears forming in stepdad Cendrosky’s eyes, is hard to forget. The woman who took that picture, Delia D. Blackburn, described the moment in terms that ring true for stepfathers and everyone else emerging from various cultural closets and cul-de-sacs into a world where sperm donors, surrogate parents, transgendered parents and gay marriage and adoption have all upended the traditional restrictions of what it could mean to be somebody’s parent or child. “Families,” she wrote, “are what we make them.”

Published on December 27, 2015 14:30

The plutocrats are winning: American democracy is being sold off, piece by piece

Dear Readers: In the fall of 2001, in the aftermath of 9/11, as families grieved and the nation mourned, Washington swarmed with locusts of the human kind: wartime opportunists, lobbyists, lawyers, ex-members of Congress, bagmen for big donors: all of them determined to grab what they could for their corporate clients and rich donors while no one was looking. Across the land, the faces of Americans of every stripe were stained with tears. Here in New York, we still were attending memorial services for our firemen and police. But in the nation’s capital, within sight of a smoldering Pentagon that had been struck by one of the hijacked planes, the predator class was hard at work pursuing private plunder at public expense, gold-diggers in the ashes of tragedy exploiting our fear, sorrow, and loss. What did they want? The usual: tax cuts for the wealthy and big breaks for corporations. They even made an effort to repeal the alternative minimum tax that for fifteen years had prevented companies from taking so many credits and deductions that they owed little if any taxes. And it wasn’t only repeal the mercenaries sought; they wanted those corporations to get back all the minimum tax they had ever been assessed. They sought a special tax break for mighty General Electric, although you would never have heard about it if you were watching GE’s news divisions — NBC News, CNBC, or MSNBC, all made sure to look the other way. They wanted to give coal producers more freedom to pollute, open the Alaskan wilderness to drilling, empower the president to keep trade favors for corporations a secret while enabling many of those same corporations to run roughshod over local communities trying the protect the environment and their citizens’ health. It was a disgusting bipartisan spectacle. With words reminding us of Harry Truman’s description of the GOP as “guardians of privilege,” the Republican majority leader of the House dared to declare that “it wouldn’t be commensurate with the American spirit” to provide unemployment and other benefits to laid-off airline workers. As for post 9/11 Democrats, their national committee used the crisis to call for widening the soft-money loophole in our election laws. America had just endured a sneak attack that killed thousands of our citizens, was about to go to war against terror, and would soon send an invading army to the Middle East. If ever there was a moment for shared sacrifice, for putting patriotism over profits, this was it. But that fall, operating deep within the shadows of Washington’s Beltway, American business and political mercenaries wrapped themselves in red, white and blue and went about ripping off a country in crisis. H.L. Mencken got it right: “Whenever you hear a man speak of his love for his country, it is a sign that he expects to be paid for it.” Fourteen years later, we can see more clearly the implications. After three decades of engineering a winner-take-all economy, and buying the political power to consummate their hold on the wealth created by the system they had rigged in their favor, they were taking the final and irrevocable step of separating themselves permanently from the common course of American life. They would occupy a gated stratosphere far above the madding crowd while their political hirelings below look after their earthly interests. 
The $1.15 trillion spending bill passed by Congress last Friday and quickly signed by President Obama is just the latest triumph in the plutocratic management of politics that has accelerated since 9/11. As Michael Winship and I described here last Thursday, the bill is a bonanza for the donor class – that powerful combine of corporate executives and superrich individuals whose money drives our electoral process. Within minutes of its passage, congressional leaders of both parties and the president rushed to the television cameras to praise each other for a bipartisan bill that they claimed signaled the end of dysfunction; proof that Washington can work. Mainstream media (including public television and radio), especially the networks and cable channels owned and operated by the conglomerates, didn’t stop to ask: “Yes, but work for whom?” Instead, the anchors acted as amplifiers for official spin — repeating the mantra-of-the-hour that while this is not “a perfect bill,” it does a lot of good things. “But for whom? At what price?” went unasked. Now we’re learning. Like the drip-drip-drip of a faucet, over the weekend other provisions in the more than 2000-page bill began to leak. Many of the bad ones we mentioned on Thursday are there — those extended tax breaks for big business, more gratuities to the fossil fuel industry, the provision to forbid the Securities & Exchange Commission from requiring corporations to disclose their political spending, even to their own shareholders. That one’s a slap in the face even to Anthony Kennedy, the justice who wrote the Supreme Court’s majority opinion in Citizens United. He said: “With the advent of the Internet, prompt disclosure of expenditures can provide shareholders and citizens with the information needed to hold corporations and elected officials accountable for their positions.” Over our dead body, Congress declared last Friday, proclaiming instead: Secrecy today. Secrecy tomorrow. Secrecy forever. They are determined that we not know who owns them. The horrors mount. As Eric Lipton and Liz Moyer reported for The New York Times on Sunday, in the last days before the bill’s passage “lobbyists swooped in” to save, at least for now, a loophole worth more than $1 billion to Wall Street investors and the hotel, restaurant and gambling industries. Lobbyists even helped draft crucial language that the Senate Democratic leader Harry Reid furtively inserted into the bill. Lipton and Moyer wrote that, “The small changes, and the enormous windfall they generated, show the power of connected corporate lobbyists to alter a huge bill that is being put together with little time for lawmakers to consider. Throughout the legislation, there were thousands of other add-ons and hard to decipher tax changes.” No surprise to read that “some executives at companies with the most at stake are also big campaign donors.” The Times reports that “the family of David Bonderman, a co-founder of TPG Capital, has donated $1.2 million since 2014 to the Senate Majority PAC, a campaign fund with close ties to Mr. Reid and other Senate Democrats.” Senator Reid, lest we forget, is from Nevada. As he approaches retirement at the end of 2016, perhaps he’s hedging his bets at taxpayer expense. Consider just two other provisions: One, insisted upon by Republican Senator Thad Cochran, directs the Coast Guard to build a $640 million National Security Cutter in Cochran’s home state of Mississippi, a ship that the Coast Guard says it does not need. 
The other: A demand by Maine Republican Senator Susan Collins for an extra $1 billion for a Navy destroyer that probably will be built at her state’s Bath Iron Works – again, a vessel our military says is unnecessary. So it goes: The selling off of the Republic, piece by piece. What was it Mark Twain said? “There is no distinctive native American criminal class except Congress.” Can we at least face the truth? The plutocrats and oligarchs are winning. The vast inequality they are creating is a death sentence for government by consent of the people at large. Did any voter in any district or state in the last Congressional election vote to give that billion dollar loophole to a handful of billionaires? To allow corporations to hide their political contributions? To add $1.4 trillion to the national debt? Of course not. It is now the game: Candidates ask citizens for their votes, then go to Washington to do the bidding of their donors. And since one expectation is that they will cut the taxes of those donors, we now have a permanent class that is afforded representation without taxation. A plutocracy, says my old friend, the historian Bernard Weisberger, “has a natural instinct to perpetuate and enlarge its own powers and by doing so slams the door of opportunity to challengers and reduces elections to theatrical duels between politicians who are marionettes worked by invisible strings.” Where does it end? By coincidence, this past weekend I watched the final episode of the British television seriesSecret State, a 2012 remake of an earlier version based on the popular novel A Very British Coup. This is white-knuckle political drama. Gabriel Byrne plays an accidental prime minister – thrust into office by the death of the incumbent, only to discover himself facing something he never imagined: a shadowy coalition of forces, some within his own government, working against him. With some of his own ministers secretly in the service of powerful corporations and bankers, his own party falling away from him, press lords daily maligning him, the opposition emboldened, and a public confused by misinformation, deceit, and vicious political rhetoric, the prime minister is told by Parliament to immediately invade Iran (on unproven, even false premises) or resign. In the climactic scene, he defies the “Secret State” that is manipulating all this and confronts Parliament with this challenge:
Let’s forget party allegiance, forget vested interests, forget votes of confidence. Let each and every one of us think only of this: Is this war justified? Is it what the people of this country want? Is it going to achieve what we want it to achieve? And if not, then what next? Well, I tell you what I think we should do. We should represent the people of this country. Not the lobby companies that wine and dine us. Or the banks and the big businesses that tell us how the world goes ‘round. Or the trade unions that try and call the shots. Not the civil servants nor the war-mongering generals or the security chiefs. Not the press magnates and multibillion dollar donors… [We must return] democracy to this House and the country it represents.
Do they? The movie doesn’t tell us. We are left to imagine how the crisis — the struggle for democracy — will end. As we are reminded by this season, there is more to life than politics. There are families, friends, music, worship, sports, the arts, reading, conversation, laughter, celebrations of love and fellowship and partridges in pear trees. But without healthy democratic politics serving a moral order, all these are imperiled by the ferocious appetites of private power and greed. So enjoy the holidays, including Star Wars. Then come back after New Year’s and find a place for yourself, at whatever level, wherever you are, in the struggle for democracy. This is the fight of our lives and how it ends is up to us.

Published on December 27, 2015 13:30

Breaking down the “Bro country” walls: The year country music’s women fought back against “take females out”

It must have been fate that “Tomatogate” would happen in the same year that women inadvertently kicked down the doors of the boys club that is country music. Earlier this year, esteemed radio consultant Keith Hill walked into a firestorm when he suggested in an interview that country music radio stations must limit the number of women on the airwaves in order to both appeal to their listeners and be profitable. His directive was quite simple: “take females out.” The backlash from artists and fans alike made “Tomatogate” one of the biggest scandals in recent country music history. Female artists like Lee Ann Womack, Miranda Lambert, Maddie & Tae and Martina McBride responded quickly to the insult, and were quick to note that some of country’s most legendary (and successful) artists have always been women. Despite the scandal, this outcry was particularly tone-deaf when you consider exactly how much women in country music accomplished this year. There is no disputing that country music has long been fighting a “war on women,” to use the New York Times’ words. But in 2015, a slew of hyper-talented female artists made it damn near impossible for everyone from critics to radio DJs to refuse to pay attention to their contributions. This year, women in country music grabbed the proverbial microphone and refused to give it back. 2015 was the year of the woman in country music, and that has incredible implications for country music as a whole and for the women who make all types of music. To be sure, the biggest successes for women in country music came this year from artists that few people had heard of before 2015. Mickey Guyton became the first African-American woman to make real waves on the charts in well more than a decade, shattering records when her debut single “Better Than You Left Me.” With “Love Me Like You Mean It,” Kelsea Ballerini became the first woman to top the Billboard Country Airplay chart with a debut in nearly a decade. Continuing that trend, Cam, Maddie & Tae, Kacey Musgraves and a host of other up-and-coming female artists joined genre mainstays like Miranda Lambert and Carrie Underwood to elbow their way into the conversation about this perpetually male-dominated music. And most of these artists did it without the industry support – radio airplay, namely – that their male counterparts enjoyed. In doing that, they were able to forcibly change the tone of country music in a broader sense. The “bro” domination has, to varying degrees, had a host of negative impacts on female artists. In addition to the oversexualization and overt sexism inherent in many of the genre’s most popular songs over the past few years, it’s also been markedly more difficult for female artists to find their way to the top of the charts. As a 2015 analysis of the Billboard charts found, female artists comprised only 8 percent of all charting singles in 2014, down from (an also paltry) 16 percent in the mid 1990s. In bringing a more feminine perspective to the charts in 2015, female artists have been able to dramatically change the conversation. This year, there were fewer songs about beer, trucks and T&A, and more independent ballads like Maddie & Tae’s “Shut Up & Fish.” The most popular country song of the year was arguably Little Big Town’s “Girl Crush.” The track, a sort of “love song” to an ex-lover’s new flame, stayed at #1 on the Billboard Hot Country chart for more than 11 weeks. 
It also had an impressive performance on the pop charts, and has since been nominated for consideration as Song of the Year at the 2016 Grammys. Despite the minor controversy that surrounded “Girl Crush,” it quickly became the most universally beloved country song of 2015. With this track, Little Big Town also raised important questions about the place of female sexuality in the genre, and for a change, made a female voice the most dominant in the country conversation. The tone is gradually changing in country music, drifting away from the hyper-masculinity that made bro-country a household term and toward a more inclusive lyrical aesthetic. Maddie & Tae’s “Girl in a Country Song” once again made it okay for women to poke fun at the dominant narrative. It’s also worth considering that it has been the women in country music over the past year who have brought the genre its widest acclaim from people who otherwise think that country music is terrible, namely music critics who happily ignore these albums. Releases from Lindi Ortega, Kacey Musgraves and Lee Ann Womack have helped spotlight the more authentic, traditional country sound that represents the genre at its best. With that, everyone from the New York Times to SPIN to National Public Radio began to take notice. In 2015, country music was also more inclusive than ever, as Mickey Guyton and Lindi Ortega brought more diversity both in sound and in demographics. There is still an incredible amount of work to be done in ensuring that women of all races, sexualities and identities have a place in country music, but at this point talented women are going to find their place in country music, via social media and YouTube and Spotify, even if the label executives and radio DJs take another decade or so to catch on to the very real fact that when there are more women (particularly women of color) in country music, country music is better. Stylistically, the presence of more women in country music means only good things. When women dominate country music, the genre is inarguably better. From Tammy Wynette to Terri Clark to Ballerini, women in country have always had to work harder and produce better music than the boys to get on the charts, much less hit #1. Even Luke Bryan, king of the bro-country phenomenon, has acknowledged as much. “They kind of have to be able to hang with the guys,” he told Entertainment Weekly last year. “But also be feminine and pretty.” As disgusting as that double standard truly is, the one benefit is that the music produced by women in country music decidedly makes the genre better. That’s proven by history – Musgraves’ authentic ’50s aesthetic alongside Brandy Clark’s outlaw persona is just a natural progression of Loretta Lynn singing about birth control to Reba McEntire recording a country song about the AIDS epidemic. The same could be said for country’s aesthetic, too. As much credit as Sturgill Simpson and Chris Stapleton get for returning country to its classic roots, let’s not forget that Musgraves and Alison Krauss and Mary Chapin Carpenter have always exemplified the traditional sound. Women’s rise in country music has a lot of broader implications. As the success of “Girl Crush” demonstrated, the lines between pop and country are more blurred than they’ve ever been. Compare that to the crossover success of Florida-Georgia Line’s “Cruise,” and it’s pretty obvious who most fans of the genre would rather have as its ambassadors. 
Rather than spilling these archaic notions about women and overt objectification over into pop music, country now offers a newer, more progressive sound that will have a great deal of influence on what pop-country sounds like in the future. Perhaps even more important than that, it’s impossible to ignore the sheer numbers of women who listen to country music. Of the more than 95 million Americans who listen to country music regularly, 52 percent are women who have been deprived of hearing female voices on the radio while being bombarded with sexist lyrics. This is a genre that reaches an incredible number of women each year, and when there are more women on the radio, that means there’s less room on the airwaves for misogyny. There is also still an incredible amount of work to be done in terms of representation. Country music needs more women, period. There need to be more women at the record labels, on the radio and on the charts. The perspectives and voices and stories of women need to be valued more by the industry. The industry needs to stop profiting from sexism and the marginalization of women. In 2015, though, women in country demonstrated that they are committed to making these changes a reality. To be sure, the tide is turning in country music. 2015 was the year that women in country music became more than just a garnish in some sexist salad. This is a change that artists, critics and fans have been begging for since long before the bro-country takeover.

Published on December 27, 2015 12:30

The true triumph of “Transparent”: The Pfeffermans are all of us

Dame Magazine

I used to take all the Buzzfeed quizzes—I couldn’t resist them, they were so silly. But I weaned myself from them for well over a year … until AfterEllen came along with one that was too irresistible to pass up: "Which Transparent character are you?" I mean, I had to know. Per their algorithm, would I be one of the queer Pfefferman women, like 40-something Sarah (Amy Landecker), who left her heteronormative life with a husband and kids, to pursue … a possible heteronormative life with her old college girlfriend, Tammy (Melora Hardin)? Or like aimless, brilliant, charming Ali (Gaby Hoffmann), who is sleeping with her lovelorn best friend, Syd (Carrie Brownstein)? Or would I turn up as their Moppa, Maura (Jeffrey Tambor), currently without a place of her own, and dividing her time between the homes of her newly widowed ex-wife, Shelly (Judith Light), who is still in love with her, and that of her new trans friend, Davina (Alexandra Billings), who has become a kind of mentor to her? More from DAME: "How Do You Know When It's Time to Break Up with a Friend?" I had a hunch that, despite being a Jewish lesbian, I might not be a Pfefferman, but rather one who falls into the path of Pfeffermans. And indeed, according to AfterEllen, je suis Syd—which, as far as these quizzes go, feels quite accurate. Because more than once, I’ve been that woman: in love with a friend who wasn’t really straight, excited that she returned my affections, and then hurt when I realized she was trying to work through her own feelings about sexuality through sex with me. And I’ll tell you, it sucks being a Syd. And, for that matter, being a Tammy (annoying as Tammy is), ditched at the altar by the woman she thought she was building a future with. And being a Shelly, co-dependent on the ex-spouse who comes to her when she’s feeling untethered, only to leave when old marital dynamics threaten to set in. And a Davina, a fiftysomething trans woman with a brutal past, who'd generously taken Maura into her home and shared her hard-won wisdom, only to be condescended to about her choice of lovers. There is no better show in existence—past or present—that unpacks female sexuality with such complexity and humanity, such fully realized, complicated characters, and such vivid, unique storytelling as Transparent. I, like so many friends of mine who devoured these ten episodes, was genuinely verklempt—I actually wept while watching this second season, held in awe by its raw honesty, especially when I considered that, with its five Emmys, Transparent would draw a bigger and presumably broader audience than ever. This season, creator Jill Soloway and her trans and cisgender female writers and directors have truly outdone themselves, creating something even queerer, edgier, Jewier, and more feminist than any show that has come before it. We see sex scenes between septuagenarians, and between trans and cisgender women, including one (Anjelica Huston) who has had a double mastectomy from breast cancer. We go to a women’s music festival with Sarah and Ali and their Moppa—where Maura learns she’s unwelcome because she’s not a biological female. 
We watch Ali aggressively and amorously pursue Leslie, an older professor with whom she wants to study—a radical feminist lesbian poet based on Eileen Myles (played with perfect hot butch swagger by Cherry Jones; the real Eileen Myles has a small role as her colleague)—and one to whom Moppa, as a male political-science professor, had been so arrogant and misogynist that Maura didn’t even remember the cruelty Mort had exacted upon Leslie when they were young university colleagues. A fact that Maura takes to heart—realizing that, as a woman, she still must reckon with her former male self. And we see Ali tease Syd, as her girlfriend, both physically—flashing her breast while sprawled on her chair, or taunting her with a strap-on, affectionately but unnervingly—and emotionally, as she pushes her desire to be polyamorous, when she knows Syd wants a monogamous relationship. And we watch a paranoid, unmoored Sarah come undone at a school function because she is certain that everyone has been gossiping about how she left her husband for a woman, only to ditch her, too—at their wedding. More from DAME: "When Did the Internet Become Smarter than My Doctor?" This is just a sampling, but a pointed one, because what I find to be boldest and most striking is not just the pushing of sexual and gender boundaries, but of emotional ones. Because what is really happening here is a conversation about privilege and the repercussions when that privilege is used heedlessly. To watch these three Pfefferman women explore what their femaleness and sexuality/ies mean to them is extraordinary. But Soloway takes it a step further by revealing to us the toxicity of their narcissism. Yes, Tammy is brash and not very bright, but she’s been hurt and publicly humiliated by Sarah’s realization that she can’t stand her—at their wedding—and that the last thing she wants is to get married again. Of course that’s going to hurt like hell, and in fact long-sober Tammy falls off the wagon and starts drinking again, while Sarah quests to figure out who she is and what she wants. Sarah isn’t particularly apologetic about what she's inflicted upon Tammy, though. And Ali is quite singularly focused on the thesis she wants to write for graduate school, specifically for Leslie—a queer family history gorgeously revealed in flashbacks to Nazi Germany—and at the moment, at least, believes, rather self-servingly, that being queer gives her automatic license to be sexually polyamorous whether or not Syd is okay with it (she’s so not okay with it). Among the three, Maura appears to be the least cavalier, even as she’s hurt Davina—traipsing into her life, doling out unsolicited romantic advice, completely unmindful of their class differences and all that comes with them (i.e., the fact that Davina, a former sex worker, has had to save her hard-earned money for her breasts, while upper-middle-class Maura’s big question is whether she wants them at all). Still, she’s eager to do some self-reflection—maybe not so much as an ex-spouse or even as a parent: she, together with Shelly (with whom she has a bit of unresolved business she could stand to work on), has betrayed their son in unspeakable ways. But as a person learning what it means to be a woman, as a woman who used to be a man, she wants to understand what it meant to be so deeply chauvinistic, so woman-hating, that it was a reflex. 
More from DAME: "Nothing Says Holiday Cheer Quite Like a War on Women Christmas Card" And that is the real triumph of Transparent—its transparency, its ability to really let us in and allow us to see the profoundly uncomfortable truths about how the Pfeffermans—and in turn, we—really are. That we are allowed to love and hate each of them, often in the same episode, sometimes even in the same moment. Yes, they’re narcissistic, and sometimes toxic and unfeeling. And sometimes they're charming and endearing and brilliant and we pull for them despite ourselves. But Maura, as we see especially this season, isn’t the only one who is in a transition in her life—they all are, and they’re fallible and they really step in shit of their own making. A lot. I feel for them because they are so human, even if they're not necessarily humane all the time. The fact is, we don’t have to love the Pfeffermans, though I find that I do, perhaps because they are recognizable to me. I know people like them, I’ve been involved with people like them, I am friends with people like them. And though AfterEllen says I’m a Syd, and I’ve been a Syd, I admit: Sometimes, I’ve been a Pfefferman, too.

Published on December 27, 2015 12:00

Behind the Ronald Reagan myth: “No one had ever entered the White House so grossly ill informed”

No one had ever entered the White House so grossly ill informed. At presidential news conferences, especially in his first year, Ronald Reagan embarrassed himself. On one occasion, asked why he advocated putting missiles in vulnerable places, he responded, his face registering bewilderment, “I don’t know but what maybe you haven’t gotten into the area that I’m going to turn over to the secretary of defense.” Frequently, he knew nothing about events that had been headlined in the morning newspaper. In 1984, when asked a question he should have fielded easily, Reagan looked befuddled, and his wife had to step in to rescue him. “Doing everything we can,” she whispered. “Doing everything we can,” the president echoed. To be sure, his detractors sometimes exaggerated his ignorance. The publication of his radio addresses of the 1950s revealed a considerable command of facts, though in a narrow range. But nothing suggested profundity. “You could walk through Ronald Reagan’s deepest thoughts,” a California legislator said, “and not get your ankles wet.” In all fields of public affairs—from diplomacy to the economy—the president stunned Washington policymakers by how little basic information he commanded. His mind, said the well-disposed Peggy Noonan, was “barren terrain.” Speaking of one far-ranging discussion on the MX missile, the Indiana congressman Lee Hamilton, an authority on national defense, reported, “Reagan’s only contribution throughout the entire hour and a half was to interrupt somewhere at midpoint to tell us he’d watched a movie the night before, and he gave us the plot from War Games.” The president “cut ribbons and made speeches. He did these things beautifully,” Congressman Jim Wright of Texas acknowledged. “But he never knew frijoles from pralines about the substantive facts of issues.” Some thought him to be not only ignorant but, in the word of a former CIA director, “stupid.” Clark Clifford called the president an “amiable dunce,” and the usually restrained columnist David Broder wrote, “The task of watering the arid desert between Reagan’s ears is a challenging one for his aides.” No Democratic adversary would ever constitute as great a peril to the president’s political future, his advisers concluded, as Reagan did himself. Therefore, they protected him by severely restricting situations where he might blurt out a fantasy. His staff, one study reported, wrapped him “in excelsior,” while “keeping the press at shouting distance or beyond.” In his first year as president, he held only six news conferences—fewest ever in the modern era. Aides also prepared scores of cue cards, so that he would know how to greet visitors and respond to interviewers. His secretary of the treasury and later chief of staff said of the president: “Every moment of every public appearance was scheduled, every word scripted, every place where Reagan was expected to stand was chalked with toe marks.” Those manipulations, he added, seemed customary to Reagan, for “he had been learning his lines, composing his facial expressions, hitting his toe marks for half a century.” Each night, before turning in, he took comfort in a shooting schedule for the next day’s television- focused events that was laid out for him at his bedside, just as it had been in Hollywood. His White House staff found it difficult, often impossible, to get him to stir himself to follow even this rudimentary routine. When he was expected to read briefing papers, he lazed on a couch watching old movies. 
On the day before a summit meeting with world leaders about the future of the economy, he was given a briefing book. The next morning, his chief of staff asked him why he had not even opened it. “Well, Jim,” the president explained, “The Sound of Music was on last night.” “Reagan,” his principal biographer, Lou Cannon, has written, “may have been the one president in the history of the republic who saw his election as a chance to get some rest.” (He spent nearly a full year of his tenure not in the White House but at his Rancho del Cielo in the hills above Santa Barbara.) Cabinet officials had to accommodate themselves to Reagan’s slumbering during discussions of pressing issues, and on a multination European trip, he nodded off so often at meetings with heads of state, among them French president François Mitterrand, that reporters, borrowing the title of a film noir, designated the journey “The Big Sleep.” He even dozed during a televised audience at the Vatican while the pope was speaking to him. A satirist lampooned Reagan by transmuting Dolly Parton’s “Workin’ 9 to 5” into “Workin’ 9 to 10,” and TV’s Johnny Carson quipped, “There are only two reasons you wake President Reagan: World War III and if Hellcats of the Navy is on the Late Show.” Reagan tossed off criticism of his napping on the job with drollery. He told the White House press corps, “I am concerned about what is happening in government—and it’s caused me many a sleepless afternoon,” and he jested that posterity would place a marker on his chair in the Cabinet Room: “Reagan Slept Here.” His team devised ingenious ways to get him to pay attention. Aware that he was obsessed with movies, his national security adviser had the CIA put together a film on world leaders the president was scheduled to encounter. His defense secretary stooped lower. He got Reagan to sign off on production of the MX missile by showing him a cartoon. Once again, the president made a joke of his lack of involvement: “It’s true that hard work never killed anybody, but why take a chance?” Cannon, who had observed him closely for years and with considerable admiration, took his lapses more seriously. “Seen either in military or economic terms,” he concluded, “the nation paid a high price for a president who skimped on preparation, avoided complexities and news conferences and depended far too heavily on anecdotes, charts, graphics and cartoons.” Subordinates also found Reagan to be an exasperatingly disengaged administrator. “Trying to forge policy,” said George Shultz, his longest-serving secretary of state, was “like walking through a swamp.” Donald Regan recalled: “In the four years that I served as secretary of the treasury, I never saw President Reagan alone and never discussed economic philosophy. . . . I had to figure these things out like any other American, by studying his speeches and reading the newspapers. . . . After I accepted the job, he simply hung up and vanished.” One of his national security advisers, General Colin Powell, recalled that “the President’s passive management style placed a tremendous burden on us,” and another national security adviser, Frank Carlucci, observed: “The Great Communicator wasn’t always the greatest communicator in the private sessions; you didn’t always get clean and crisp decisions. You assumed a lot. . . . You had to.” Numbers of observers contended that Reagan conducted himself not as a ruler but as a ceremonial monarch. 
In the midst of heated exchanges, a diplomat noted, Reagan behaved like a “remote sort of king . . . just not there.” After taking in the president’s performance during a discussion of the budget in 1981, one of his top aides remarked that Reagan looked like “a king . . . who had assembled his subalterns to listen to what they had to say and to preside, sort of,” and another said, “He made decisions like an ancient king or a Turkish pasha, passively letting his subjects serve him, selecting only those morsels of public policy that were especially tasty. Rarely did he ask searching questions and demand to know why someone had or had not done something.” As a consequence, a Republican senator went so far as to say: “With Ronald Reagan, no one is there. The sad fact is that we don’t have a president.” Instead of designating one person as his top aide, as Eisenhower had with Sherman Adams, Reagan set up a “troika”: James A. Baker III as chief of staff, Edwin Meese as counselor, and Michael Deaver as deputy chief of staff in charge of public relations—an arrangement that, for a time, left other appointees perplexed. The Reagan White House, said his first secretary of state, Alexander M. Haig Jr., was “as mysterious as a ghost ship; you heard the creak of the rigging and the groan of the timbers and sometimes even glimpsed the crew on deck. But which of the crew had the helm? Was it Meese, was it Baker, was it someone else? It was impossible to know for sure.” Similarly, Peggy Noonan ruminated: “Who’s in charge here? I could never understand where power was in that White House; it kept moving. I’d see men in suits huddled in a hall twenty paces from the Oval Office, and I’d think, there it is, that’s where they’re making the decisions. But the next day they were gone and the hall was empty.” The first lady made her own contribution to the diffusion of authority. No one of his appointees, not even his chief of staff, exercised so much power. The New York Times, discussing Nancy Reagan, even wrote of an “Associate Presidency.” She understood her husband’s limitations and did all she could to make sure that he was well served. Their son Michael said, “Dad looks at half a glass of water and says: Look at this! It’s half full! Nancy is always trying to figure out: Who stole the other half from my husband?” She sometimes influenced Reagan’s policies, notably when she pushed for arms control, and she was thought to have been responsible for the removal of two cabinet officials and of the president’s latter-day chief of staff. During his tenure, she dismissed accounts of her impact, but in her memoir, she acknowledged: “For eight years I was sleeping with the president, and if that doesn’t give you special access, I don’t know what does.” Reagan’s staff found especially exasperating the need to clear the president’s schedule with a first lady who placed so much reliance upon a West Coast astrologer, Joan Quigley. That had been true since the beginning in Sacramento when Reagan was inaugurated as governor at midnight because, it was reported, that was the hour this woman set after perusing the zodiac. On a number of occasions, Deaver would spend days working out an intricate itinerary for the president’s travels down to the last detail only to be told that he had to scrap everything because the astrologer had determined that the stars were not properly aligned. Horoscopes fixed the day and hour of such major events as presidential debates and summit meetings with Soviet leaders. 
The president’s most important aide said, “We were paralyzed by this craziness.” In these unpropitious circumstances, the troika managed much better than anticipated. Public administration theorists likened this three-headed makeshift to the mock definition of a camel: a horse put together by a committee. But Baker proved to be a highly effective chief of staff and Deaver a masterful maestro of staged events. Secretary Haig later remarked, “You couldn’t serve in this administration without knowing that Reagan was a cipher and that these men were running the government.” That judgment, however, failed to credit Reagan’s perspicacity. In setting up his team, he succeeded in taking to Washington two men who had served him faithfully in Sacramento—Meese and Deaver—while acknowledging that, since they and he had no experience inside the Beltway, he needed to salt his inner corps with a veteran of the Ford era. In choosing Baker, moreover, Reagan, stereotyped as a rigid ideologue, showed unexpected flexibility. Baker, a moderate, had been floor manager for Ford’s effort to deny Reagan the 1976 presidential nomination, and in 1980 he had run George Bush’s campaign against Governor Reagan. From the start of his political career, commentators, especially liberals, had been underestimating Reagan. When he announced that he was planning to run for governor of California, he encountered ridicule. At a time when Robert Cummings was a prominent film star, the Hollywood mogul Jack Warner responded, “No, Bob Cummings for governor, Ronald Reagan as his best friend.” Yet Reagan easily defeated the former mayor of San Francisco to win the Republican nomination, then stunned Democrats by prevailing over the incumbent governor, Pat Brown, by nearly a million votes. Furthermore, he went on to gain reelection to a second term. Reagan’s performance in Sacramento surprised both adversaries and followers. While continuing to proclaim his undying hostility to government intervention, he stepped up taxes on banks and corporations, increased benefits to welfare recipients, more than doubled funds for higher education, and safeguarded “wild and scenic rivers” from exploitation. A vocal advocate of “the right to life,” he nevertheless signed a bill in 1967 that resulted in a rise in legal abortions in the state from 518 in that year to nearly 100,000 in 1980. He was able to forge agreements with Democrats in the capital because he had the advantage, as a veteran of Screen Actors Guild battles, of being an experienced negotiator. (In later years, he said of his haggling with Mikhail Gorbachev: “It was easier than dealing with Jack Warner.”) His chief Democratic opponent in the legislature, who started out viewing Reagan with contempt, wound up concluding that he had been a pretty good governor, “better than Pat Brown, miles and planets and universes better than Jerry Brown”—the two most conspicuous Democratic leaders of the period. Scrutiny of his record, however, also raised disquieting features. Months after he took office as governor, a reporter asked him about his priorities. Disconcerted, Reagan turned toward an assistant and said, “I could take some coaching from the sidelines, if anyone can recall my legislative program.” Expected to decide between conflicting views on the abortion issue, “Reagan,” Cannon noted, “behaved as if lost at sea.” His aides often found it difficult to get him to concentrate. 
On one occasion, in the midst of a vital discussion about the budget, he wandered off: “Do you know how hard it is to mispronounce ‘psychiatric’ once you know how to pronounce it right? I had to do it in Kings Row and at first I couldn’t do it.” He especially alarmed members of his staff by flying into a rage if the press reported that he had changed his position on an issue, even when he undoubtedly had. All of his disabilities—gross misperceptions and knowledge gaps—he carried into the White House. Yet he was to leave office regarded as a consequential president, and a number of scholars were even to write of an “Age of Reagan.” Excerpted from "The American President: From Teddy Roosevelt to Bill Clinton" by William E. Leuchtenburg. Published by Oxford University Press. Copyright 2016 by William E. Leuchtenburg. Reprinted with permission of the publisher. All rights reserved.

Published on December 27, 2015 11:00