Helen H. Moore's Blog, page 894

January 13, 2016

Requiem for a middleweight: Barack Obama’s witty, eloquent and upbeat speech — about failure

Barack Obama gave his last official State of the Union address on Tuesday evening, and viewed purely as performance it was a commanding work of oratory and rhetoric. The president has always been at his best when he’s not trying to cut a deal or make a sales pitch, and when he feels he has nothing to lose. You can’t fault this year’s SOTU for its breadth or vision: Obama was trying to set the terms of his final year in office, define his political legacy and frame the public perception of the 2016 campaign, whose topsy-turvy narrative is about to enter a critical period. I don’t think it’s stretching the point to say that Obama gave a more downbeat and emotional unofficial SOTU a week or so earlier, speaking from the heart rather than the head, with his tearful address on gun violence. As I wrote at the time, that issue and our inability to deal with it — or even to define it clearly — has come to symbolize all the failures and disappointments of the Obama years. No doubt the president’s upbeat, almost Reaganesque demeanor before Congress on Tuesday struck many viewers as discordant with a nation that feels paralyzed by cultural, ideological and racial divisions, economically stagnant, politically dysfunctional and consumed by irrational fears. You can hardly blame him for seeking to wipe away the tears and observe that the sun will still come up tomorrow. Obama is certainly within his rights to point out that his presidency has not been the unmitigated disaster depicted by the innumerable right-wing troglodytes who want to succeed him. A year ago, I held the conventional and charming notion that Rand Paul, with his faintly groovy, freewheeling libertarian views, posed the biggest threat to Hillary Clinton. Now Paul has become a figure of bathos and tragedy, complaining to talk-show hosts about his ejection from the GOP main stage. He is not nearly mean enough or crazy enough — or flat-out racist enough — for the Republican “base voter” of 2016. 
By all conventional measures, the economy is in vastly better shape than when Obama took office in 2009, although the divide between rich and poor has gotten worse. As the president pointed out on Tuesday, the unemployment rate has fallen by half, and so has the price of gasoline. (I would describe that last fact as a mixed blessing, in planetary terms.) Millions of people have access to affordable healthcare who didn’t before, even if the mechanism is a patchwork compromise at best. Our wasteful and destructive overseas wars in Iraq and Afghanistan are — well, no, they’re not over exactly. Low energy? Reduced in intensity? Conducted from afar like a video game, or like the alien genocide of Orson Scott Card’s “Ender’s Game”? In any case Obama has kept us out of another ground war, for now. As you see, I can’t help myself: Beneath all the things Obama said on Tuesday were all the things he didn’t say, or all the things he alluded to and brushed past. His delivery was suave, affable and relaxed; the speech was perfectly paced. It was fun to watch him repeatedly put House Speaker Paul Ryan, sitting stone-faced over his left shoulder, in an impossible position: Would Ryan applaud things that no reasonable person could oppose, like curing cancer or universal pre-K or extending tax cuts to low-wage workers, even at the risk of torrents of right-wing tweets accusing him of appeasing the socialist Kenyan dictator? (Ryan’s response: A few tepid claps, then hands back under his butt.) But if this SOTU was a car that looked good on the lot, we’re better off not opening the hood. If the cliché holds that some men and women are born great, while others have greatness thrust upon them, the same can surely be said of mediocrity and disappointment. Obama came to the White House openly aspiring to great things and almost overtly comparing himself to Abraham Lincoln, another tall Illinoisan with an analytical cast of mind and limited legislative experience. 
Despite the obvious differences between the two men and their historical contexts, the parallels are seductive: Both were political outsiders with unusual family backgrounds, who had been raised by independent-minded women. In office, they faced militant, implacable resistance from an opposition party that stood for the values and mores of the white Southern oligarchy. Obama referred to Lincoln at least twice in Tuesday’s address. He did so once by name, in a striking admission that he has failed to address the partisan “rancor and suspicion” that dominates political life: “I have no doubt a president with the gifts of Lincoln or Roosevelt might have better bridged the divide.” A minute or two later, in urging the public to demand change and not to abandon the political process, he briefly paraphrased the concluding passage of the Gettysburg Address: “That’s what’s meant by a government of, by and for the people.” Of course the president who saved the union, ended slavery and redefined the mission of American democracy is quoted more often than any other, but Obama’s Lincoln references were not random or incidental. History will judge, over the next few decades or centuries, whether Obama’s failure to turn the tremendous wave of optimism that swept him into office toward meaningful policy or social change on any large scale was his fault or the Republicans’ fault or simply a reflection of a crumbling imperial nation in terminal decline. (If there is anyone left to write that history, that is.) We cannot possibly see that from the vantage point of the present. But Obama is among the most well-read and well-educated people ever to hold his current office. He is well aware that Lincoln was not some Moses-like moral visionary calling down the lightning of justice, but a shrewd political operator who seized on the opportunities history threw in his path. 
In fact Lincoln was not first or foremost a moral visionary, as Eric Foner’s fascinating book on the Great Emancipator’s shifting views of race and slavery, “The Fiery Trial,” makes clear. Lincoln was always opposed to slavery, but for most of his political career he believed it would continue into the indefinite future, and would only end when white Southerners agreed to give it up. Well into the Civil War years, as Southern intransigence drove him toward an abolitionist position, Lincoln thought that most emancipated slaves would choose to leave the United States, because white Americans would not tolerate living alongside large numbers of “free Negroes.” If the Confederacy had agreed to capitulate early in the war, while retaining its slaves, Lincoln would almost certainly have consented to that. His genius lay in perceiving the moral, existential and political crisis posed by the war, as the body count piled up and the national mood darkened, as an opening for what philosophers might call an epistemological shift. But it was also highly pragmatic. Proclaiming the end of slavery helped the Union win the war in concrete, material ways, by stripping the South of its wealth and its labor force and by supplying thousands of eager volunteers to the Union Army. By repurposing Thomas Jefferson’s nearly forgotten language from the Declaration of Independence to his own ends, Lincoln also turned the war itself into an irresistible moral crusade. He gave the North a reason to win the war, a sense of mission. And at a single stroke he created a new template for American identity, in which the nation was an unfinished democratic experiment driven forward by industrial capitalism, universal citizenship and a strong central government. As Barack Obama can tell you better than anyone, we are still quarreling over that remodeled notion of America 150 years later, and the dominant strain in the Republican Party seems devoted to rolling it back as far as possible. 
It is reasonable to claim that history and circumstance did not offer Obama a Lincoln-like moment of political opportunity, and that his opponents were too shrewd and too well financed, in a way the arrogant Confederates and their supporters were not. As I said earlier, conditions in 2009 were immeasurably different from those of 1861, and every new president must confront an immense interlocking apparatus of entrenched intelligence, military, financial and commercial bureaucracy — the “deep state” — that simply did not exist in the 19th century. But it is also reasonable to wonder whether Obama was ever able to see past the predefined terms of 21st-century political discourse, as Lincoln was able, however briefly, to leap beyond 19th-century conceptions of rights, law and justice. Obama has never questioned the endless worldwide “war” against a nebulous non-state enemy called “terrorism.” (Philosopher Alain Badiou has pointed out that “war” was a word that previously signified military conflict between states, and now seems to be a meaningless term of art designed to make Americans feel impressed by their own majesty.) Indeed, he has pursued that “war” in devious and imaginative new ways. Obama is highly skilled at delivering Democratic Party rhetoric about reducing economic inequality and limiting the political influence of bankers and big corporations. (Don’t get me wrong: Those positions are clearly preferable to the alternative.) But I see no signs that he has ever questioned the fundamental logic of neoliberal economics, in which Wall Street banks and those who run them are the disinterested guardians of public prosperity, too big to fail and too important to face punishment. Or the logic that dictates fiscal “austerity” and ever lower tax rates as fundamentally virtuous, and that insists on outsourcing all sorts of government functions to the private sector, at immense public expense and often with disastrous results. 
Obama came to the White House amid the financial crash of the Great Recession, a moral and existential crisis nearly as big as the Depression faced by FDR, although nowhere near the scale of the armed rebellion faced by Lincoln. As many people have noted, he essentially reappointed the same people who had wrecked the economy, in a new configuration. Maybe he could have dealt with it no better than he did, for internal or external reasons. Maybe no president, or at least no president who could conceivably get elected, could have done more. You don’t make it to that point in American politics if you are likely to announce that we’re no longer having an imaginary war against an imaginary enemy and we’re no longer having an economy based on the superior moral virtue of the rich. But the three or four minutes Obama spent talking about our poisoned and broken political system on Tuesday — to a chamber that suddenly fell dead quiet — dwarfed the rest of his speech. With eloquence and wit and tangible sadness, he compared his own efforts to redirect the course of our nation’s history to our two most famous presidents since George Washington, the two great individual change-agents of the last two centuries. And he told us he had failed.

Published on January 13, 2016 16:10

May the Powerball odds be ever in your favor: Poverty, greed, exploitation and the lottery lessons we refuse to learn

The odds of winning the Powerball lottery tonight are one in 292 million. Statistically speaking, given that the population of the U.S. is around 300 million, the odds of winning are almost identical to the odds if you didn't buy a ticket at all, i.e., nearly impossible. Still, with a jackpot now up to $1.5 billion, hopeful gamblers have been lining up and driving for hours to buy as many tickets as they can. Somebody in the U.S. has to win--might as well be you, right? Not quite. The lottery is a distinctive mixture of optimism, superstition and ignorance seasoned with magical thinking. You have a feeling. Your numbers are lucky. You are the one. Meanwhile the harsh realities of playing the lottery have been thoroughly parsed by economists, social scientists and other wonky professor-types who have patiently pointed out that there are hidden costs to playing, such as the price of gas and the wages lost by standing in line; that money is time that hourly wage-earners typically do not have, as explained in-depth by "poverty whisperer" Linda Tirado; and that the funds the state collects via the lottery act as a regressive tax, for the state uses those revenues to support programs, particularly education, that ought to have been fully supported by taxes in the first place. But they're not, so funding for education languishes unless people keep playing the lottery, and one ends up with the kind of bad math that takes $1.3 billion, divides it by the U.S. population of 300 million, and comes up with... $4.3 million for everybody! Poverty solved! But the fuller implications of the lottery fever building around this $1.5 billion (and rising) jackpot are far more depressing than the fact that more than 800,000 people shared that Facebook meme. Back in 1930, the Irish Sweepstakes raised enormous sums in Britain, Ireland, Canada and the U.S., by telling participants that monies raised via lottery would go to help sick children. 
A tiny portion did actually go to hospitals in Ireland, but the real winners were the men running the action, who became millionaires with the support of the Irish government. "For decades," Stephen Dodd noted in an article published in the Independent in 2003, "the greed that propelled the Irish Sweeps soiled Ireland's image around the world...The American edition of Reader's Digest once described the Sweeps as 'the greatest bleeding heart racket in the world'." The Irish Sweepstakes was a real event, but the lessons of that scam remain unlearned. It's not because teachers aren't trying. In schools today, fifth-graders still read Shirley Jackson's elegant, chilling short story "The Lottery," about a fictitious lottery where the winner gets stoned to death. From there, a straight line can be drawn to Stephen King's novel "The Long Walk," and thence to Suzanne Collins' series "The Hunger Games." Innumerable episodes of television, including "Hell Money," from "The X-Files," revolve around a lottery where the payoff promises to be worth the risk. What do all these real and fictional lotteries have in common? They exploit the desperation of the powerless, who feel themselves at the mercy of forces they neither control nor understand. But in every one of these stories, the lottery is a cheat. The promise of the payoff is false, as is the promise of a new and better life should you win. In every case, rather, the "winner" is worse off for having won, and in all the literary examples, the price of winning the lottery is your life. What is especially troubling about the lottery is that it operates under the guise of populist opportunity, offering hope to those who have little of it in their lives. When tied to the mythology of the self-made man, which stresses the boot-strappy chance-taking that leads to success, it becomes toxic for the poor. 
It's true that great fortunes have been made from the predictable psychology of (other people's financial) panic during recessions. Amid the financial upheaval of the '70s, the Hunt brothers in Texas tried to corner the market on silver and silver futures, for example, and Armand Hammer famously made his first million by cornering the market on tincture of vanilla at the start of Prohibition. However, that kind of speculation requires cash assets already in hand, and isn't so much gambling as a carefully calculated risk. But who doesn't have $2 to bet on a potential payoff of $1.5 billion for a game that's a no-brainer to play? Trouble is that low-income people are most likely to play the lottery in hopes of escaping poverty, and that $2, however small a sum, represents a larger percentage of a meager paycheck than a substantial one. The Irish Sweepstakes took hold during the Great Depression, when great swaths of Americans found themselves jobless without assets. When economic prospects are dire, the desperate will sell their bodies, and when that fails, they'll sell body parts. Sounds extreme, but it's already happening politely via sale of blood plasma and the "gift" of surrogacy, even as a 2012 poll of 700 British single and married men between the ages of 21 and 50 discovered that 43 percent would rather give up their "virility" than lose their job. Though framed as a willingness to give up sex, it's what sex is for that counts -- it's the engine of romance, family and ultimately baby-making. When times are hard, economic survival trumps hope for the future. Which, with every passing day, looks to be a winning lottery ticket for that Virgin Galactic spaceflight to the moon.
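For the record, the viral meme's arithmetic is off by six orders of magnitude. A quick sketch in Python, using only the jackpot and population figures cited in the piece, makes the point:

```python
# Checking the Facebook meme's claim that splitting the $1.3 billion
# Powerball jackpot among roughly 300 million Americans would yield
# "$4.3 million for everybody."
jackpot = 1_300_000_000      # $1.3 billion
population = 300_000_000     # rough U.S. population

per_person = jackpot / population
print(f"${per_person:.2f} per person")  # about $4.33, not $4.3 million

# And the stated chance of a single ticket winning this drawing:
print("1 in 292,000,000")
```

In other words, the jackpot split evenly buys each American about a cup of coffee, which is roughly the lesson the fifth-graders reading "The Lottery" are supposed to absorb.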

Published on January 13, 2016 16:00

David Bowie’s magic dance: His explosive cinematic sexuality, from the Goblin King to “Merry Christmas Mr. Lawrence”

David Bowie famously didn’t allow Todd Haynes to use any of his music in his 1998 film "Velvet Goldmine," with the result that the film’s soundtrack and its imaging of an impossibly beautiful, alien (and yet all-too-human) creature emerging from working-class England constantly refers to Bowie without us ever hearing his voice. The film both is and isn’t about him, and its references are at once pastiche, performance, evocation, reworking and homage. In this shimmering effect of not-quite Bowie-ness, "Velvet Goldmine" captures Bowie better than perhaps anyone else. What Haynes’ film emphasizes — the play with gender and sexual rules, the power of costume and personae, the sheer sexiness of alterity — is of course central to Bowie’s music, but it’s visible too across his acting career. Bowie’s screen performances are always doubled: He’s always David Bowie at the same time as he inhabits a character. He can’t help this, of course. He’s David freaking Bowie and he’s never going to blend. But that doubling effect also describes his mode of queer performance, in which his gender, sexuality or, indeed, species is never taken for granted. In "The Hunger" (1983), he makes a convincing 200-year-old vampire. Of course, Tony Scott’s film is a lesbian cult classic for its sex scene between Catherine Deneuve and Susan Sarandon. But even though Bowie’s character has been unceremoniously boxed in the attic by that point, the eroticism of the Bowie-Deneuve-Sarandon triangle imbues both the film and its production history with a potent cocktail of iconic stardom and bisexual potential. His films are not always about issues of identity, gender or desire, yet on-screen he inevitably unsettles us in ways that are visceral and corporeal rather than didactic or deterministic. 
In "Labyrinth" (1986) he brings a thrillingly dangerous sensuality to the role of the goblin king, reveling in spiky, back-combed hair, strong eye makeup, and a series of New Romantic costumes whose plentiful ruffles never entirely distracted the eye from the tight pants. According to the film’s designer, he was costumed to be “a young girl’s dream of a pop star,” and Bowie as much as the costume department understood that dream to be vividly sexual. Reclining on a chair, tapping a riding crop against his boots, Bowie’s Jareth changed the dynamic of Jim Henson’s family adventure completely, and turned a new generation of young viewers onto his charms. Bowie dislodges normative ways of being and that’s a powerful thing for a filmmaker to work with. Throughout his career, he chose to collaborate with filmmakers who understood this and who made good use of his uniquely destabilizing effect. Nic Roeg certainly understood the sublime potential of Bowie’s otherness, casting him as the alienated alien in "The Man Who Fell to Earth" (1976). Following "Performance" (1970), Roeg knew a bit about working with rock stars, but this is a different kind of film altogether. Bowie’s long, thin (yes, cocaine-fueled) body is never more exquisitely angular than in Roeg’s chilly compositions. The film deploys his bright orange hair and starkly pale skin in distorted mirror reflections and dramatic graphic compositions to construct a compelling physical strangeness that contrasts with the conformity characteristic of the film’s unforgiving modern world. As his sometime lover Mary-Lou says, “You’re really a freak,” adding, “I don’t mean that unkindly. I like freaks.” The film is also a kind of queer love story, about a woman who falls for a creature so different that she wets herself in shock when she sees his real face. 
Bowie is on record as having drawn from his alien persona, but the film equally used his stardom: The trailer trumpets him as “David Bowie: phenomenon of our time.” Bowie uses this seepage from art to life and back in his performance, combining the outsider style and androgyny of his rock star persona with the unassimilable difference of the fictional alien. Does he attract crowds because he is a star or a star man? He is both at once and despite his desire to go home, we want him to stay with us on earth. Many commentators have written movingly about what David Bowie meant to them as young queers in the 1970s; how he radically changed what and how it was possible to be. I was a little younger, a teenager in the 1980s, but through the time-honored conduit of my friends’ older siblings, I was listening obsessively to "Hunky Dory" and "Ziggy Stardust" alongside Lou Reed, the Buzzcocks and the Undertones. The "Ashes to Ashes" video may have opened up a fledgling sense of the utter beauty of being weird, but as a wannabe cinephile, it was Bowie’s movies of the period that ended up playing a transformative role in my life. In the 1980s, British television’s film programming was radical by today’s standards, and I watched "Merry Christmas Mr. Lawrence" (1983) with my parents when I was about 13. Nagisa Oshima’s film is complex and resonant, a rare assemblage of talents from Ryuichi Sakamoto to Takeshi Kitano and Tom Conti. But what seduced me utterly was the perfect meshing of Oshima and Bowie’s perverse coolness. The scene in which Bowie, playing New Zealand prisoner of war Major Jack Celliers, kisses Sakamoto’s Japanese Captain Yonoi is surely burned into the memories of many a young pervert, and the film is not only homoerotic. 
The high-stakes power dynamic in this forbidden attraction crackles throughout, and Bowie manages to look at once rebellious and louche while shackled by the wrists and more so when finally buried in the ground and bound with rope around his chest. There’s a lot else going on in the film but I hope it’s no disservice to Oshima to say that as a young viewer, I experienced "Merry Christmas Mr. Lawrence" as at once a sexual and a cinematic revelation. A couple of years later, my local arthouse cinema played "In the Realm of the Senses" (1976), Oshima’s disturbing and pornographic art film, which had the equivalent of an NC-17 rating. I had only heard of it because of David Bowie and that was more than enough to get me to the cinema. I sought out Nic Roeg’s other films too, and Catherine Deneuve’s. My love of cinema and its transgressive potential was prompted in no small measure by Bowie’s performances, and his presence in a film has always signaled something unpredictable and mesmerizing. In David Lynch’s "Twin Peaks: Fire Walk With Me" (1992), his mere appearance in a security video throws even Lynch’s already surreal world off balance. His charismatic performance as Nikola Tesla in Christopher Nolan’s "The Prestige" (2006) brought a sense of real magic to a film intimately concerned with performance, trickery and doubling. And in a beautiful video, he returned to the kind of gender ambiguity with which he began his career in 2013’s "The Stars (Are Out Tonight)," swapping bodies with Tilda Swinton, and singing opposite Andreja Pejic as the echo of a younger Bowie. The platform may have changed but my excitement to see Swinton becoming Bowie becoming Pejic, as boundaries of gender and age disappear, surely echoes the feelings of those first viewers amazed by Bowie’s physical presence. 
His film performances are rightfully overshadowed by his music, but watching David Bowie on-screen taught me something crucial about the potential of cinema to rewrite social rules and to imagine radically new forms of being.

Published on January 13, 2016 15:59

5 blows against the testocracy in 2015. Hope for 2016?

AlterNet As we enter the new year, many schools around the nation will take a holiday from teaching and solely concentrate on rote memorization in service of standardized test preparation for the spring exams, which carry strict consequences. And yet there is great hope for education in 2016! Never before in U.S. history have more students, parents and teachers engaged in acts of resistance to standardized tests. During the 2015 testing season, over 620,000 public school students around the U.S. refused to take standardized exams, according to a report by the National Center for Fair & Open Testing (FairTest). This mass movement promises to only expand. Advocates for authentic assessments scored these five significant victories in 2015 against the “testocracy” and its test-and-punish model of education.

1. Obama puts testing rhetoric in reverse.

President Obama announced in October that “unnecessary testing” is “consuming too much instructional time." This announcement came as a surprise given Obama’s support for policies like Race to the Top that contributed to the proliferation of high-stakes testing. The reversal of rhetoric was a result of the mass opt-out movement and will surely embolden authentic-assessment activists in the coming year.

2. Seattle strikes a blow against VAMpire evaluations.

Seattle educators waged a five-day strike at the start of the 2015 school year that removed the so-called “student growth rating,” a form of value-added modeling (VAM), from their contract. This stunning victory against the testocracy removed all standardized test scores from teacher evaluations.

3. An emergency exit for exit exams.

In a repudiation of one of the central components of the testocracy’s agenda, policymakers repealed the California graduation test and relaxed Texas’ graduation testing requirements—joining some six other states that repealed or delayed these exams in the 2013-2014 school year. Arizona, California, Georgia, and South Carolina decided to grant diplomas retroactively to thousands of students denied them because of the exit exams.

4. Na na na na, na na na na, hey AYP, goodbye!

The new federal education law, the Every Student Succeeds Act, certainly doesn’t dethrone the testocracy, maintaining the destructive requirement for testing in grades 3-8 and again in high school. However, ESSA does depose one of the cruelest aspects of the test-and-punish policy under NCLB: the so-called Adequate Yearly Progress annual test score improvement requirement that labeled nearly every American school failing.

5. Action against the ACT/SAT.

2015 was the best on record for the test-optional college admissions movement, with some three dozen more universities and colleges reducing or eliminating ACT or SAT requirements. According to FairTest, more than 850 institutions of higher learning have now dropped this testing requirement.

A version of this article first appeared at: http://educationvotes.nea.org

Published on January 13, 2016 15:58

The Razzies aren’t fun anymore: The snarky answer to the Oscars has become a pointless, tone-deaf, self-important sham

You pretty much know what you’re getting when an awards group can’t even spell the names of the actors correctly. This morning, the Golden Raspberry Awards (popularly known as “the Razzies”) announced their nominations—the movies, directors, screenwriters, and thespians singled out as the “worst” of the year. The list offers the Razzies’ signature mix of the boring and the baffling—with low-hanging fruit like “Fifty Shades of Grey” and “Pixels” honored next to former Oscar winners Julianne Moore (“Seventh Son”) and Eddie Redmayne (“Jupiter Ascending”). In true Razzies fashion, they accidentally listed the actor’s name with an “I.” The rest of this year’s nominees showed about the same attention to detail and intellectual rigor—less an actual measure of what was truly bad last year than a chance for the Razzies to generate headlines. Thus, the acting pool is filled with big names like Gwyneth Paltrow (“Mortdecai”), Amanda Seyfried ("Love the Coopers" and “Pan”), Michelle Monaghan (“Pixels”), and Dakota Johnson (“Fifty Shades of Grey”), all of whom were OK in bad movies. In fact, critics credited Johnson as the lone bright spot in an otherwise generic BDSM romance: Total Film called her a “revelation,” while New York magazine’s David Edelstein commented that she’s “so good at navigating the heroine’s emotional zigs and zags that you want to buy into the whole cobwebbed premise.” What I hate about the Golden Raspberry Awards is that they’re increasingly unnecessary in an age when these movies have already been thoroughly roasted on Twitter months before the awards. What can the Razzies say about “Fifty Shades of Grey” that hasn’t already been said? Could they possibly make “Pan” look worse than it did after the fallout from the film’s whitewashing controversy—when Rooney Mara was criticized for playing Tiger Lily, a Native American character? 
But what’s even worse is that while the awards seem to believe they’re punching up at a broken system, this joyless awards show only continues to punch down at all the wrong targets. There are few nominees on the list that truly deserve mention. While I’m behind the Worst Picture nominations for “Fantastic Four” and “Paul Blart: Mall Cop 2,” two of the least necessary films ever made, “Pixels” isn’t even in the ballpark of previous Sandler entries like “That’s My Boy” or Worst Picture-winner “Jack and Jill.” In fact, it wasn’t even the worst Adam Sandler movie released this year. “The Cobbler,” Sandler’s bizarre body-switching gentrification movie, received some of the worst reviews in the history of the Toronto Film Festival. In addition, the Netflix-only release “Ridiculous Six” became one of just a few movies to score a zero percent on Rotten Tomatoes. Aside from “Paul Blart,” you won’t see any of the movies on this list that actually were the worst films of 2015. “United Passions” was one of the most abysmally reviewed movies ever: The FIFA propaganda piece starring Tim Roth earned a 1 rating on Metacritic and a zero on Rotten Tomatoes, meaning absolutely everyone who saw it hated it. In addition, the $25-30 million production posted the most dismal opening in box office history—earning just $918 on its opening weekend. “United Passions” earned just $607 in the United States during its run, making it the all-time king of flops. If that’s not Razzie-worthy, I don’t know what is. There are so many much better choices that deserve mention that it’s unfathomable that these are the films Razzie voters decided to call out. After all, 2015 was the year that brought us “Accidental Love,” the David O. Russell political satire abandoned back in 2011 due to funding issues; it found its way into theaters anyway, clearly half-finished and cobbled together from scraps. 
Co-starring Jessica Biel and Jake Gyllenhaal, the result is so embarrassing you want to throw yourself in front of the camera to stop it. There was also “The Human Centipede III,” Tom Six’s final sequence in the ass-to-mouth franchise. At one point, it shared the honor of the lowest Metascore ever (before skyrocketing all the way up to a 5). It garnered a Worst Rip-off or Sequel nod, but didn’t crack the major categories. But of course, this is how the Razzies work. Last year, the Daily Dot’s Samantha Allen pointed out that the group’s shortlist referred to "Transformers: Age of Extinction" as “Trannies #4.” According to Allen, the awards merely serve to punch down at already marginalized groups in Hollywood—like trans people, women over 40, and black folks, who have been nominated for a disproportionate number of awards. Allen tabulated that since 1980, 13 black actors had been nominated for Academy Awards, while 14 had nabbed Razzie nominations. “That’s a sentence that reveals so much about Hollywood: how infrequently it recognizes black actors, how few black roles are created in the first place, and how harshly black actors are judged for their work in this environment,” she wrote. Thus, if you have to ask why a movie like “Fifty Shades of Grey” (which had a pretty decent Metascore of 46) earned six more nominations than “United Passions” and four more than “The Human Centipede III,” it’s because of who it appeals to: women. In 2010, “Sex and the City 2” earned a joint Worst Actress nomination for all four of its leads, while every single “Twilight” movie scored a nomination. This year, the Wachowskis’ “Jupiter Ascending” arrived on the Razzies’ list not just because it was a massive flop in theaters (losing over $100 million) but because it did what so few action films do: put a woman at the center. 
The case of “Jupiter Ascending” further illustrates why an exercise like the Razzies is so frustrating: It treats all of these movies like they’re on an even playing field, when they are anything but. The Wachowskis’ film might have been a box-office bomb but it was an incredibly daring, noble one. After all, how many movies have you seen that contain a sci-fi romance between an albino vigilante werewolf and a Russian space princess who cleans toilets for a living? Whether or not they are successful, such bold visions are the kinds of movies that we should be encouraging Hollywood to make—ones that are unafraid to do something different (or challenge the system), even if the end result is a tad ridiculous. Treating a silly, harmless misfire as worse than “The Human Centipede III,” which features a 500-person chain of prisoners eating their own filth, is both a grave disservice to the Wachowskis and to movie-lovers everywhere. After all, it’s really hard to hate a movie in which Mila Kunis defends her interspecies crush by asserting: “I love dogs. I’ve always loved dogs.” The same could be said for “The Boy Next Door,” Jennifer Lopez’s delightfully over-the-top erotic thriller. This future camp classic exists on a planet where giving someone a first-edition of “The Iliad” (found at a yard sale!) is no big deal. “The Boy Next Door” might be a dumb movie, but it’s also a necessary one: It’s one of the rare movies to feature two Latino leads. Targeting “The Boy Next Door” takes all the fun out of what the Golden Raspberry Awards should be: a glorious celebration of failure. The best moments in Razzie history have been when it felt like the actors were in on the joke—like when Halle Berry showed up to accept her Worst Actress award for “Catwoman.” In 2011, Sandra Bullock gave out a wagon full of copies of “All About Steve”—for which she won Worst Actress—to hand out to Razzie members; she begged them to watch the movie again. 
Say what you will about “All About Steve,” but Bullock—who served as a producer on the movie—fought for it, even after it tanked. She believed in that piece of crap. Founded in 1980 by John Wilson, the Golden Raspberry Awards were meant to serve as a counterpoint to the Oscars—deflating the egos of the rich and self-important—but the show has become much, much worse than the ceremony it’s meant to mock. Thirty-five years after the Razzies gave out their first Worst Picture award to “Can’t Stop the Music,” they’ve become nothing but an outdated, tone-deaf roast—like a drunk insult comic who doesn’t know when he’s had too many and won’t get off the stage. It’s time for the Razzies to cut themselves off and go home.

Published on January 13, 2016 14:53

No, really, you don’t want to win the Powerball lottery — no matter how big the prize

AlterNet The Powerball pot just hit $1.4 billion—not just the largest lottery jackpot in U.S. history, but the biggest haul globally. While someone will take home the prize, the chances are better than ever that it will not be you. In fact, right now, odds are placed at 1 in 292.2 million. That means statistically you have a better chance of becoming president than winning. (Though, if you just ratchet up the hate talk and consult with Donald Trump’s team, you might up your shot at a GOP run. Think it over.) The question is, do you really want to win? Money is nice, but for plenty of lottery winners, all that money sometimes leads to misery, pain and even death. The lottery curse may not be a real thing—anything including the word “curse” usually isn’t—but most jackpot winners end up worse off than they started. Nearly 70 percent are financially ruined within seven years of winning, and a troubling number of winners even end up dead. "Of the thousands of lottery winners I knew, a few were happy and a few lived happily ever after," Edward Ugel, author of Money for Nothing: One Man's Journey Through the Dark Side of Lottery Millions, told the Daily Beast. "But you would be blown away to see how many winners wish they'd never won." CNN looked at the lives of a few lottery winners post-jackpot and the bad luck that befell them after their supposedly lucky break. Sadly, many of their stories shared similar themes of tragedy. There was South Florida’s David Lee Edwards, who took home $27 million after taxes. Edwards and his wife spent money on lavish items including multiple mansions, a Ferrari and several other fancy cars, as well as a Lear jet. According to Edwards, in one year he spent $12 million. A former drug addict, he and his wife began using heavily, before divorcing in the face of mounting legal troubles. Edwards died alone in hospice care in 2013 at age 58. 
Abraham Shakespeare won $30 million in 2006, only to allegedly be swindled by a woman named DeeDee Moore who convinced him she was protecting him from all the vultures in his life. Shakespeare went missing in 2009, just a few months after Moore convinced him to transfer all his wealth to her. In 2012 she was sentenced to life in prison for Shakespeare’s murder. And then there was Amanda Clayton, who won $1 million in the Michigan lottery. She landed in legal hot water when it was discovered she’d continued to receive welfare benefits even after her win, though she ended up with probation. Within a year of winning, she was found dead of a likely drug overdose. Why do so many lottery stories end badly? And why do so many winners end up broke, addicted or dead? Jonathan DeYoe, a wealth manager, says lottery winners become a target, with requests for money coming in from all sides. After all, the very first thing lottery agencies do when a winner is confirmed is announce that win to the world. Everyone suddenly knows your name and that you're rich. The constant hounding can be depleting in every way. "They're going to go after you," DeYoe told the Daily Beast. "People are going to ask you for money, and you're going to want to give them money, because you're a good person." "Forget about the typical guys who go after you: the insurance agents, the guys who have a bridge to sell you," Ugel says. "It's the friends and family you've got to worry about. Once you win, who'll always be expected to pick up every tab at every bar? Who'll always be expected to bail out the sister who's so terrible with money?" There’s also the issue of how people spend once they win, which will generally run in accordance with how they lived prior to winning, just on a much grander scale. 
The Daily Beast suggests the average lottery winner—and likely, the average lottery player—is “working-class, familiar with vice, and not shy about taking risks.” Without good financial advice, the acquisition of millions leads many to throw their money away on bad investments and conspicuous consumption. And if drugs or alcohol were a problem in the past, they’re almost certain to come back with a vengeance. Chalk it up to “positive income shock,” which, according to at least one study, causes "changes in lifestyles which may well be prejudicial to health.” "First, you win some money and you start to hang out with a different group of people, and this changes what you want to do in life,” Andrew Clark, coauthor of the study, told the Daily Beast. “Second, your preferences don't change, but your budget constraint has been relaxed. As such, you can buy more cigarettes and booze." Bénédicte Apouey, another study author, explained further: "At first sight, this looks surprising. However, previous macroeconomic research has found that when the economy expands in the U.S., physical health deteriorates." So how, after winning millions, do you stay sane, healthy and—in case you’re wondering—rich? DeYoe suggests avoiding shortsighted thinking. Jackpot winners should consider “the long-term,” he advises. “Take some quiet time to reflect on who you are, what your values are, what your life vision and life mission are."
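For the curious, the 1-in-292.2-million figure quoted above is straightforward combinatorics: a Powerball ticket has to match 5 white balls drawn from 69 plus a single red ball drawn from 26. Here is a minimal sketch of the math, assuming the standard post-October-2015 Powerball format and ignoring taxes, the lump-sum discount, smaller prizes and split jackpots:

```python
from math import comb

# A winning ticket matches 5 white balls drawn from a pool of 69,
# plus the single red "Powerball" drawn from a separate pool of 26.
white_combos = comb(69, 5)      # 11,238,513 ways to pick the white balls
odds = white_combos * 26        # 292,201,338 equally likely tickets
print(f"1 in {odds:,}")         # 1 in 292,201,338 -- the 292.2 million cited

# Naive expected jackpot value of a $2 ticket at a $1.4 billion pot
ev = 1.4e9 / odds
print(f"${ev:.2f} per ticket")  # about $4.79 on paper -- taxes, annuity
                                # discounting and shared wins erase the "edge"
```

The on-paper expected value exceeding the $2 ticket price is exactly the illusion the article describes: once the jackpot is taxed, taken as a lump sum and potentially split, the real expected return falls far below what you paid.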

Published on January 13, 2016 14:53

Ted Cruz just wants blood: Humanity is a weakness in this Republican Party

After Senator Ted Cruz suggested that the United States begin carpet bombing Islamic State (IS) forces in Syria, the reaction was swift. Hillary Clinton mocked candidates who use “bluster and bigotry.” Jeb Bush insisted the idea was “foolish.”  Rich Lowry, the editor of National Review, tweeted: “You can't carpet bomb an insurgency out of existence. This is just silly.” When CNN’s Wolf Blitzer objected that Cruz’s proposal would lead to lots of civilian casualties, the senator retorted somewhat incoherently: "You would carpet bomb where ISIS is -- not a city, but the location of the troops. You use air power directed -- and you have embedded special forces to direction the air power. But the object isn't to level a city. The object is to kill the ISIS terrorists." PolitiFact drily noted that Cruz apparently didn’t understand what the process of carpet (or “saturation”) bombing entails. By definition, it means bombing a wide area regardless of the human cost. By almost any standard Cruz’s proposal was laughable and his rivals and the media called him on it. What happened next? By all rights after such a mixture of inanity and ruthlessness, not to say bloody-mindedness against civilian populations, his poll numbers should have begun to sag. After all, he’d just flunked the commander-in-chief test and what might have seemed like a test of his humanity as well. In fact, his poll numbers actually crept up. The week before the imbroglio, an ABC opinion poll had registered him at 15% nationally. By the following week, he was up to 18% and one poll even had him at a resounding 24%. How to explain this?  While many factors can affect a candidate’s polling numbers, one uncomfortable conclusion can’t be overlooked when it comes to reactions to Cruz’s comments: by and large, Americans don’t think or care much about the real-world consequences of the unleashing of American air power or that of our allies. 
The other day, Human Rights Watch (HRW) reported that, in September and October, a Saudi Arabian coalition backed by the United States “carried out at least six apparently unlawful airstrikes in residential areas of the [Yemeni] capital,” Sana’a.  The attacks resulted in the deaths of 60 civilians. Just about no one in the United States took notice, nor was it given significant media coverage.  More than likely, this is the first time you’ve heard about the HRW findings. You might think that this is because the conflict in Yemen is off our national radar screen.  But how much attention have Americans paid to U.S. air strikes and bombing runs in Iraq?  Washington has literally been bombing Iraq on and off for twelve years and yet few have taken much notice.  That helps explain why bombing is such an attractive option for Washington any time trouble breaks out in the world.  Americans don’t seem to care much what goes on when our bombs or missiles hit the ground.  As pollsters found recently, a surprising number of Americans even want to bomb places that can’t be found on a map.  When Public Policy Polling asked GOP voters in mid-December if they favored bombing Agrabah, 30% said they did (as did 19% of Democrats), while only 13% opposed the idea.  Agrabah is the fictional city featured in the Disney movie Aladdin. The poll put it this way: Would you support or oppose bombing Agrabah? Support: 30%. Oppose: 13%. Not sure: 57%. That 57% were “not sure” might be considered at least modestly (but not wildly) reassuring. Why Cruz’s Numbers Went Up History suggests that this blanket bloodthirstiness or at least lack of empathy for those on the other end of America’s bombing campaigns isn’t new. In March 1951, nine months into the Korean War, Freda Kirchwey, a crusading liberal journalist at the Nation, expressed bewilderment at American indifference to the fate of Korean civilians killed by our bombs. The destruction was awful.  
Little was left standing, structurally speaking, in North Korea.  Nothing, she complained in a column, “excuses the terrible shambles created up and down the Korean peninsula by the American-led forces, by American planes raining down napalm and fire bombs, and by heavy land and naval artillery.” And yet few seemed bothered by it. Because she was an optimist, Kirchwey expressed the hope that Americans would eventually come to share her own moral anguish at what was being done in their name.  They never did.  If anything, the longer the war ground on, the less Americans seemed interested in the fate of the victims of our bombing. Why did they show so little empathy?  Science helps provide us with an answer and it’s a disturbing one: empathy grows harder as distances -- whether of status, geography, or both -- increase.  Think of it as a matter of our Stone Age brains.  It’s hard because in many circumstances an empathic response is, in fact, an unnatural act.  It is not natural, it turns out, for us to feel empathy for those who look different and speak a different language.  It is not natural for us to empathize with those who are invisible to us, as most bombing victims were and are.  Nor is it natural for us to feel empathy for people who have what social scientists call “low status” in our eyes, as did the Korean peasants we were killing.  Recent studies show that, faced with a choice of killing a single individual to save the lives of several people, we are far more apt to consider doing it if the individual we are sacrificing is of such low status. When subjects in an experiment are told that high-status people are being saved, the number willing to let the low-status victim die actually increases. Another social science finding helps us understand why empathy is often in short supply and why Ted Cruz is capable of cavalierly recommending we carpet bomb Syrians living under the control of the Islamic State.  
Once we have convinced ourselves of the necessity and correctness of bombing the hell out of a country -- as Americans did during the Korean War and as we are now doing in our war against IS -- the wiring in our Stone Age brain helps us overcome any hint of guilt we might be inclined to feel over the ensuing loss of life.  It quite naturally acts to dehumanize the distant victims of our air strikes. This is a classic case of cognitive dissonance. Our brain hates to feel torn between conflicting emotions.  Instead it rationalizes doing what we want to do by discounting any feeling that gives rise to negative emotions, in this case, guilt. An extreme example of this was what happened when the Nazis decided to stigmatize Jews and later wipe them out. From the moment they began their ruthless anti-Semitic campaigns, they used hideous imagery to convince other Germans that Jews were not, like them, human at all, but little different than rats.  It is, of course, far easier to kill someone, or to sit by while others do the same, if you dehumanize them first. Rather than feeling empathy for the downtrodden Jews, many Germans felt contempt and disgust, strong emotions that swamped whatever other feelings they might have had. In a study a few years ago, researchers measured the activity in the brains of subjects looking at pictures of homeless people.  The finding was shocking.  Brain activity in the medial prefrontal cortex, the region of the brain where empathy is often registered, was significantly lower than normal.  Put another way, the subjects in this experiment literally paid the homeless no (or at least less) mind. This may sound cruel and uncaring, but as far as biology is concerned it makes sense.  Our genes, as the biologist Richard Dawkins has taught us, are “selfish”; they are, that is, built to enhance their own replication, which is, in effect, their biological imperative.  
Caring for people who are low in status, particularly those who belong to another tribe, doesn’t serve this imperative. Indeed, it may interfere with it by diverting the attention of the host -- that’s you and me -- from activities that will enhance our survival. Think of this as our Stone Age brains in action. It’s not that we necessarily make a conscious decision to ignore the fate of people who are low in status. Our brain does this automatically and seamlessly for us. Out of conscious awareness it decides if someone is useful to us. If that person is, our brain quickly achieves a state of hyper-attentiveness: our nostrils flare, our eyes widen, and our ears tune in relevant sounds. Think of what happens when you’re in the presence of somebody important and you’ll know what I mean. If someone is deemed useless to us? Unless we’re worried that they pose a threat, our brain tells our body to relax. Because it is in our biological interest to feel empathy for people from our own tribe and family -- those, that is, in a position to either enhance our survival or perpetuate our genes -- we come equipped with mechanisms to help us distinguish our people from outsiders. From the moment we’re born, we focus on those around us and bond with them. A mother and child know each other through smell. Brother and sister recognize each other’s familiar facial features. When we hear someone speaking a foreign language, we instinctively discount their humanity. This was shown in a 2014 experiment designed to determine if human beings were more willing to sacrifice someone who spoke a different language in order to save the lives of several others. The findings were clear-cut. Only 18% of the subjects in the experiment were willing to make the cold calculation that saving the lives of several people at the cost of one life was “fair” when the intended victim shared their native language.
However, that percentage more than doubled when it was revealed that the person to be sacrificed spoke a foreign language. The experiment’s results remained the same whether that language was Korean, Hebrew, Japanese, English, or Spanish.

Why Stories Matter When It Comes to American War

You may be beginning to wonder if we aren’t doomed to eternal indifference to the human beings who suffer when we loose our Air Force on them, but science offers us a modicum of hope on the subject. In recent years, one of the strongest findings is that storytelling can break through our indifference and foster empathy even for distant peoples who might otherwise seem alien to us. This more than anything else gives us the ability to empathize with those with whom we don’t identify demographically or otherwise. Stories hold our attention while feeding the strong urge to find meaningful patterns in human behavior. As scientists have now demonstrated in experiments, the brain is a natural pattern finder. It wants one and one to equal two. Mysterious may be the will of God, but here on Earth we expect behavior to be explicable. Stories are designed to establish cause and effect, and once we understand what motivates people we can usually find a way to empathize with them. Stories connect us to people in a way nothing else can. It’s the reason politicians regularly tell stories on the campaign trail. Years ago, Harvard social scientist Howard Gardner set out to discover what highly successful leaders have in common.
After reviewing the lives of 11 luminaries, from Margaret Thatcher to Martin Luther King, Jr., he concluded that their success depended to a remarkable extent on their ability to communicate a compelling story or, as he put it, “narratives that help individuals think about and feel who they are, where they come from, and where they are headed.” These stories, he found, “constitute the single most powerful weapon in the leader’s literary arsenal.” When people are reduced to numbers -- as were the civilian victims of air power during the Korean War and as are the civilians who become “collateral damage” in American air strikes in Iraq, Syria, and elsewhere -- we don’t feel their pain, nor do we automatically put ourselves in their shoes, which is by definition what you do when you are feeling empathic. We have the bomber pilot’s syndrome. We don’t feel anything for the victims below. This is one reason why antiwar movements matter. They tell stories about the victims of war. It was striking in the Vietnam years, for instance, how many Americans came to care for, say, a small naked Vietnamese girl napalmed near her village, or so many other Vietnamese civilians who suffered under a rain of American bombs, rockets, napalm, and artillery shells. The stories that the massive antiwar movement regularly told here about the distant world being decimated by the U.S. war machine created a powerful sense of empathy among many, including active-duty American soldiers and veterans of the war, for the plight of the Vietnamese. (It helped that few Americans believed that North Vietnam posed an existential threat to the United States. Fear brings out the worst in us.) Storytelling happens to be in every human’s toolkit. We are all born storytellers and attentive listeners. Biology may incline us to turn a cold eye on the suffering of people we can’t see and don’t know, but stories can liberate us.
Ted Cruz may be able to build up his poll numbers by promising to carpet bomb foreigners in the Middle East of whom we are fearful, but at least we know that biology doesn’t have to dictate our response.  Our brains don’t have to stay in the Stone Age.  Stories can change us, if we start telling them.

Published on January 13, 2016 00:45

Don’t underestimate Bernie Sanders: The Vermont senator is poised for two historic victories

AlterNet Hillary Clinton’s worst nightmare may be coming true in Iowa and New Hampshire. According to a series of new polls, Clinton’s months-long lead in first-in-the-nation-voting Iowa has sunk to the outskirts of the margin of error, signaling that her campaign is approaching a statistical tie with Bernie Sanders. Meanwhile, in New Hampshire, the next contest, Sanders has maintained his lead (50 percent to 46 percent) in the latest polls, which also find he would do better than she would in hypothetical matchups against various Republican presidential competitors, because he appeals more to independents. Then, looking at Nevada, the next state in 2016's opening contests, Clinton is seen as no longer having an invincible lead, according to media reports. But she is firmly in the lead in the fourth state before March’s multi-state voting begins. In South Carolina, Clinton is averaging 69 percent to his 28 percent, although Sanders campaigned there this weekend and has visited more often than she has. While the dozen states that will vote on Super Tuesday, March 1, have many more Democratic Party delegates (1,007) at stake than Iowa (52), New Hampshire (32), Nevada (43) and South Carolina (59), the growing prospect of a Sanders surge during much of the first month of the 2016 nominating season undoubtedly has disconcerting echoes for the Clinton campaign. Take Iowa. Until last week’s poll by the Wall Street Journal/NBC/Marist, Clinton had been ahead by an average of 12.8 percent since mid-December, according to RealClearPolitics.com. But the WSJ/NBC/Marist poll released Friday found that Clinton had 48 percent, Sanders 45 percent and Martin O’Malley 5 percent. Beyond those tightening numbers, the Clinton campaign knows all too well what happened in 2008, when Barack Obama came from behind to win because he organized youthful voters to attend the Iowa caucuses in unprecedented numbers. This year, Sanders is aiming for a similar turnout.
Eight years ago, Clinton regained her composure by winning in New Hampshire. But that came only after she brought in busloads of supporters from upstate New York to stand on corners and wave posters—the traditional New Hampshire way of showing get-out-the-vote enthusiasm. But this time around, the candidate she faces is from Vermont, next door to New Hampshire, and the latest polls show Sanders remains ahead by several points. Not only has he maintained a steady single-digit lead, Public Policy Polling’s latest New Hampshire-based matchup of Sanders and Clinton versus the range of GOP candidates found that while both would win in the fall, “Sanders does an average of 9 points better than Clinton in the general election matchups.” Their analysis challenges Clinton’s latest selling point in New Hampshire—that only she is electable against a Republican. “Sanders is the only candidate with a positive favorability rating among the overall electorate in the state, and it’s a very positive rating—55 percent of voters see him positively to only 35 percent who have a negative opinion,” PPP said. “He leads the entire GOP field by double digits—it’s 12 points over Bush at 50/38, 14 points over Rubio at 51/37, 19 over Carson at 53/34, and 20 points over both Trump and Cruz at 54/34 and 55/35 respectively.” Then comes Nevada, where it also looks like Clinton’s chances for a clear-cut victory aren’t as certain as her campaign expected. According to Politico.com, Clinton’s early 20-point lead over Sanders is tightening. “The state that’s been touted as Clinton’s firewall against the Vermont senator in the event he generates any momentum out of the whiter and more liberal states of Iowa and New Hampshire is suddenly looking like it’s in play,” Politico reported last week. In 2008, the pro-Hillary Nevada Democratic Party announced she had won the state because she got the highest popular vote totals.
Actually, Obama won and ended up with more delegates because of the way the state party unevenly divides delegates between rural and urban areas: the less-populated rural areas have more clout. With less than three weeks to go before the Iowa caucuses (Monday night, February 1), it appears that Sanders is poised to dominate the kick-off contests and generate tremendous nationwide media attention. There are some indications that this is already happening, as Clinton’s lead over Sanders in other nationwide polls is shrinking. The most eyebrow-raising example is a new poll by Investors.com that found “Clinton’s [nationwide] lead over Sanders, which had been 18 points, is now just 4 points.” Even more telling was its fine print, which noted Clinton’s regional leads were slipping. “Clinton saw her support drop most in the Northeast (where it fell to 36 percent from 50 percent) and the West (37 percent down from 49 percent). Sanders now holds the lead in both places,” it said. “Clinton support also tumbled among suburban voters, dropping to 39 percent from last month’s 50 percent. And she has lost backing among moderate Democrats, falling to 44 percent from 58 percent. Sanders picked up 10 points among moderates, to 37 percent.” While all these polls and reports are as discouraging for Clinton as they are encouraging for Sanders, it’s important to remember the nominating season can last for months until a clear winner is known. February may be shaping up for Sanders, but just ponder how he will do in these 12 states and territories that vote on March 1, Super Tuesday: Alabama (50 delegates), American Samoa (10), Arkansas (37), Colorado (79), Georgia (116), Massachusetts (116), Minnesota (93), Oklahoma (42), Tennessee (76), Texas (252), Vermont (26), and Virginia (110).

Published on January 13, 2016 00:15

Robert Reich: Wall Street and the GOP are trying to rewrite history

If you haven’t yet seen “The Big Short” – directed and co-written by Adam McKay, based on the prize-winning non-fiction book by Michael Lewis about the housing and credit bubble that triggered the Great Recession — I recommend you do so. Not only is the movie an enjoyable (if that’s the right word) way to understand how the big banks screwed millions of Americans out of their homes, savings, and jobs – and then got bailed out by taxpayers. It’s also a lesson in why they’re on the way to doing all this again – and how their political power continues to erode laws designed to prevent another crisis and to shield their executives from any accountability. Most importantly, the movie shows why Bernie Sanders’s plan to break up the biggest banks and reinstate the Glass-Steagall Act (separating investment from commercial banking) is necessary – and why Hillary Clinton’s more modest plan is inadequate. I’ll get back to Bernie and Hillary in a moment, but first you need to know why Wall Street wants us to forget what really happened. The movie gets the story essentially right: Traders on the Street pushed highly risky mortgage loans, bundled them together into investments that hid the risks, got the major credit-rating agencies to give the bundles Triple-A ratings, and then sold them to unwary investors. It was a fraudulent Ponzi scheme that had to end badly – and it did. Yet since then, Wall Street and its hired guns (including most current Republican candidates for president) have tried to rewrite this history. They want us to believe the banks and investment houses were innocent victims of misguided government policies that gave mortgages to poor people who shouldn’t have got them. That’s pure baloney. The boom in subprime mortgages was concentrated in the private market, not in government. Wall Street itself created the risky mortgage market. It sliced and diced junk mortgages into bundles that hid how bad they were.
And it invented the derivatives and CDOs that financed them. The fact is, more than 84 percent of the subprime mortgages in 2006 were issued by private institutions, as were nearly 83 percent of the subprime loans that went to low- and moderate-income borrowers that year. Why has Wall Street been pushing its lie, blaming the government for what happened? And why has the Street (along with its right-wing apologists, and its outlets such as Rupert Murdoch’s Wall Street Journal) so viciously attacked the movie “The Big Short?” So we won’t demand tougher laws to prevent another crisis followed by another “too-big-to-fail” bailout. Which brings us back to Bernie and Hillary. Hillary Clinton doesn’t want to break up the big banks or resurrect the Glass-Steagall Act, as Bernie does. Instead, she’d charge the big banks a bit more for carrying lots of debt and oversee them more carefully. She’d also give bank regulators more power to break up any particular bank that they consider too risky. And she wants more oversight of so-called “shadow banks” such as hedge funds and insurance companies like the infamous AIG. In a world where the giant Wall Street banks didn’t have huge political power, these measures might be enough. But, if you hadn’t noticed, Wall Street wields extraordinary power. Which helps explain why no Wall Street executive has been indicted for the fraudulent behavior that led up to the 2008 crash. Or for the criminal price-fixing scheme settled last May. And why even the fines imposed on the banks have been only a fraction of the banks’ gains. And also why Dodd-Frank is being watered down into vapidity. For example, the law requires major banks to prepare “living wills” describing how they’d unwind their operations if they get into serious trouble. But no big bank has come up with one that passes muster.
Federal investigators have found them all “unrealistic.” Most of Hillary’s proposals could already have been put into effect by the Fed and the Securities and Exchange Commission, but they haven’t been – presumably because of the Street’s muscle. As a practical matter, then, her proposals are invitations to more dilution and finagle. The only way to contain the Street’s excesses is by taking on its economic and political power directly – with reforms so big, bold, and public they can’t be watered down. Starting with busting up the biggest banks, as Bernie Sanders proposes. More than a century ago, Teddy Roosevelt broke up the Standard Oil Trust because it posed a danger to the U.S. economy. Today, Wall Street’s biggest banks pose an even greater danger. They’re far larger than they were before the crash of 2008. Unless they’re broken up and Glass-Steagall resurrected, we face substantial risk of another near-meltdown – once again threatening the incomes, jobs, savings, and homes of millions of Americans. To paraphrase philosopher George Santayana, those who cannot remember they were screwed by Wall Street are condemned to be screwed again.
Most importantly, the movie shows why Bernie Sanders’s plan to break up the biggest banks and reinstate the Glass-Steagall Act (separating investment from commercial banking) is necessary – and why Hillary Clinton’s more modest plan is inadequate. I’ll get back to Bernie and Hillary in a moment, but first you need to know why Wall Street wants us to forget what really happened. The movie gets the story essentially right: Traders on the Street pushed highly-risky mortgage loans, bundled them together into investments that hid the risks, got the major credit-rating agencies to give the bundles Triple-A ratings, and then sold them to unwary investors. It was a fraudulent Ponzi scheme that had to end badly – and it did. Yet since then, Wall Street and its hired guns (including most current Republican candidates for president) have tried to rewrite this history. They want us to believe the banks and investment houses were innocent victims of misguided government policies that gave mortgages to poor people who shouldn’t have got them. That’s pure baloney. The boom in subprime mortgages was concentrated in the private market, not in government. Wall Street itself created the risky mortgage market. It sliced and diced junk mortgages into bundles that hid how bad they were. And it invented the derivatives and CDOs that financed them The fact is, more than 84 percent of the subprime mortgages in 2006 were issued by private institutions, and nearly 83 percent of the subprime loans that went to low- and moderate-income borrowers that year. Why has Wall Street been pushing its lie, blaming the government for what happened? And why has the Street (along with its right-wing apologists, and its outlets such as Rupert Murdoch’s Wall Street Journal) so viciously attacked the movie “The Big Short?” So we won’t demand tougher laws to prevent another crisis  followed by another “too-big-to-fail” bailout. Which brings us back to Bernie and Hillary. 
Hillary Clinton doesn’t want to break up the big banks or resurrect the Glass-Steagall Act, as Bernie does Instead, she’d charge the big banks a bit more for carrying lots of debt and to oversee them more carefully. She’d also give bank regulators more power to break up any particular bank that they consider too risky. And she wants more oversight of so-called “shadow banks” such as hedge funds and insurance companies like the infamous AIG. In a world where the giant Wall Street banks didn’t have huge political power, these measures might be enough. But, if you hadn’t noticed, Wall Street wields extraordinary power. Which helps explain why no Wall Street executive has been indicted for the fraudulent behavior that led up to the 2008 crash. Or for the criminal price-fixing scheme settled last May. And why even the fines imposed on the banks have been only a fraction of the banks’ gains. And also why Dodd-Frank is being watered down into vapidity. For example, the law requires major banks to prepare “living wills” describing how they’d unwind their operations if they get into serious trouble. But no big bank has come up with one that passes muster. Federal investigators have found them all “unrealistic.” Most of Hillary’s proposals could already have been put into effect by the Fed and the Securities and Exchange Commission, but they haven’t been – presumably because of the Street’s muscle. As a practical matter, then, her proposals are invitations to more dilution and finagle. The only way to contain the Street’s excesses is by taking on its economic and political power directly – with reforms so big, bold, and public they can’t be watered down. Starting with busting up the biggest banks, as Bernie Sanders proposes. More than a century ago, Teddy Roosevelt broke up the Standard Oil Trust because it posed a danger to the U.S. economy. Today, Wall Street’s biggest banks pose an even greater danger. They’re far larger than they were before the crash of 2008. 
Unless they’re broken up and Glass-Steagall resurrected, we face substantial risk of another near-meltdown – once again threatening the incomes, jobs, savings, and homes of millions of Americans. To paraphrase philosopher George Santayana, those who cannot remember they were screwed by Wall Street are condemned to be screwed again.If you haven’t yet seen “The Big Short” – directed and co-written by Adam McKay, based on the non-fiction prize-winning book by Michael Lewis about the housing and credit bubble that triggered the Great Recession — I recommend you do so. Not only is the movie an enjoyable (if that’s the right word) way to understand how the big banks screwed millions of Americans out of their homes, savings, and jobs – and then got bailed out by taxpayers. It’s also a lesson in why they’re on the way to doing all this again – and how their political power continues to erode laws designed to prevent another crisis and to shield their executives from any accountability. Most importantly, the movie shows why Bernie Sanders’s plan to break up the biggest banks and reinstate the Glass-Steagall Act (separating investment from commercial banking) is necessary – and why Hillary Clinton’s more modest plan is inadequate. I’ll get back to Bernie and Hillary in a moment, but first you need to know why Wall Street wants us to forget what really happened. The movie gets the story essentially right: Traders on the Street pushed highly-risky mortgage loans, bundled them together into investments that hid the risks, got the major credit-rating agencies to give the bundles Triple-A ratings, and then sold them to unwary investors. It was a fraudulent Ponzi scheme that had to end badly – and it did. Yet since then, Wall Street and its hired guns (including most current Republican candidates for president) have tried to rewrite this history. 
They want us to believe the banks and investment houses were innocent victims of misguided government policies that gave mortgages to poor people who shouldn’t have got them. That’s pure baloney. The boom in subprime mortgages was concentrated in the private market, not in government. Wall Street itself created the risky mortgage market. It sliced and diced junk mortgages into bundles that hid how bad they were. And it invented the derivatives and CDOs that financed them The fact is, more than 84 percent of the subprime mortgages in 2006 were issued by private institutions, and nearly 83 percent of the subprime loans that went to low- and moderate-income borrowers that year. Why has Wall Street been pushing its lie, blaming the government for what happened? And why has the Street (along with its right-wing apologists, and its outlets such as Rupert Murdoch’s Wall Street Journal) so viciously attacked the movie “The Big Short?” So we won’t demand tougher laws to prevent another crisis  followed by another “too-big-to-fail” bailout. Which brings us back to Bernie and Hillary. Hillary Clinton doesn’t want to break up the big banks or resurrect the Glass-Steagall Act, as Bernie does Instead, she’d charge the big banks a bit more for carrying lots of debt and to oversee them more carefully. She’d also give bank regulators more power to break up any particular bank that they consider too risky. And she wants more oversight of so-called “shadow banks” such as hedge funds and insurance companies like the infamous AIG. In a world where the giant Wall Street banks didn’t have huge political power, these measures might be enough. But, if you hadn’t noticed, Wall Street wields extraordinary power. Which helps explain why no Wall Street executive has been indicted for the fraudulent behavior that led up to the 2008 crash. Or for the criminal price-fixing scheme settled last May. And why even the fines imposed on the banks have been only a fraction of the banks’ gains. 
And also why Dodd-Frank is being watered down into vapidity. For example, the law requires major banks to prepare “living wills” describing how they’d unwind their operations if they get into serious trouble. But no big bank has come up with one that passes muster. Federal investigators have found them all “unrealistic.” Most of Hillary’s proposals could already have been put into effect by the Fed and the Securities and Exchange Commission, but they haven’t been – presumably because of the Street’s muscle. As a practical matter, then, her proposals are invitations to more dilution and finagle. The only way to contain the Street’s excesses is by taking on its economic and political power directly – with reforms so big, bold, and public they can’t be watered down. Starting with busting up the biggest banks, as Bernie Sanders proposes. More than a century ago, Teddy Roosevelt broke up the Standard Oil Trust because it posed a danger to the U.S. economy. Today, Wall Street’s biggest banks pose an even greater danger. They’re far larger than they were before the crash of 2008. Unless they’re broken up and Glass-Steagall resurrected, we face substantial risk of another near-meltdown – once again threatening the incomes, jobs, savings, and homes of millions of Americans. To paraphrase philosopher George Santayana, those who cannot remember they were screwed by Wall Street are condemned to be screwed again.If you haven’t yet seen “The Big Short” – directed and co-written by Adam McKay, based on the non-fiction prize-winning book by Michael Lewis about the housing and credit bubble that triggered the Great Recession — I recommend you do so. Not only is the movie an enjoyable (if that’s the right word) way to understand how the big banks screwed millions of Americans out of their homes, savings, and jobs – and then got bailed out by taxpayers. 
It’s also a lesson in why they’re on the way to doing all this again – and how their political power continues to erode laws designed to prevent another crisis and to shield their executives from any accountability.

Most importantly, the movie shows why Bernie Sanders’s plan to break up the biggest banks and reinstate the Glass-Steagall Act (separating investment from commercial banking) is necessary – and why Hillary Clinton’s more modest plan is inadequate. I’ll get back to Bernie and Hillary in a moment, but first you need to know why Wall Street wants us to forget what really happened.

The movie gets the story essentially right: Traders on the Street pushed highly risky mortgage loans, bundled them into investments that hid the risks, got the major credit-rating agencies to give the bundles Triple-A ratings, and then sold them to unwary investors. It was a fraudulent Ponzi scheme that had to end badly – and it did.

Yet since then, Wall Street and its hired guns (including most current Republican candidates for president) have tried to rewrite this history. They want us to believe the banks and investment houses were innocent victims of misguided government policies that gave mortgages to poor people who shouldn’t have gotten them. That’s pure baloney. The boom in subprime mortgages was concentrated in the private market, not in government. Wall Street itself created the risky mortgage market. It sliced and diced junk mortgages into bundles that hid how bad they were. And it invented the derivatives and CDOs that financed them. The fact is, more than 84 percent of the subprime mortgages issued in 2006 came from private institutions, as did nearly 83 percent of the subprime loans that went to low- and moderate-income borrowers that year.

Why has Wall Street been pushing its lie, blaming the government for what happened?
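The “slicing and dicing” described above can be made concrete with a toy simulation (every number here is an illustrative assumption, not a historical figure): pool risky loans, let a junior tranche absorb losses first, and the senior tranche looks Triple-A safe – so long as you rate it on the assumption that defaults are independent. Let defaults move together, as they did when housing prices fell nationwide, and the junior cushion is wiped out:

```python
import random

random.seed(0)

def pool_loss(n_loans=100, p_default=0.2, correlated=False):
    """Simulate one period of a mortgage pool; return fraction of principal lost."""
    if correlated:
        # A single housing-market shock raises every loan's default risk at once.
        shock = random.random() < 0.25
        p = 0.8 if shock else 0.05
    else:
        p = p_default
    defaults = sum(random.random() < p for _ in range(n_loans))
    return defaults / n_loans

def senior_tranche_loss(total_loss, junior_share=0.3):
    """The junior tranche absorbs losses first; the senior tranche is hit
    only once losses exceed that cushion."""
    return max(0.0, total_loss - junior_share) / (1 - junior_share)

# Rated on independent-default assumptions, the senior tranche almost never loses...
indep = [senior_tranche_loss(pool_loss()) for _ in range(10_000)]
# ...but correlated defaults regularly blow through the junior cushion.
corr = [senior_tranche_loss(pool_loss(correlated=True)) for _ in range(10_000)]

print(f"avg senior-tranche loss, independent defaults: {sum(indep) / len(indep):.3f}")
print(f"avg senior-tranche loss, correlated defaults:  {sum(corr) / len(corr):.3f}")
```

The point of the sketch is that the same pool of junk loans can produce a tranche that looks nearly riskless under one statistical assumption and disastrous under another – which is roughly how Triple-A ratings ended up on bundles of subprime mortgages.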
And why has the Street (along with its right-wing apologists, and its outlets such as Rupert Murdoch’s Wall Street Journal) so viciously attacked the movie “The Big Short”? So we won’t demand tougher laws to prevent another crisis, followed by another “too-big-to-fail” bailout.

Which brings us back to Bernie and Hillary. Hillary Clinton doesn’t want to break up the big banks or resurrect the Glass-Steagall Act, as Bernie does. Instead, she’d charge the big banks a bit more for carrying lots of debt and oversee them more carefully. She’d also give bank regulators more power to break up any particular bank they consider too risky. And she wants more oversight of so-called “shadow banks” such as hedge funds and insurance companies like the infamous AIG.

In a world where the giant Wall Street banks didn’t have huge political power, these measures might be enough. But, if you hadn’t noticed, Wall Street wields extraordinary power. Which helps explain why no Wall Street executive has been indicted for the fraudulent behavior that led up to the 2008 crash. Or for the criminal price-fixing scheme settled last May. And why even the fines imposed on the banks have been only a fraction of the banks’ gains. And also why Dodd-Frank is being watered down into vapidity. For example, the law requires major banks to prepare “living wills” describing how they’d unwind their operations if they get into serious trouble. But no big bank has come up with one that passes muster; federal investigators have found them all “unrealistic.”

Most of Hillary’s proposals could already have been put into effect by the Fed and the Securities and Exchange Commission, but they haven’t been – presumably because of the Street’s muscle. As a practical matter, then, her proposals are invitations to more dilution and finagling. The only way to contain the Street’s excesses is by taking on its economic and political power directly – with reforms so big, bold, and public they can’t be watered down.
Starting with busting up the biggest banks, as Bernie Sanders proposes. More than a century ago, Teddy Roosevelt took on the Standard Oil Trust because it posed a danger to the U.S. economy. Today, Wall Street’s biggest banks pose an even greater danger. They’re far larger than they were before the crash of 2008. Unless they’re broken up and Glass-Steagall is resurrected, we face substantial risk of another near-meltdown – once again threatening the incomes, jobs, savings, and homes of millions of Americans.

To paraphrase the philosopher George Santayana, those who cannot remember they were screwed by Wall Street are condemned to be screwed again.

Published on January 13, 2016 00:00