Helen H. Moore's Blog, page 901
January 6, 2016
This show flips all the scripts: A male prep school student, drugged and raped at a basketball party — and what happens next
The most abstract thing to say about “American Crime,” ABC’s anthology miniseries returning for a second installment tonight, is that the direction is like nothing else on broadcast television. The opening moments of tonight’s premiere are just a scene of several teenagers either participating in or observing basketball practice, but it’s inexplicably, lyrically tense, as the camera first gets close to the sweating basketball players, and then hangs back to steal glances at the kids on the bleachers. Creator John Ridley—who produced “12 Years a Slave”—creates a sense of unchecked hormones, the pressure to succeed, and a feeling of foreboding from the visual landscape of the scene alone. As the season unfolded—ABC sent the first four episodes to critics—I found myself returning to that scene, trying to dissect its ambiguous affect now that the show had doled out more information. “American Crime” offers up that scene the way “Breaking Bad’s” second season offered up the floating teddy bear in the pool—not a flash-forward, but a sign of things to come, and one that gains more meaning as the weeks wear on.

Very little else of the show is abstract. “American Crime’s” second season is immediately more successful than its first, partly for this reason. It carries a stronger sense of artful engagement with the viewer, through both direction and tone; where the first season of “American Crime” unfolded around a grim crime of murder and brutal violence, the second season is about a crime where, so far at least, everyone involved is still alive. And maybe because most viewers who have seen “American Crime” will have already considered season one’s themes of police discrimination, racial profiling and cycles of addiction in the now-classic HBO series “The Wire,” season two offers a newer, different topic: campus rape. And though the scandals at both Steubenville and St. 
Paul’s are drawn from in the show’s narrative, there is one significant difference: The victim is a teenage boy. This detail is immediately polarizing and destabilizing, both within and without the show. It clashes with a deeply held and long-running set of beliefs about what it means to be a man, or to be a teenage boy; it inherently challenges the assumptions that tend to be held around rape, especially among young people in high school. And because it’s exploring victimhood of various shades, it skews a little preachy—but not necessarily in a bad way. If “American Crime” has any one lesson, it is about how densely complex the world we live in is, once you bring in the threads of bureaucracies, institutions, families, and politics—and how difficult it is, in that web, to make ideas stick to the events of the everyday. There is not a frame of “American Crime” that is not conscious of race, class, gender or sexuality; the series is peppered with moments that exist only to illustrate the comparative weight of different identity markers. That sounds very heavy. And in the first season—especially when coupled with the murder investigation being portrayed—“American Crime” was a depressing show to watch. In this season, it’s less depressing than engaging, because on top of a genuinely gripping mystery to unravel, there’s at least one teenage boy worth saving, too.

In a show with a lot of acting heavyweights, Connor Jessup, who plays the rape victim Taylor, is “American Crime’s” emotional center, managing to elicit a lot of empathy without saying much at all. We first meet him speaking to his guidance counselor, and the way Ridley frames him, his counselor’s words might as well be the honking sound from “Peanuts” cartoons—Taylor isn’t listening. He’s elsewhere, drifting, and these big ideas about college and the future mean nothing to him. The trouble starts, sort of, when Taylor is suspended from school. 
Pictures of him drunk at a basketball party are circulating on the phones and Facebook profiles of his fellow students at Leyland High, a private high school in Indiana. Leyland invokes its code of conduct. But Taylor, distraught, confesses to his mother, Anne (Lili Taylor), that he was drugged, and worse, that someone did something to him. It’s enough to send Anne into her own tailspin, as she tries to get someone to care about the fact that her son was victimized. She immediately runs into the school’s dual bureaucracies of moneyed administration and star sports team. Anne’s a single mother working as the manager of a restaurant; Taylor is at Leyland on financial aid and has never managed to fit in. Much of the drive of the first episodes comes from how Anne tries to get anyone to care about Taylor’s situation—including Taylor himself, who just wants it to go away—in the face of these institutional obstacles.

Lili Taylor is one of several actors “American Crime” brings back for the second installment, reimagined as different characters (à la “American Horror Story”). In the first four episodes, it’s the female actors who stand out: Anne, the mother of the raped boy, is torn between trying to do the right thing and feeling like her life is spinning out of control. Regina King (who won an Emmy for her role in the show’s first season) plays a high-achieving power mom to her basketball-playing son, one acutely aware of the racial dynamics around black teen boys and very keen not to reproduce them for her child. And Felicity Huffman plays Leyland’s ambitious principal in the middle of a multimillion-dollar capital campaign—not quite as blatantly intolerant as her first-season character, but carrying a seething edge of winning at all costs as her school is embroiled in scandal. It’s still not exactly an easy watch, but it’s a far more engrossing one than season one was. 
Leyland is an immediately recognizable institution, and the relationship between the school and its sports team looks a bit like “Friday Night Lights” through a darker lens. Ridley’s camerawork observes the nail polish and lip gloss of teenage girls making eyes at boys, and the spheres of difference between a kid on financial aid and a kid with his own credit card. And from a purely narrative perspective, the twists of the story are remarkable. To be sure, it’s a bit convenient, introducing themes of teenage queer sexuality, shades of consent, underage drinking and class issues all in one unaccounted-for hour at a party—but given the real-life stories of campus rape cases at high schools and colleges throughout the country, it’s hardly that contrived. Above all, it feels relevant, in a meditative way that belies the immediacy of the material. Rape and consent have become endlessly discussed topics, but are far from “solved”; and the numerous complications of 400-odd students experiencing puberty simultaneously are essentially universal, no matter the era or the circumstances. Add to that the confusing and ever-shifting notion of whatever American masculinity is supposed to be—somewhere between high school locker rooms, texted nudes of other men’s girlfriends, and driving really fast in cars—and the second season of “American Crime” is already starting off on an engaging, mysterious and tragic note.







Published on January 06, 2016 15:15
Now you can worry: Trump’s in it to win it, so it’s time to drop the fantasy that his campaign is all for show
For most of the second half of 2015, whether Donald Trump’s presidential campaign was “serious” (definition: unclear) was arguably the question of American politics. It was asked after his launch speech, which was rambling, incoherent, demagogic, and way too long; it was asked after he criticized Sen. John McCain for being captured by the North Vietnamese; it was asked after he proposed “rounding up” and deporting some 11 million undocumented immigrants; it was asked after he attacked Fox News darling Megyn Kelly; it was asked after he proposed banning Muslims from entering the U.S.; it was asked again, and again, and again. Slowly but surely, more and more commentators began to realize that the answer was yes. Donald Trump’s presidential campaign was “serious” — at least insofar as “serious” was equivalent to “capable of winning.” The national polls showed it, the state-level polls showed it, the issue-based polling showed it. By the holiday season, elite endorsements, pricey ad buys, and an expensive big data/GOTV apparatus were the only usual components of a “serious” campaign that Trump 2016 lacked.

Those weren’t insignificant criteria, mind you. Barack Obama’s two presidential election campaigns showed, without question, that voter analytics and GOTV could be difference-makers. And as political scientists tell us, it’s often the case that the candidate with the most elite endorsements is the candidate who ultimately wins. So if you were still inclined to dismiss Trump — like Vox’s Ezra Klein, for example — you had some ground to stand on, even if it was shrinking a little more every day. But that was then and this is now. And while it’s still early, there have already been two major developments within Trump’s campaign that, taken together, suggest the GOP poll leader is about to cross the last two emblems of a campaign’s “seriousness” off of his list. You have probably heard about the first one: the release of Trump’s first bona fide campaign commercial. 
It wasn’t the first video the Trump campaign put out — it spent much of 2015 getting attention with cheeky Instagram clips — but it was the campaign’s first television ad. Even more importantly, Trump announced he would actually spend millions of dollars to ensure voters in Iowa and New Hampshire saw it. ($2 million per week, if Trump is to be believed, which he isn’t.) The second and far more important development, meanwhile, was made public by a recent Politico report, which claimed that Trump was not only spending money on a big data/GOTV apparatus — but that he’d been doing so for months already. The details about the program were admittedly fuzzy; none of the major operators involved agreed to speak on the record. But as Politico rightly noted in its report, the implications of Trump’s investment were clear:

[T]he very existence of the Trump data program undermines the assumption that his campaign is uninterested in ― or unaware of ― the basic technological infrastructure needed to identify and mobilize voters. Such tools, used so effectively by Barack Obama during his two presidential campaigns, could be especially critical for Trump as he seeks to increase turnout among new or untraditional GOP voters.

So what would you call a campaign that is leading the national polls, the state-level polls, the issue-based polls; that is flush with cash; that is a rallying point for an activist (and activated) die-hard base; and that is spending real money on ads, data, and getting out the vote? I’d call that a “serious” campaign, personally. A serious one, indeed.








Published on January 06, 2016 14:55
Legalization works!: Colorado legalized marijuana two years ago and the sky didn’t fall, after all








Published on January 06, 2016 14:53
Pretty girls get better grades: Attractiveness does affect women’s scores — but not men’s
I pulled many all-nighters in college and graduate school. Long hours with little to no sleep became my lifestyle, and I’d wake up each morning, do my best to look presentable, and do it all over again. I was an academic overachiever, and would rather lose sleep than lose face when it came to my grades. I did my best to make it look easy, turning in seminar papers in sundresses then strolling away as if I hadn’t spent the last three days obsessing over sentence structure and research. I graduated with honors, and was proud of my academic success.

But a new study sheds light on an ugly truth when it comes to women’s academic performance. According to research presented at the American Economic Association’s annual meeting yesterday, attractive women get better grades not due to higher intelligence, but professor bias. If men aren’t dating us because we’re too smart for our own good, then we’re making the grade for expertly applying mascara. The study’s researchers used a sample of more than 5,000 students and more than 100,000 grades and found that physical attractiveness affects the grades of women, not men. Economic researchers at Metropolitan State University of Denver collected student ID photos and had study participants rate the students’ attractiveness on a scale of one to ten. The research team then examined measures of academic achievement such as final course grades and ACT scores. To account for each rater’s baseline bias, the researchers measured it before showing the photos and later subtracted it from the ratings assigned to each participant. As it turns out, an increase of one standard deviation in attractiveness for female students was found to correspond with a 0.024 grade increase on a 4.0 scale. 
Researchers later divided the female students into three groups — average, more attractive, less attractive — and found the women in the less attractive group received average course grades 0.067 points lower than those earned by the other two groups. Rey Hernandez-Julian, one of the lead researchers, told Inside Higher Ed it could be that “professors invest more time and energy into the better-looking students, helping them learn more and earn the higher grades” or perhaps educators “simply reward the appearance with higher grades given identical performance.” The study found the gender of the rater or professor didn’t matter: both men and women tended to give attractive females higher grades. The disparity did not exist for men, or for students taking online classes.

It’s disheartening to think about. I have attractive female friends who worked just as hard (if not harder) than I did in school and who were equally proud of their academic accomplishments. My friend Kristin, for example, stops hearts — literally. A bombshell blonde, Kristin studied cardiovascular sciences at Oxford University, and we’ve had many conversations about pursuing advanced degrees and the discrimination that stems from being a woman in traditionally male-dominated fields of study. Oftentimes women are discriminated against for 1) being female and 2) being somewhat attractive. When will we stop judging women’s intelligence based on their looks? The study results can be damaging, too — my bone structure didn’t help me learn Latin conjugations any more in college than my boobs helped me master syntax in graduate school. None of my professors ever seemed to favor or disfavor me based on my looks, but to even question it cheapens the way I value my career as a scholar. Whatever. At the end of the day I guess there are worse things a woman can be than cute and cum laude.







Published on January 06, 2016 14:41
Chemistry geeks, rejoice! 4 new elements were just added to the periodic table
Published on January 06, 2016 00:45
Robert Reich: We’re teetering on the brink of recession
Economic forecasters exist to make astrologers look good, but I’ll hazard a guess. I expect the U.S. economy to sputter in 2016. That’s because the economy faces a deep structural problem: not enough demand for all the goods and services it’s capable of producing.

American consumers account for almost 70 percent of economic activity, but they won’t have enough purchasing power in 2016 to keep the economy going on more than two cylinders. Blame widening inequality. Consider: The median wage is 4 percent below what it was in 2000, adjusted for inflation. The median wage of young people, even those with college degrees, is also dropping, adjusted for inflation. That means a continued slowdown in the rate of family formation—more young people living at home and deferring marriage and children—and less demand for goods and services. At the same time, the labor participation rate—the percentage of working-age Americans who are working or looking for work—remains near a 40-year low. The giant boomer generation won’t and can’t take up the slack. Boomers haven’t saved nearly enough for retirement, so they’re being forced to cut back expenditures.

Exports won’t make up for this deficiency in demand. To the contrary, Europe remains in or close to recession, China’s growth is slowing dramatically, Japan is still on its back, and most developing countries are in the doldrums. Business investment won’t save the day, either. Without enough customers, businesses won’t step up investment. Add in uncertainties about the future—including who will become president, the makeup of the next Congress, the Middle East, and even the possibility of domestic terrorism—and I wouldn’t be surprised if business investment declined in 2016. I’d feel more optimistic if I thought government was ready to spring into action to stimulate demand, but the opposite is true. The Federal Reserve has started to raise interest rates—spooked by an inflationary ghost that shows no sign of appearing. 
And Congress, notwithstanding its end-of-year tax-cutting binge, is still in thrall to austerity economics. Chances are, therefore, the next president will inherit an economy teetering on the edge of recession.
To the contrary, Europe remains in or close to recession, China’s growth is slowing dramatically, Japan is still on its back, and most developing countries are in the doldrums. Business investment won’t save the day, either. Without enough customers, businesses won’t step up investment. Add in uncertainties about the future—including who will become president, the makeup of the next Congress, the Middle East, and even the possibilities of domestic terrorism—and I wouldn’t be surprised if business investment declined in 2016. I’d feel more optimistic if I thought government was ready to spring into action to stimulate demand, but the opposite is true. The Federal Reserve has started to raise interest rates—spooked by an inflationary ghost that shows no sign of appearing. And Congress, notwithstanding its end-of-year tax-cutting binge, is still in the thralls of austerity economics. Chances are, therefore, the next president will inherit an economy teetering on the edge of recession.Economic forecasters exist to make astrologers look good, but I’ll hazard a guess. I expect the U.S. economy to sputter in 2016. That’s because the economy faces a deep structural problem: not enough demand for all the goods and services it’s capable of producing. American consumers account for almost 70 percent of economic activity, but they won’t have enough purchasing power in 2016 to keep the economy going on more than two cylinders. Blame widening inequality. Consider: The median wage is 4 percent below what it was in 2000, adjusted for inflation. The median wage of young people, even those with college degrees, is also dropping, adjusted for inflation. That means a continued slowdown in the rate of family formation—more young people living at home and deferring marriage and children – and less demand for goods and services. At the same time, the labor participation rate—the percentage of Americans of working age who have jobs—remains near a 40-year low. 
The giant boomer generation won’t and can’t take up the slack. Boomers haven’t saved nearly enough for retirement, so they’re being forced to cut back expenditures. Exports won’t make up for this deficiency in demand. To the contrary, Europe remains in or close to recession, China’s growth is slowing dramatically, Japan is still on its back, and most developing countries are in the doldrums. Business investment won’t save the day, either. Without enough customers, businesses won’t step up investment. Add in uncertainties about the future—including who will become president, the makeup of the next Congress, the Middle East, and even the possibilities of domestic terrorism—and I wouldn’t be surprised if business investment declined in 2016. I’d feel more optimistic if I thought government was ready to spring into action to stimulate demand, but the opposite is true. The Federal Reserve has started to raise interest rates—spooked by an inflationary ghost that shows no sign of appearing. And Congress, notwithstanding its end-of-year tax-cutting binge, is still in the thralls of austerity economics. Chances are, therefore, the next president will inherit an economy teetering on the edge of recession.







Published on January 06, 2016 00:30
We have a race on our hands: Bernie breaks fundraising records ahead of the primaries









Published on January 06, 2016 00:15
12 things you need to know about Trump’s bullet-bedecked spokeswoman Katrina Pierson








Published on January 06, 2016 00:00
January 5, 2016
What’s missing from “Making a Murderer”: How the riveting documentary’s flaws actually fuel its popularity
Over the course of the last two weeks, “Making a Murderer,” Netflix’s original docuseries about the incarceration of 53-year-old Wisconsin resident Steven Avery, has become something of an underground hit. Netflix released the series on Dec. 18, teeing it up for holiday binge-watching, and the gambit paid off: starting yesterday, the first full week of 2016 has been peppered with speculation, appreciation and discussion of the docuseries and the real-life case it is based on. When I first wrote about the series last month, based on watching just the first two episodes, I misunderstood its premise; “Making a Murderer” is about the particular fallibility of the court system, not incarceration. Specifically, Steven Avery’s life is dominated by three court cases—two in which he is the defendant and one in which his nephew Brendan Dassey is the defendant—and the docuseries spends several hours focusing on the arguments and processes of just one of those cases, the 2006 trial of Avery for the rape, mutilation and murder of photographer Teresa Halbach. “Making a Murderer” reveals the trial to be a breathtaking display of police and district-attorney misconduct, as Manitowoc County law enforcement goes well out of its way to pin the crime on Avery despite any and all evidence to the contrary. That includes planted evidence, tainted evidence, coerced confessions, memory manipulation and, when that fails, outright lying. “Making a Murderer” excels in uncovering the frustrating, warped details. The most riveting moments of the docuseries are in what should, by all rights, be nothing but dry courtroom back-and-forth; “Making a Murderer” makes Avery’s defense lawyers, Dean Strang and Jerry Buting, into careful, brilliant, avenging heroes as they steadily chip away at piece after piece of the prosecution’s flimsy case. 
The listening and viewing public have shown quite a bit of interest in true-crime stories that inspire some audience-led legwork, and “Making a Murderer” is no exception. “Serial’s” success was followed closely by “The Jinx,” and though both are very different forays into true crime, the element of real-life investigation made both riveting. Netflix’s “Making a Murderer” is one of a few documentaries and docuseries trying to capitalize on the fervor of audience fact-finding and truth-seeking; Discovery’s “The Killing Fields,” debuting tonight, is another, and while Ryan Murphy’s upcoming “American Crime Story” is a scripted series, it aims for the same sweet spot — the first season will be “The People v. O.J. Simpson.” Reddit has become a hotbed of amateur investigation into the Avery case, and Change.org petitions are circulating asking for pardons for Avery and his nephew Brendan Dassey, based on the narrative of the docuseries. And yet, I think “Making a Murderer” does not quite hold up to the same pressure that “Serial” and “The Jinx” were eventually subject to. “Serial” was created by the same team behind long-running public radio standby “This American Life,” and was hosted by journalist Sarah Koenig. A significant feature of the podcast is that Koenig’s narrative style includes self-questioning and reflection, tempered by humor (“Sometimes, I think Dana isn’t listening to me”). Plus, in the latter half of season one, Koenig directly addresses issues that the audience brings to her attention, like the racial component of the friendship between Adnan Syed and Jay Wilds. It’s not perfect, and “Serial” was the subject of much discussion, but it was difficult to fault Koenig’s dedicated investigation, even if you did not agree with her conclusions. 
“The Jinx,” meanwhile, was almost the opposite; it was a highly personal and particularly artful docuseries—one that very well might not have existed in the form it did if the producers hadn’t stumbled into Durst’s ramblings on a live microphone after his final interview with Andrew Jarecki. What it has in common with “Serial” is that strong central narrator, in the form of Jarecki, as primary interrogator; Jarecki’s conversations with Durst are the equivalent of Koenig’s phone calls with Syed—intimate, unstable and rife with dramatic tension. One gets the sense in both “Serial” and “The Jinx” that the truth led the fact-finders in unexpected directions—whether that is the envelope reading “Beverley Hills” or Syed telling Koenig, “I mean, you don’t really know me.” “Making a Murderer” lacks that sense of central storytelling, and as a result is neither quite as artful nor quite as comprehensively investigative as either “The Jinx” or “Serial.” Instead, what comes through in the Netflix series is agenda; the series took 10 years to make, as filmmakers Laura Ricciardi and Moira Demos were essentially embedded with the Avery clan from the 2006 trials onward. Yet despite this long view, the docuseries struggles to frame its chilling details into a strong sense of the big picture. For example, Manitowoc County’s methods are clear, but its motives are vague at best; there are a few garbled threads about a vendetta against the Averys and digging in their heels about paying Steven Avery $36 million in restitution, but it never quite resolves beyond making the authorities cartoonishly evil. The 10 years spent working on the documentary do not appear to have yielded the context of a decade, or further investigation of the case; if Manitowoc County assumed, ridiculously, that Avery and Dassey were guilty right from the start, “Making a Murderer” sees no reason to look past their innocence. 
Which is odd, because both filmmakers profess not to be sure about what really happened to Teresa Halbach. In an interview with NPR, Ricciardi and Demos describe their process and experience with nuance and passion, and it’s regrettable that the nuance, at least, does not make it through to “Making a Murderer.” The filmmakers spent two years living in Wisconsin and working with the Averys, and describe the film as an attempt to reveal not the truth of the case but the flaws in the process of the justice system. Personally, I find it difficult to read the miniseries as being about anything except exculpating Avery and Dassey—the final episode ends with Avery narrating how he is going to continue fighting for his freedom and Dassey reading an open letter protesting his innocence. And yet Pajiba, via Reddit, has already uncovered a whole trove of evidence that, if even partly corroborated, casts quite a bit of doubt on the “Making a Murderer” narrative of the events of Oct. 31, 2005. What really matters, probably, is whether or not the court process is reliable, but “Making a Murderer” focuses elsewhere, spending a great deal of time interviewing family members and playing up both Avery’s and Dassey’s gentle-heartedness. It ultimately weakens the documentary’s reliability, even if, in the short term, it has brought more signatories to the Change.org petitions. And yet it’s entirely possible that the holes in “Making a Murderer’s” narrative contribute to even more avid truth-seeking from the true-crime aficionados following the story. The latter half of “Serial” was far less interesting than the first half, after all, when all of the questions were still left so tantalizingly open. 
The fact that puzzle pieces don’t quite fit together is an immediate kind of dramatic tension for the viewer; more than anything else, it suggests a future where the puzzle pieces could fit together perfectly, even though further investigation in nearly every true-crime story produces not clarity but, instead, endless complexity. More practically—and perhaps more to the point—the plot holes might be paving the way for a season two; Netflix’s chief content officer has said he sees not the first episode but the whole first season as a pilot, of sorts. With its odd hiccups in pacing, missing pieces and episode-long focus on a blood vial that goes nowhere (I swear, I am still so confused by the blood vial), what “Making a Murderer” feels like to me, above all, is unfinished. A second season might lock this fact-finding mission into place—revealing the personality and comprehensiveness that its current season, to my mind, so tantalizingly lacks—through the simple mechanism of getting enough people interested in the case that Wisconsin is forced to reopen it.







Published on January 05, 2016 16:00
The NRA is backing terrorism: Wingnuts and “patriots” are the real safety threat we should fear
It’s impossible to engage publicly in our country’s debate about gun safety legislation without being drowned out by the “patriots” screaming about their “freedom.” After I published an essay on Salon last fall imploring legislators to stand up to the gun lobby and pass meaningful gun safety laws, helpful Internet commenters were quick to remind me that my fears did not trump their freedom. I am a teacher, and I wrote the essay following the school shooting at Umpqua Community College, after I learned that my own college instructed teachers to fight back against active shooters with staplers if necessary. My fears, then, were fears for my life, for my students’ lives, and for the lives of my children. Those lives, to the many vocal opponents of gun safety (who certainly do not represent all gun owners in this country), are less important than their easy and unrestricted access to guns. Today, as President Obama announces executive actions that would, among other things, extend background check requirements to private gun sellers, the shouts from those who claim to stand for freedom and liberty will grow louder. They will say, as the president acknowledged in his address, that gun safety laws don’t even work to prevent crime. It’s hard to debate the point, given the NRA’s success in blocking public safety research into guns. Before that research ban was passed in 1996, however, the CDC concluded that family members in households with guns were three times as likely to be shot as those in homes without guns. 
More recently, a Stanford University study found that right-to-carry laws were linked to an increase in violent crime; and researchers from the University of California, Davis, and the University of Newcastle in Australia, using data from the 2013 FBI crime report and gun control laws by state, concluded that homicide rates in cities within states that require a permit to buy a gun are lower than those in cities within states that do not require permits. And the National Journal, citing data from the CDC, the Law Center to Prevent Gun Violence, and the NRA-ILA, reported that states with more restrictive gun laws have the fewest gun-related deaths. This research—which is to say, these facts—doesn’t matter to the gun extremists who argue for their unassailable right to own guns. After they claim that gun legislation is ineffective, they will shriek about the Constitution, failing to see the illogic in the argument that an amendment is something that must never be changed. Writing in Psychology Today in the wake of the Sandy Hook massacre, David Ropeik explains that the extremists who speak of clutching their guns with their cold, dead hands are “fighting for the right to own a gun [as] a way of asserting control against a society that many feel is encroaching on their values and freedoms.” To such extremists, the right to bear arms is not one of many constitutional rights. It is the most important right, through which all other rights must be defended. Simply put, the right to own a gun is the right to stand up against the government. For example, heavy artillery has made it possible this week for terrorists to occupy a federal building in Oregon, to stand against the perceived tyranny of having to pay (at a rate 93 percent lower than the average market rate) for the use of federal lands for grazing. Here these men are, exercising their “freedoms” behind the barrels of their guns. 
All of their talk of tyranny makes it very clear then what the debate over guns in this country is really about: When gun extremists resist the commonsense legislation that would curb gun violence, they are fighting for their “freedom” to break the law. (It’s a freedom, it’s worth noting, only available to whites in this country, as people of color—children of color—do not have the freedom to hold even toy guns without being shot by police.) Claiming to stand against tyranny, gun extremists—backed by the NRA and the rest of the gun lobby—are the real tyrants, holding Americans hostage, putting our lives in danger. Public opinion on gun safety legislation is changing, and President Obama’s executive actions reflect the beliefs of the majority of Americans. The opposition will grow louder as we continue to enact safe and sane gun policies at the state and federal levels; as they scream about their freedom, we’d all do well to keep in mind what they are really fighting for. When faced again, as I undoubtedly will be, with the onslaught of comments from gun extremists, I will tell them this time that they are right: Fear should not trump freedom. That is, their irrational fear of a tyrannical government out to get them should not outweigh our freedom to live without being shot in our schools, movie theaters, shopping malls, cars and homes.







Published on January 05, 2016 15:59