Helen H. Moore’s Blog
November 29, 2015
Katniss is a hero for boys, too: My 11-year-old sons need more books and films like “The Hunger Games” series
On the screen, Katniss Everdeen and Peeta Mellark are kissing. I’m watching with my 11-year-old triplet sons and there’s some squirming in the room. Leo says, “This is Ace’s favorite bit.” Ace snorts, “No, it’s not. My favorite part is where everybody dies.” Luke laughs. We’re watching a scene from “Catching Fire” on TV because they’ve been bugging me to show them a piece of the “Hunger Games” film franchise, even though their mom and I agree they’re too young to go with me to the premiere of “The Hunger Games: Mockingjay Part 2” later that day. We did let our kids read the books, which they devoured, sometimes getting into arguments over who had dibs on “Catching Fire” or “Mockingjay.” After that, the archery set that had been in the garage for months re-emerged and arrows started flying through our backyard.

There’s been plenty of ink and pixels devoted to Katniss’ influence on girls, book sales, the sport of archery and even Jennifer Lawrence herself. I think it’s time to talk about her value to another demographic – boys.

First, I claim no objectivity about “The Hunger Games” saga. It had me at “This is the day of the reaping.” Hell, these books reaped me. I’m a 44-year-old man whose normal reading pace is a page or two per night before bed. (People worry that George R.R. Martin might not live long enough to finish “A Song of Ice and Fire” – I’m not sure I will, either.) I read all 1,174 pages of Suzanne Collins’ trilogy in about four days. I did no work. Minimal chores. When my wife wasn’t home, the triplets lived their own version of the Hunger Games in our kitchen because I’d stopped preparing food. “The Hunger Games” books thrilled, fascinated, worried and wounded me; they left me exhausted, dazed and grateful.

I wanted my sons to read them in part to share the experience, but also because I feel like maybe they needed it. Unlike too many boys, these guys read. Outside of Japanese manga and a book written by their own grandmother, though, they tend to read stories about other members of their own demographic. That’s easy to do. According to one study of children’s literature, male characters are featured “nearly twice as often in titles and 1.6 times as often as central characters.” (Among feature films popular with kids, the ratio is even worse.) In addition, my sons (like me) have no sisters, and I’m not aware of any female friends with whom they seem terribly close. I don’t want the interior life of girls to be as entirely foreign to them as it once felt to me.

Annette Wannamaker teaches young adult literature at Eastern Michigan University and writes about boys and media. “‘The Hunger Games’ is a great text for boys and girls to read because Katniss is a non-stereotypical girl and because Peeta is a non-stereotypical boy,” she told me. “Peeta is heroic as well, but in a way that is not stereotypically masculine: He is brave and strong, but he is also nurturing.” She added that the idea that boys won’t connect with a story about a girl is just wrong. “There’s a myth that books need to have characters young readers can ‘identify with,’ but the exact opposite is true: literature gives us a chance to view the world through another person’s eyes, to consider a perspective that is different from our own. This is vital because it teaches empathy.”

Children’s author Shannon Hale wrote in her blog, “…the idea that girls should read about and understand boys but that boys don’t have to read about girls, that boys aren’t expected to understand and empathize with the female population of the world… this belief directly leads to rape culture. To a culture that tells boys and men, it doesn’t matter how the girl feels, what she wants.” Hale describes author visits in which schools invited only girls to her assemblies. In one case, “I did the presentation. But I felt sick to my stomach. Later I asked what other authors had visited. They’d had a male writer. For his assembly, both boys and girls had been invited.”

I appear to be the only unaccompanied man who’s come to this particular theater for the debut of “Mockingjay: Part 2” here in Jennifer Lawrence’s hometown of Louisville. The house is about three-quarters full (which I guess portends the supposedly disappointing opening weekend draw of “only” $247 million worldwide). I’m here in part to complete my own “Hunger Games” experience and also to find some male fans who don’t live in my house.

The movie itself is satisfying, riveting and tense, though many scenes feel relentless, grim and gray. Still, it hits all the marks I need it to, particularly at the very end, when the girl from District 12 becomes her own woman, not exactly at peace, but no longer on fire. In between set pieces, I watch with this question in mind: How would these scenes play if (à la “Twilight Reimagined”) Katniss were a boy and Peeta and Gale were girls? Peeta’s emotional intelligence would feel familiar; Gale’s ruthlessness as a military tactician would be more interesting. But Katniss’ interactions with either of them would somehow look much creepier – manipulative, even callous. Imagine a male Katniss threatening to kill a deranged female Peeta, saying, “I’ll put (her) down like a Capitol mutt.”

In the lobby, I approach a few teenagers who appear to be either leaving the movie or in line for the next show. Fifteen-year-old Greg doesn’t like the concept of a male-centered “Hunger Games.” “It’d be boring. It’d be just like every other movie.” Good point. I ask 14-year-old Brycen what he likes about Katniss. “She’s hot.” OK, thank you.

Anthony is a 19-year-old university student who says his favorite “Hunger Games” character is Haymitch. Now, Haymitch kicks ass – both on the page and as rendered by Woody Harrelson – but this is a pattern among the handful of male fans I’ve unscientifically surveyed: None of them claim Katniss as their favorite series character. (My kids’ favorite: Finnick.) Anthony likes the action sequences more than the romance, but says, “Katniss is a cool character. She’s strong, independent.” He thinks it’s good that more women – real and fictional – are reaching prominence and asserting themselves. “Like Ronda Rousey,” he says. Anthony and I – and lots of guys, really – understand that there’s value for girls in the stories of characters like Katniss, but only in a very abstract way.

Hayley, a 21-year-old psych major, spells it out for me between screenings: “Katniss is important for us because she’s a healthy, strong woman who’s vulnerable, but her vulnerability isn’t a flaw. And the truth is it’s hard for girls to be strong. It’s hard for girls to be powerful. Katniss isn’t told she’s bossy. She’s not talked over. And boys need to understand that’s not how it is in real life.”

I recount this exchange to my wife – my authority on all things feminine – and say something like, “Does that really happen, even now? Getting talked over by men?” Gabrielle’s silent response combines affection with exasperation and makes it clear that she has forgiven me many times for committing the very sin whose existence I’ve just questioned. Hmm. If I did that, it wasn’t on purpose. So why would I? It feels like at some point, boys – including me – pick up this notion that when it comes to male-female interaction, strength, power and status are often a zero-sum proposition. This leads us to interpret displays of ability or authority in women as signs of weakness in ourselves – gender relations as its own kind of Hunger Games.

Earlier this month, @MTVceleb put out this call on Twitter: “Are you a female #HungerGames fan? Send us your story of just why Katniss inspires you…” MTV.com (whose target audience is teen girls) compiled the responses, some of which were quite touching. But some fans, seemingly both male and female, took offense at the solicitation, offering comments like:

“sexist”
“why can't Katniss inspire men?”
“…Katniss not only resonates with girls but boys as well.”

I reached out to one of those male fans – @muttmellark, an 18-year-old from Ireland. He tweeted, “She shows that a someone picked from a crowd can have an hugely important role in society regardless of where they come from.”

With her final movie completed, Katniss Everdeen is history now, ascending from Panem to the pantheon of fiction’s great heroines, along with – who, exactly? Artemis? Buffy? Nancy Drew? Although, really, why segregate humanity’s heroic hall of fame? I think Katniss has proven she can stand alongside anyone from Odysseus to Harry Potter.

Every Christmas, my sons expect books in their stockings. This year, they might find something like “Howl’s Moving Castle” by Diana Wynne Jones or “Dragonflight” by Anne McCaffrey. I won’t make a big deal out of it, won’t say, “Even though it’s about a girl…” or “Since you liked Katniss, maybe you’ll like this, too.” My hope is that reading stories like these will help my sons understand that women can be strong without making men feel weak. As Katniss and Peeta discover in the climactic scene from “The Hunger Games,” even if the odds aren’t in your favor, it should be possible for both boy and girl to find a way to win.






Published on November 29, 2015 11:00
Everyone loves an a**hole: Science explains our attraction to creeps and deviants








Published on November 29, 2015 10:00
What the hell’s wrong with us? Autism, vaccines and why some people believe Jenny McCarthy over every doctor
Stephanie Messenger is an Australian author of self-published educational books for children, such as Don’t Bully Billy and Sarah Visits a Naturopath. In 2012, she published a book that, according to promotional materials, “takes children on a journey to learn about the ineffectiveness of vaccinations and to know they don’t have to be scared of childhood illnesses, like measles and chicken pox.” The blurb on the back of the book talks about how nowadays we’re bombarded with messages urging us to fear diseases, from people who have “vested interests” in selling “some potion or vaccine.” Messenger called the book Melanie’s Marvelous Measles. Perhaps she drew inspiration from George’s Marvellous Medicine, by the beloved British children’s author Roald Dahl. Which would be ironic, given Dahl’s own feelings about measles, which he wrote about in 1986.

Olivia, my eldest daughter, caught measles when she was seven years old. As the illness took its usual course I can remember reading to her often in bed and not feeling particularly alarmed about it. Then one morning, when she was well on the road to recovery, I was sitting on her bed showing her how to fashion little animals out of coloured pipe-cleaners, and when it came to her turn to make one herself, I noticed that her fingers and her mind were not working together and she couldn’t do anything. “Are you feeling alright?” I asked her. “I feel all sleepy,” she said. In an hour she was unconscious. In twelve hours she was dead. The measles had turned into a terrible thing called measles encephalitis and there was nothing the doctors could do to save her.

In 1962, when the measles took Olivia’s life, there was no vaccine. Practically everyone caught the measles at some point in childhood. Most recovered without any lasting damage, but it killed around a hundred children in the United Kingdom and more than four hundred in America every year, and put tens of thousands more in the hospital, leaving some blind or brain-damaged. When a vaccine was licensed in the United States a year later, in 1963, the number of people who caught measles plummeted by 98 percent. “In my opinion,” Dahl concluded, “parents who now refuse to have their children immunized are putting the lives of those children at risk.”

Fortunately, we now have a vaccine that protects not only against measles, but against mumps and rubella as well: the combination MMR shot. The World Health Organization estimates that between the years 2000 and 2013, measles vaccination saved more than fifteen million lives around the world. Unfortunately, since the late 1990s, MMR has been the focus of intense debate and fear, often with conspiratorial undertones.

*

The trouble with MMR started in the United Kingdom. When the vaccine was introduced there, in 1988, it was an immediate success. In the first year, a million children were vaccinated. For the next ten years, uptake of the vaccine remained above 90 percent. Then, in 1998, a doctor called Andrew Wakefield, along with a team of colleagues, published a study that ignited controversy. In the paper, which was published by the highly respected medical journal The Lancet, Wakefield and colleagues claimed to have found measles virus in the intestines of a handful of autistic children. The paper speculated that the MMR shot may have played a role in causing the children’s autism, but pointed out that the findings were not sufficient to prove the relationship. Regardless, Wakefield took the findings directly to the media. In a press conference held the day before the paper was published – one that many of the paper’s coauthors refused to attend – Wakefield claimed that the danger posed by MMR was so great that the vaccine ought to be immediately withdrawn, and individual measles, mumps, and rubella shots, given a year apart, ought to be used instead. (Wakefield himself, it is worth noting, has never opposed vaccination across the board; in fact, he has maintained that vaccines are an important part of health—just not the combined MMR shot, which he continues to argue is linked to autism.)

Concerned parents are understandably influenced by the media, and there is no better illustration than the panic that followed Wakefield’s alarming announcement. Interest in the story was modest at first.
In 1998, the year of Wakefield’s press conference, a handful of news stories reported his claim, and vaccine uptake began to fall slightly. It wasn’t until 2001 that the story began to take on a life of its own. For several years, the idea that the MMR vaccine causes autism received more coverage in the British media than any other science story. As fear-mongering coverage peaked between 2001 and 2003, uptake of the vaccine dipped to 80 percent. Some parts of the country, particularly parts of London, had drastically lower vaccination rates.

The falling vaccination rates prompted outbreaks of the diseases that the vaccine prevents—particularly, since it’s so highly contagious, measles. The first outbreak was in Dublin in 2000, where vaccination rates were already lower than in the United Kingdom. Almost sixteen hundred cases of measles were reported. More than a hundred children were admitted to hospital with serious complications, and three died. A thirteen-year-old boy died in England in 2006, becoming the first person to die of measles there since 1994. In 2008, measles was declared endemic in the United Kingdom for the first time in fourteen years. In 2012 there were more than two thousand cases of measles in England and Wales—mostly affecting children and teenagers whose parents had declined the MMR vaccine years earlier. In 2013, another outbreak in Wales infected more than a thousand people, hospitalizing eighty-eight, and killing a twenty-five-year-old man.

In 2004 it emerged that the entire MMR-autism debate was built on a lie. Investigative journalist Brian Deer uncovered evidence that, before beginning his research, Wakefield had been involved in a patent application for an allegedly safer alternative to the combined MMR vaccine. He had also received a payment in the region of half a million pounds from a personal-injury law firm to conduct the research, and the same law firm had referred parents who believed their children to be vaccine-damaged to Wakefield so he could use the children in his research. But failing to declare a conflict of interest was the least of Wakefield’s wrongdoing. Deer discovered that the study, which involved conducting invasive medical procedures on developmentally challenged children, had not been granted ethical approval. Finally, it emerged that Wakefield may have fudged elements of the children’s medical histories to fit his MMR-autism theory, and a co-worker suggested that Wakefield had knowingly reported incorrect test results. Ultimately the paper was retracted, and Wakefield’s license to practice medicine in the United Kingdom was withdrawn.

That all looks pretty bad, I think it’s fair to say, but we shouldn’t necessarily dismiss the hypothesis that MMR somehow causes autism on the basis of Wakefield’s behavior alone. Since his paper was published, dozens of independent, large, well-conducted studies, involving hundreds of thousands of children across several continents, have found no association whatsoever between the MMR vaccine and autism. As Paul Offit, a pediatrician and immunologist, has pointed out, we still don’t know for sure exactly what causes autism, but by now we can say with considerable certainty that vaccines can be crossed off the list of suspects. Despite Wakefield’s study being utterly discredited, and despite the weight of evidence against his claims, concerns about MMR continue to linger, in Britain and elsewhere.
It didn’t take long for the panic over MMR to cross the Atlantic, where the anti-vaccination cause was taken up by celebrities such as Jenny McCarthy and her then-boyfriend Jim Carrey. Along the way, the claims mutated and merged with other fears. A particular concern in the United States was the presence in various vaccines of a mercury-based preservative, thimerosal, which was held by some anti-vaccine activists to be responsible for the increasing prevalence of autism. (Studies have shown this claim to be mistaken, too.) For many concerned parents, the controversy has thrown suspicion on the entire vaccine schedule. According to a 2009 survey, more than one in ten American parents have refused at least one recommended vaccine for their child, and twice as many parents choose to delay certain shots, leaving their child unprotected for longer.

Wakefield remains a polarizing figure, a hero to some and a dangerous quack to others. A recent article, written in the wake of a measles outbreak that began at Disneyland in California in December 2014, described Wakefield as the “father of the anti-vaccine movement.” Yet unfounded fears about vaccines predate Andrew Wakefield. In fact, this wasn’t the first time a British doctor had gone to the media with trumped-up claims of vaccine-related harm. An uncannily similar episode had transpired a few decades earlier.

*

The most common symptom of pertussis is uncontrollable fits of coughing. Because of narrowing of the throat, the struggle to draw a breath sometimes produces a high-pitched whooping noise, hence the disease’s colloquial name, whooping cough. The coughing can be violent enough to result in bleeding eyeballs, broken ribs, and hernias. In extreme cases, the coughing can last up to four months, sometimes leading to malnourishment, loss of sight or hearing, or brain damage. But pertussis is most dangerous in infants. Infants don’t whoop. Instead, unable to breathe, they sometimes quietly turn blue and die. The World Health Organization estimates that almost two hundred thousand people die each year from whooping cough around the world, most of them young children in developing countries.

Fortunately, we have a vaccine that protects not only against pertussis, but also against diphtheria and tetanus: the DTaP shot, formerly known as DPT. Unfortunately, in the 1970s and ’80s, DPT became the subject of intense debate and fear, often with conspiratorial undertones.

In 1973, a British doctor called John Wilson gave a presentation at an academic conference in which he claimed that the pertussis component of DPT was causing seizures and brain damage in infants. The research was based on a small number of children, and it has since emerged that many of the children were misdiagnosed, and some hadn’t even received the DPT vaccine. Regardless, Wilson took his findings to the media, appearing on prime-time television in a program that contained harrowing images of sick children and claimed that a hundred British children suffered brain damage every year as a result of the DPT vaccine. Uptake of the DPT vaccine fell from around 80 percent at the beginning of the decade to just 31 percent by 1978. This was followed by a pertussis epidemic during 1978 and ’79, in which a hundred thousand cases of whooping cough were reported in England and Wales. It’s estimated that around six hundred children died in the outbreak.
Despite flaws in Wilson’s study, as well as a growing number of studies that found no evidence of the alleged link between DPT and brain damage, by the early eighties the fear had spread to America. In 1982 a documentary called DPT: Vaccine Roulette aired on U.S. television. Like its British precursor, it was full of emotive scenes of children who had allegedly been harmed by the DPT vaccine. The damage was being covered up or ignored by the government and medical establishment, the documentary argued. It stopped short of telling parents outright not to have their children vaccinated, but the implication was clear.

One parent, a woman named Barbara Loe Fisher, watched Vaccine Roulette and came to believe that her own son had been injured by the DPT vaccine. Together with other parents who believed their children had been hurt by vaccines, Fisher formed a group called Dissatisfied Parents Together (or DPT for short). The group still exists, now going by the name National Vaccine Information Center. The change of name reflected the fact that the group’s distrust of vaccines had broadened beyond the DPT shot. Over the years, Fisher’s group and others like it have questioned the safety and efficacy of practically every vaccine in use.

Which brings us back to where we started. The May 2, 1998, issue of The Lancet carried a letter to the editor penned by none other than Barbara Loe Fisher. She referred to a critique of Andrew Wakefield’s research as a “pre-emptive strike by US vaccine policymakers.” Hinting at nefarious motives, she wrote, “it is perhaps understandable that health officials are tempted to discredit innovative clinical research into the biological mechanism of vaccine-associated health problems when they have steadfastly refused to conduct this kind of basic science research themselves.” Fisher’s National Vaccine Information Center later bestowed upon Andrew Wakefield an award for “Courage in Science.”

So the current epidemic of fear over the MMR vaccine is in many ways simply an extension of the vaccine anxiety that blossomed in the 1970s. But it didn’t start there. In fact, people have been worried about the safety of vaccines—and the motives of the people who make and sell them—since the discovery of the very first vaccine.

A pox on you

Common symptoms of smallpox included foul-smelling and excruciatingly painful pus-filled blisters all over the face and body. Open sores inside the mouth poured virus particles into the mouth and throat, meaning that the disease was highly contagious, spread by coughing, sneezing, and even talking. Around one in three infected adults died of the disease, and four out of five infected children. Those who survived were often left disfigured, or worse—many were blinded, pregnant women miscarried, and children’s growth was stunted. Smallpox killed more people than any other disease throughout history. As recently as 1967, smallpox killed an estimated two million people around the world in that year alone.

The virus shaped the course of history. Battles and wars were won and lost because of outbreaks of smallpox. It killed monarchs and rulers in office. It helped clear the way for the colonization of North and South America by European settlers by killing off millions of the native inhabitants. Fortunately, you’re not going to catch smallpox. The virus has been eradicated from the wild, thanks to the discovery, two centuries ago, of the world’s first vaccine.
Unfortunately, the new practice of vaccination gave rise to the kind of vaccine anxiety and organized anti-vaccine movements that persist to this day.

The vaccine was discovered by Edward Jenner, a classic mildly eccentric eighteenth-century English country gentleman who dabbled in things like fossil collecting, hot air ballooning, and growing oversized vegetables. His interest in smallpox was piqued when, flirting with a milkmaid one afternoon, he learned the folk wisdom that catching cowpox, a disease that caused blisters on cows’ udders, somehow seemed to protect milkmaids and other farm workers against smallpox. In humans, cowpox just caused a few harmless blisters on the hands, but it seemed to offer lifelong immunity to smallpox.

Jenner decided to put this folk wisdom to the test. He initially exposed fifteen farm workers who had previously suffered from cowpox to smallpox virus. None became infected. Then, in 1796, he undertook his boldest experiment to date. He deliberately infected a young boy with cowpox, and then exposed him to smallpox. The boy did not get sick. Jenner called the procedure vaccination, derived from the Latin vaccinae, meaning “of the cow,” and published his findings in 1798. By 1820, millions of people had been vaccinated in Britain, Europe, and the United States, and the number of people dying from smallpox was cut in half.

Not everyone was impressed. There immediately arose sporadic opposition to the vaccine. Objections were occasionally raised on religious grounds—to vaccinate oneself, some argued, was to question God’s divine plan. Others objected for economic reasons, or simply out of disgust at a vaccine derived from sick cows, coupled with distrust of the doctors who administered it. By 1800, Jenner was moved to defend his vaccine from detractors, writing that “the feeble efforts of a few individuals to depreciate the new practice are sinking fast into contempt.” His optimism was misplaced.

The first truly organized anti-vaccination movements have their origins in the Compulsory Vaccination Acts passed by the British Parliament in the 1850s and ’60s. The first law, introduced in 1853, threatened parents who failed to vaccinate their children with fines and imprisonment. The law was widely accepted at first, due in large part to a particularly bad smallpox epidemic that had swept through England the year before, but vaccination rates fell off again when people realized that the law simply wasn’t enforced. Parliament passed a new, tougher law in 1867. It was in reaction to these laws that the first dedicated and well-organized anti-vaccination leagues were formed. Critics claimed that the vaccine was at best useless, at worst a scam or a poison. By 1900 there were in the region of two hundred anti-vaccination groups across England. The United States quickly followed suit; American anti-vaccination societies began to spring up in the 1870s.

In 1898, the English critics of vaccination won. The British government gave in, passing a law that allowed so-called conscientious objectors to opt out of vaccinating their children. Objection certificates were made easier to obtain in 1907. Vaccination rates fell, and outbreaks of smallpox rose once again in parts of England. In neighboring Scotland and Ireland, where anti-vaccination movements had not gained as much traction, vaccination continued to be readily accepted, and smallpox continued to decline.
*

So vaccine anxiety was a side effect of the very first vaccine, and the symptoms have never quite cleared up. Perhaps the most remarkable thing about the long-standing unease about vaccines is how little the arguments have changed over the centuries. Jenner’s critics created elaborate cartoons depicting doctors as unfeeling monsters, intent on sacrificing innocent, helpless children. Twenty-first-century anti-vaccinationists write blog posts with titles like “Doctors want power to kill disabled babies.” Nineteenth-century activists claimed that the smallpox vaccine contained “poison of adders, the blood, entrails, and excretion of bats, toads and suckling whelps” and fought for their right to remain “pure and unpolluted.” The modern-day “green our vaccines” movement doesn’t go so far as to say vaccines contain entrails, but it still misconstrues vaccines as containing “toxins” including antifreeze, insect repellent, and spermicide. And, as Paul Offit has pointed out, the current concerns about MMR somehow causing autism are about as plausible, biologically speaking, as the claim, widely reported in the early 1800s, that the smallpox vaccine caused recipients to sprout horns, run about on all fours, and low and squint like cows.

And throughout it all, there have been theories alleging a vast international conspiracy to trump up the dangers of the diseases that vaccines prevent, to hide the truth about vaccine side effects, and to ensure profits for Big Pharma and the government. One nineteenth-century British activist wrote of smallpox, “this infection scare is a sham, fostered, if not got up originally by doctors as a means of raising their own importance and tightening their grasp on the throat of the nation’s common sense which has lain so long paralysed and inert in their clutches.” More than a century later, Barbara Loe Fisher called the HPV vaccine “one of the biggest money making schemes in the history of medicine.”

In some parts of the world, conspiracist fears about vaccines have provoked more drastic measures than simply opting out of vaccination. In parts of Pakistan, local religious leaders have denounced vaccination as an American ploy to sterilize Muslims. According to the BBC, more than sixty polio workers, or their drivers or guards, have been murdered in Pakistan since 2012. (The CIA, it’s worth pointing out, inadvertently fanned the flames of distrust by setting up a fake vaccination program in Abbottabad in 2011, as part of an effort to confirm Osama Bin Laden’s whereabouts by having vaccine workers surreptitiously collect DNA samples from Bin Laden’s family members. When the stunningly misguided plan came to light, it put every vaccine worker in the country under suspicion.) Similar killings of polio workers have taken place in Nigeria. Pakistan and Nigeria, not coincidentally, are two of only three countries in the world where polio remains endemic.

*

Of course, not every parent of an unvaccinated child is a raving conspiracy theorist. Some unvaccinated children are too young to have received the vaccine. Others have medical conditions that make vaccination impossible. And many parents who fail to stick to the recommended vaccine schedule do so not out of fear of a Big Pharma conspiracy, but because they lack the time or money for doctor visits, and because they have fallen through the cracks in the health care system. These are the children who rely on “herd immunity”—the protection that comes from most of the people around us being immune to a disease.
Parents who consciously choose to deny their children vaccines are putting not only their own child in harm’s way, but other children, too. And yet it would be a mistake to demonize parents who choose to reject vaccines. They are thoughtful, caring, well intentioned, and often well informed. Thanks to a small but vocal minority of dedicated anti-vaccinists, the Internet is rife with conspiracy-laced misinformation urging us not to trust vaccines. Making matters worse, the media often portrays the controversy with a false sense of balance. Most parents have heard the claims about autism and vaccines, and, according to a recent study, merely reading anti-vaccine conspiracy theories can reduce parents’ willingness to have their children vaccinated. The science is clear: Vaccines do not cause autism. But conspiracy theories erode our trust in science, allowing controversy to linger long after the questions have been settled. Excerpted from "Suspicious Minds: Why We Believe Conspiracy Theories" by Rob Brotherton. Published by Bloomsbury USA. Copyright 2015 by Rob Brotherton. Reprinted with permission of the publisher. All rights reserved.Stephanie Messenger is an Australian author of self-published educational books for children, such as Don’t Bully Billy and Sarah Visits a Naturopath. In 2012, she published a book that, according to promotional materials, “takes children on a journey to learn about the ineffectiveness of vaccinations and to know they don’t have to be scared of childhood illnesses, like measles and chicken pox.” The blurb on the back of the book talks about how nowadays we’re bombarded with messages urging us to fear diseases, from people who have “vested interests” in selling “some potion or vaccine.” Messenger called the book Melanie’s Marvelous Measles. Perhaps she drew inspiration from the beloved British children’s author Roald Dahl’s George’s Marvellous Medicine. Which would be ironic, given Dahl’s own feelings about measles, which he wrote about in 1986.
Olivia, my eldest daughter, caught measles when she was seven years old. As the illness took its usual course I can remember reading to her often in bed and not feeling particularly alarmed about it. Then one morning, when she was well on the road to recovery, I was sitting on her bed showing her how to fashion little animals out of coloured pipe-cleaners, and when it came to her turn to make one herself, I noticed that her fingers and her mind were not working together and she couldn’t do anything. “Are you feeling alright?” I asked her. “I feel all sleepy,” she said. In an hour she was unconscious. In twelve hours she was dead. The measles had turned into a terrible thing called measles encephalitis and there was nothing the doctors could do to save her.

In 1962, when the measles took Olivia’s life, there was no vaccine. Practically everyone caught the measles at some point in childhood. Most recovered without any lasting damage, but it killed around a hundred children in the United Kingdom and more than four hundred in America every year, and put tens of thousands more in the hospital, leaving some blind or brain-damaged. When a vaccine was licensed in the United States a year later, in 1963, the number of people who caught measles plummeted by 98 percent. “In my opinion,” Dahl concluded, “parents who now refuse to have their children immunized are putting the lives of those children at risk.” Fortunately, we now have a vaccine that protects not only against measles, but against mumps and rubella as well: the combination MMR shot. The World Health Organization estimates that between the years 2000 and 2013, measles vaccination saved more than fifteen million lives around the world. Unfortunately, since the late 1990s, MMR has been the focus of intense debate and fear, often with conspiratorial undertones. * The trouble with MMR started in the United Kingdom. When the vaccine was introduced there, in 1988, it was an immediate success. In the first year, a million children were vaccinated. For the next ten years, uptake of the vaccine remained above 90 percent. Then, in 1998, a doctor called Andrew Wakefield, along with a team of colleagues, published a study that ignited controversy. In the paper, which was published by a highly respected medical journal, The Lancet, Wakefield and colleagues claimed to have found measles virus in the intestines of a handful of autistic children. The paper speculated that the MMR shot may have played a role in causing the children’s autism, but pointed out that the findings were not sufficient to prove the relationship. Regardless, Wakefield took the findings directly to the media. In a press conference held the day before the paper was published, which many of the paper’s coauthors refused to attend, Wakefield claimed that the danger posed by MMR was so great that the vaccine ought to be immediately withdrawn, and individual measles, mumps, and rubella shots, given a year apart, ought to be used instead. (Wakefield himself, it is worth noting, has never opposed vaccination across the board; in fact, he has maintained that vaccines are an important part of health—just not the combined MMR shot, which he continues to argue is linked to autism.) Concerned parents are understandably influenced by the media, and there is no better illustration than the panic that followed Wakefield’s alarming announcement. Interest in the story was modest at first.
In 1998, the year of Wakefield’s press conference, a handful of news stories reported his claim, and vaccine uptake began to fall slightly. It wasn’t until 2001 that the story began to take on a life of its own. For several years, the idea that the MMR vaccine causes autism received more coverage in the British media than any other science story. As fear-mongering coverage peaked between 2001 and 2003, uptake of the vaccine dipped to 80 percent. Some parts of the country, particularly parts of London, had drastically lower vaccination rates. The falling vaccination rates prompted outbreaks of the diseases that the vaccine prevents—particularly, since it’s so highly contagious, measles. The first outbreak was in Dublin in 2000, where vaccination rates were already lower than in the United Kingdom. Almost sixteen hundred cases of measles were reported. More than a hundred children were admitted to hospital with serious complications, and three died. A thirteen-year-old boy died in England in 2006, the country’s first measles death since 1994. In 2008, measles was declared endemic in the United Kingdom for the first time in fourteen years. In 2012 there were more than two thousand cases of measles in England and Wales—mostly affecting children and teenagers whose parents had declined the MMR vaccine years earlier. In 2013, another outbreak in Wales infected more than a thousand people, hospitalizing eighty-eight, and killing a twenty-five-year-old man. In 2004, it emerged that the entire MMR-autism debate was built on a lie. Investigative journalist Brian Deer uncovered evidence that, before beginning his research, Wakefield had been involved in a patent application for an allegedly safer alternative to the combined MMR vaccine. He had also received a payment in the region of half a million pounds from a personal-injury law firm to conduct the research, and the same law firm had referred parents who believed their children to be vaccine-damaged to Wakefield so he could use the children in his research. But failing to declare a conflict of interest was the least of Wakefield’s wrongdoing. Deer discovered that the study, which involved conducting invasive medical procedures on developmentally challenged children, had not been granted ethical approval. Finally, it emerged that Wakefield may have fudged elements of the children’s medical histories to fit his MMR-autism theory, and a co-worker suggested that Wakefield had knowingly reported incorrect test results. Ultimately the paper was retracted, and Wakefield’s license to practice medicine in the United Kingdom was withdrawn. That all looks pretty bad, I think it’s fair to say, but we shouldn’t necessarily dismiss the hypothesis that MMR somehow causes autism on the basis of Wakefield’s behavior alone. Since his paper was published, dozens of independent, large, well-conducted studies, involving hundreds of thousands of children across several continents, have found no association whatsoever between the MMR vaccine and autism. As Paul Offit, a pediatrician and immunologist, has pointed out, we still don’t know for sure exactly what causes autism, but by now we can say with considerable certainty that vaccines can be crossed off the list of suspects. Despite Wakefield’s study being utterly discredited, and despite the weight of evidence against his claims, concerns about MMR continue to linger, in Britain and elsewhere.
It didn’t take long for the panic over MMR to cross the Atlantic, where the anti-vaccination cause was taken up by celebrities such as Jenny McCarthy and her then-boyfriend Jim Carrey. Along the way, the claims mutated and merged with other fears. A particular concern in the United States was the presence in various vaccines of a mercury-based preservative, thimerosal, which was held by some anti-vaccine activists to be responsible for the increasing prevalence of autism. (Studies have shown this claim to be mistaken, too.) For many concerned parents, the controversy has thrown suspicion on the entire vaccine schedule. According to a 2009 survey, more than one in ten American parents have refused at least one recommended vaccine for their child, and twice as many parents choose to delay certain shots, leaving their child unprotected for longer. Wakefield remains a polarizing figure, a hero to some and a dangerous quack to others. A recent article, written in the wake of a measles outbreak that began at Disneyland in California in December 2014, described Wakefield as the “father of the anti-vaccine movement.” Yet unfounded fears about vaccines predate Andrew Wakefield. In fact, this wasn’t the first time a British doctor had gone to the media with trumped-up claims of vaccine-related harm. An uncannily similar episode had transpired a few decades earlier. * The most common symptom of pertussis is uncontrollable fits of coughing. Because of narrowing of the throat, the struggle to draw a breath sometimes produces a high-pitched whooping noise, hence the disease’s colloquial name, whooping cough. The coughing can be violent enough to result in bleeding eyeballs, broken ribs, and hernias. In extreme cases, the coughing can last up to four months, sometimes leading to malnourishment, loss of sight or hearing, or brain damage. But pertussis is most dangerous in infants. Infants don’t whoop. Instead, unable to breathe, they sometimes quietly turn blue and die. The World Health Organization estimates that almost two hundred thousand people die each year from whooping cough around the world, most of them young children in developing countries. Fortunately, we have a vaccine that protects not only against pertussis, but also against diphtheria and tetanus: the DTaP shot, formerly known as DPT. Unfortunately, in the 1970s and ’80s, DPT became the subject of intense debate and fear, often with conspiratorial undertones. In 1973, a British doctor called John Wilson gave a presentation at an academic conference in which he claimed that the pertussis component of DPT was causing seizures and brain damage in infants. The research was based on a small number of children, and it has since emerged that many of the children were misdiagnosed, and some hadn’t even received the DPT vaccine. Regardless, Wilson took his findings to the media, appearing on prime-time television in a program that contained harrowing images of sick children and claimed that a hundred British children suffered brain damage every year as a result of the DPT vaccine. Uptake of the DPT vaccine fell from around 80 percent at the beginning of the decade to just 31 percent by 1978. This was followed by a pertussis epidemic during 1978 and ’79, in which a hundred thousand cases of whooping cough were reported in England and Wales. It’s estimated that around six hundred children died in the outbreak. 
Despite flaws in Wilson’s study, as well as a growing number of studies that found no evidence of the alleged link between DPT and brain damage, by the early eighties the fear had spread to America. In 1982 a documentary called DPT: Vaccine Roulette aired on U.S. television. Like its British precursor, it was full of emotive scenes of children who had allegedly been harmed by the DPT vaccine. The damage was being covered up or ignored by the government and medical establishment, the documentary argued. It stopped short of telling parents outright not to have their children vaccinated, but the implication was clear. One parent, a woman named Barbara Loe Fisher, watched Vaccine Roulette and came to believe that her own son had been injured by the DPT vaccine. Together with other parents who believed their children had been hurt by vaccines, Fisher formed a group called Dissatisfied Parents Together (or DPT for short). The group still exists, now going by the name National Vaccine Information Center. The change of name reflected the fact that their distrust of vaccines had broadened beyond the DPT shot. Over the years, Fisher’s group, and others like it, have questioned the safety and efficacy of practically every vaccine in use. Which brings us back to where we started. The May 2, 1998, issue of The Lancet carried a letter to the editor penned by none other than Barbara Loe Fisher. She referred to a critique of Andrew Wakefield’s research as a “pre-emptive strike by US vaccine policymakers.” Hinting at nefarious motives, she wrote, “it is perhaps understandable that health officials are tempted to discredit innovative clinical research into the biological mechanism of vaccine-associated health problems when they have steadfastly refused to conduct this kind of basic science research themselves.” Fisher’s National Vaccine Information Center later bestowed upon Andrew Wakefield an award for “Courage in Science.” So the current epidemic of fear over the MMR vaccine is in many ways simply an extension of the vaccine anxiety that blossomed in the 1970s. But it didn’t start there. In fact, people have been worried about the safety of vaccines—and the motives of the people who make and sell them—since the discovery of the very first vaccine.

A pox on you

Common symptoms of smallpox included foul-smelling and excruciatingly painful pus-filled blisters all over the face and body. Open sores inside the mouth poured virus particles into the mouth and throat, meaning that the disease was highly contagious, spread by coughing, sneezing, and even talking. Around one in three infected adults died of the disease, as did four out of five infected children. Those who survived were often left disfigured, or worse—many were blinded, pregnant women miscarried, and children’s growth was stunted. Smallpox killed more people than any other disease throughout history. As recently as 1967, smallpox killed an estimated two million people around the world in that year alone. The virus shaped the course of history. Battles and wars were won and lost because of outbreaks of smallpox. It killed reigning monarchs and rulers. It helped clear the way for the colonization of North and South America by European settlers by killing off millions of the native inhabitants. Fortunately, you’re not going to catch smallpox. The virus has been eradicated from the wild, thanks to the discovery, two centuries ago, of the world’s first vaccine.
Unfortunately, the new practice of vaccination gave rise to the kind of vaccine anxiety and organized anti-vaccine movements that persist to this day. The vaccine was discovered by Edward Jenner. Jenner was a classic mildly eccentric eighteenth-century English country gentleman. He dabbled in things like fossil collecting, hot air ballooning, and growing oversized vegetables. His interest in smallpox was piqued when, flirting with a milkmaid one afternoon, he learned the folk wisdom that catching cowpox, a disease that caused blisters on cows’ udders, somehow seemed to protect milkmaids and other farm workers against smallpox. In humans, cowpox just caused a few harmless blisters on the hands, but it seemed to somehow offer lifelong immunity to smallpox. Jenner decided to put this folk wisdom to the test. He initially exposed fifteen farm workers who had previously suffered from cowpox to smallpox virus. None became infected. Then, in 1796, he undertook his boldest experiment to date. He deliberately infected a young boy with cowpox, and then exposed him to smallpox. The boy did not get sick. Jenner called the procedure vaccination, derived from the Latin vaccinae meaning “of the cow,” and published his findings in 1798. By 1820, millions of people had been vaccinated in Britain, Europe, and the United States, and the number of people dying from smallpox was cut in half. Not everyone was impressed. There immediately arose some sporadic opposition to the vaccine. Objections were occasionally raised on religious grounds—to vaccinate oneself, some argued, was to question God’s divine plan. Others objected for economic reasons, or simply out of disgust at a vaccine derived from sick cows, coupled with distrust of the doctors who administered them. By 1800, Jenner was moved to defend his vaccine from detractors, writing “the feeble efforts of a few individuals to depreciate the new practice are sinking fast into contempt.” His optimism was misplaced. The first truly organized anti-vaccination movements have their origins in the Compulsory Vaccination Acts passed by British Parliament in the 1850s and ’60s. The first law, introduced in 1853, threatened parents who failed to vaccinate their children with fines and imprisonment. The law was widely accepted at first, due in large part to a particularly bad smallpox epidemic that had swept through England the year before, but vaccination rates fell off again when people realized that the law simply wasn’t enforced. Parliament passed a new tougher law in 1867. It was in reaction to these laws that the first dedicated and well-organized anti-vaccination leagues were formed. Critics claimed that the vaccine was at best useless, at worst a scam or a poison. By 1900 there were in the region of two hundred anti-vaccination groups across England. The United States quickly followed suit; American anti-vaccination societies began to spring up in the 1870s. In 1898, the English critics of vaccination won. The British government gave in, passing a law that allowed so-called conscientious objectors to opt out of vaccinating their children. Objection certificates were made easier to obtain in 1907. Vaccination rates fell, and outbreaks of smallpox rose once again in parts of England. In neighboring Scotland and Ireland, where anti-vaccination movements had not gained as much traction, vaccination continued to be readily accepted, and smallpox continued to decline. 
* So vaccine anxiety was a side effect of the very first vaccine, and the symptoms have never quite cleared up. Perhaps the most remarkable thing about the long-standing unease about vaccines is how little the arguments have changed over the centuries. Jenner’s critics created elaborate cartoons depicting doctors as unfeeling monsters, intent on sacrificing innocent, helpless children. Twenty-first-century anti-vaccinationists write blog posts with titles like “Doctors want power to kill disabled babies.” Nineteenth-century activists claimed that the smallpox vaccine contained “poison of adders, the blood, entrails, and excretion of bats, toads and suckling whelps” and fought for their right to remain “pure and unpolluted.” The modern-day “green our vaccines” movement doesn’t go so far as to say vaccines contain entrails, but its adherents still misconstrue vaccines as containing “toxins” including antifreeze, insect repellent, and spermicide. And, as Paul Offit has pointed out, the current concerns about MMR somehow causing autism are about as plausible, biologically speaking, as the claim, widely reported in the early 1800s, that the smallpox vaccine caused recipients to sprout horns, run about on all fours, and low and squint like cows. And throughout it all, there have been theories alleging a vast international conspiracy to trump up the dangers of the diseases that vaccines prevent, to hide the truth about vaccine side effects, and to ensure profits for Big Pharma and the government. One nineteenth-century British activist wrote of smallpox, “this infection scare is a sham, fostered, if not got up originally by doctors as a means of raising their own importance and tightening their grasp on the throat of the nation’s common sense which has lain so long paralysed and inert in their clutches.” More than a century later, Barbara Loe Fisher called the HPV vaccine “one of the biggest money making schemes in the history of medicine.” In some parts of the world, conspiracist fears about vaccines have provoked more drastic measures than simply opting out of vaccination. In parts of Pakistan, local religious leaders have denounced vaccination as an American ploy to sterilize Muslims. According to the BBC, more than sixty polio workers, or their drivers or guards, have been murdered in Pakistan since 2012. (The CIA, it’s worth pointing out, inadvertently fanned the flames of distrust by setting up a fake vaccination program in Abbottabad in 2011, as part of an effort to confirm Osama Bin Laden’s whereabouts by having vaccine workers surreptitiously collect DNA samples from Bin Laden’s family members. When the stunningly misguided plan came to light, it put every vaccine worker in the country under suspicion.) Similar killings of polio workers have taken place in Nigeria. Pakistan and Nigeria, not coincidentally, are two of only three countries in the world where polio remains endemic. * Of course, not every parent of an unvaccinated child is a raving conspiracy theorist. Some unvaccinated children are too young to have received the vaccine. Others have medical conditions that make vaccination impossible. And many parents who fail to stick to the recommended vaccine schedule do so not out of fear of a Big Pharma conspiracy, but because they lack the time or money for doctor visits, and because they have fallen through the cracks in the health care system. These are the children who rely on “herd immunity”—the protection that comes from most of the people around us being immune to a disease.
Parents who consciously choose to deny their children vaccines are putting not only their own child in harm’s way, but other children, too. And yet it would be a mistake to demonize parents who choose to reject vaccines. They are thoughtful, caring, well intentioned, and often well informed. Thanks to a small but vocal minority of dedicated anti-vaccinists, the Internet is rife with conspiracy-laced misinformation urging us not to trust vaccines. Making matters worse, the media often portrays the controversy with a false sense of balance. Most parents have heard the claims about autism and vaccines, and, according to a recent study, merely reading anti-vaccine conspiracy theories can reduce parents’ willingness to have their children vaccinated. The science is clear: Vaccines do not cause autism. But conspiracy theories erode our trust in science, allowing controversy to linger long after the questions have been settled. Excerpted from "Suspicious Minds: Why We Believe Conspiracy Theories" by Rob Brotherton. Published by Bloomsbury USA. Copyright 2015 by Rob Brotherton. Reprinted with permission of the publisher. All rights reserved.






Published on November 29, 2015 09:00
The truth about the white working class: Why it’s really allergic to voting for Democrats
What’s up with working-class whites? It’s a question that’s been asked for decades, and has been raised again recently in the discussion surrounding an Alec MacGillis piece examining Matt Bevin’s recent gubernatorial win in Kentucky, which could leave many in the state without Medicaid. Though there are many explanations for why working class whites vote Republican, and many are certainly true, the overwhelming reason is rather simple: racism. To see why working class whites -- defined as non-Hispanic whites without a college degree, although there are extensive debates as to the best way to define “working class” -- aren’t voting Democratic, I use the American National Election Studies 2012 survey. To begin, I examined raw vote shares among working class whites, and then vote shares among working class whites in the South (the former 11 states of the Confederacy) and non-South. Immediately, it is obvious that a key divide is the South/non-South distinction: only 28 percent of Southern working class whites identify as Democratic, compared with 40 percent of non-South working class whites.
Next, I examined whether racial stereotyping had any effect. The stereotype question asks respondents to rate Blacks on a scale of 1 (hard-working) to 7 (lazy). I examined the party identification of working class whites in each category and the results are rather suggestive: among working class whites who ranked Blacks as hard-working, 40 percent were Democrats and 38 percent Republicans; among those who said Blacks are lazy, 20 percent were Democrats and 60 percent were Republicans.
But how does this affect the votes of working class whites? My next analysis teases out whether social issues play a role in white votes, as Thomas Frank has suggested; whether it’s concerns about the role of government, as John Judis (and others) have argued; or whether it’s racism, as Ian Haney-Lopez has argued. Specifically, I examine three questions that allow respondents to place themselves on a scale and also to place the major parties (or candidates) on the same scale: a four-point scale regarding abortion, a seven-point scale regarding government services and spending, and a seven-point scale regarding aid to Blacks (see here for exact wording). The first two ask respondents to put themselves on a scale and also place the Republican and Democratic party on the scale; the last asks the respondents to place themselves as well as Mitt Romney and Obama on the scale (this may skew results because people may perceive Obama as more supportive of aid to Blacks than Democrats in general, likely because of racism). I find that 62 percent of working-class whites either put Romney at the same place as them on aid to Blacks or within 1 point in either direction, compared with only 35 percent of working-class whites who felt that way about Obama. About 40 percent of working class whites placed themselves at the same place as Democrats, or within one point in either direction, on government services or spending, compared with 53 percent who perceived closeness with the Republican party. On the abortion scale, 31 percent of respondents placed themselves as the same as Republicans, compared with 39 percent who felt the same way about Democrats. Because abortion was only a four-point scale, I didn’t compare what percentage of people placed themselves within one point of either party (the chart shows the percentage placing themselves the same as the parties).
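To make the proximity measure concrete, here is a minimal sketch in Python. The respondent numbers are invented stand-ins, not actual ANES data, and the scale layout is only an assumption; the point is the shape of the computation:

# Hypothetical stand-ins for survey rows: (self-placement, placement of
# Romney, placement of Obama) on the seven-point aid-to-Blacks scale.
respondents = [(4, 5, 2), (6, 6, 2), (5, 6, 3), (7, 6, 1), (3, 5, 2)]

def share_close(rows, candidate_index):
    # "Close" means placing the candidate at your own position,
    # or within one point of it in either direction.
    close = [abs(row[0] - row[candidate_index]) <= 1 for row in rows]
    return sum(close) / len(close)

print(f"Close to Romney: {share_close(respondents, 1):.0%}")  # 80% in this toy data
print(f"Close to Obama:  {share_close(respondents, 2):.0%}")  # 20% in this toy data

Run over the actual survey responses, separately for each of the three scales, a computation of this shape produces the shares reported above.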
Working class whites say they are overwhelmingly more liberal than the Republican party on abortion and modestly more liberal on government services and spending. However, they are more conservative than Romney on aid to Blacks. When compared to the Democratic party, working class whites say they are more conservative on abortion (only slightly) and dramatically more conservative on services and spending. More than 70 percent of working class whites say they are more conservative than Obama on aid to Blacks. This suggests that working class whites see themselves as far closer to the Democratic party on abortion and further away from the party on services and spending. They see themselves furthest away from Obama on the issue of aid to Blacks.
These results are suggestive, and they fit into a broader academic literature. In a recent National Bureau of Economic Research working paper, Ilyana Kuziemko and Ebonya Washington find that racism can explain almost all of the decline of Southern white support for Democrats between 1958 and 2000. Larry Bartels performed a similar (although far more detailed) analysis in a 2006 paper criticizing Frank and found, similarly, that working class voters were closer to Republicans on economic and racial issues, but agreed with Democrats on abortion and women’s role in the family. In his masterful work, "Why Americans Hate Welfare," Martin Gilens finds that opposition to welfare is driven by racial stereotypes about Blacks. In a seminal book, "Race and the Decline of Class in American Politics," Robert Huckfeldt and Carol Weitzel Kohfeld show that the more a state-level Democratic party relies on Black votes, the less likely low-income whites in that state are to vote Democratic. Ian Haney-Lopez argues that Republican politicians have consciously played up racial tensions and animosity to peel white votes away from the Democratic party. Another clue comes from the fact that working class Latinos and Blacks both overwhelmingly prefer Democrats, and the non-white working class as a whole prefers Democrats to Republicans 68 percent to 16 percent. The only defectors are the white working class.
So if the working class generally likes Democrats (with the exception of working-class whites), why do Democrats lose elections? The key is turnout, a point MacGillis makes, citing some of my research. The core question then, for Democrats, is how to mobilize the low-income voters who are disproportionately harmed by Republican policies. Here, the Affordable Care Act represents a self-inflicted wound by Democrats. For decades, many states have failed to meet the National Voter Registration Act (NVRA) requirement that Medicaid offices and other public assistance agencies ask recipients whether they want to register to vote. Further, the NVRA covers the federally mandated health care exchanges, but the Obama administration has failed to require the exchanges to offer participants an opportunity to register to vote. Both offer a huge missed chance to register millions of new voters, disproportionately low-income and non-white. Understanding why Democrats have lost working-class whites is a key to understanding the future. On the positive side, the decline of the white working class and the increasing racial diversity of the nation could help Democrats, if they could mobilize non-white members of the working class to vote at the same rate as working class whites. As I’ve noted, the rise in diversity of the general population has only slowly been reflected in the diversity of the electorate. On the other hand, the idea that Democrats are losing votes because of their socially progressive stances on abortion and gay rights is clearly incorrect. Further, while it’s clear that economic progressivism might struggle because Americans fail to link public policy to rising inequality, there is also evidence that many economically progressive policies are popular. The problem, as new research by political scientists Torben Iversen and David Soskice shows, is that the U.S. doesn’t have a strong union movement to mobilize low income people. As they note, and as Fowler and Michele Margolis show, another factor is information: When people are informed, they shift toward the Democratic party. Political scientists Jan Leighley and Jonathan Nagler note that people who see greater differences between the parties are more likely to vote, but low-income people are less likely to perceive large differences. Progressives must see registering and mobilizing low-income voters as a central priority. Supporting unions, which serve an important role in mobilizing the working class, is also vital. However, progressives must also give the working class a good reason to vote for them.



















Published on November 29, 2015 08:59
The end of migraines is close: A new drug could stop debilitating headaches before they start








Published on November 29, 2015 08:00
November 28, 2015
You’re wrong about Common Core math: Sorry, parents, but it makes more sense than you think
By now everyone has seen the outrage-inducing image of a third grader’s paper in which he is marked down for stating that 5 x 3 = 5 + 5 + 5 = 15. In case you missed it, here are videos and stories from several groups ripping Common Core for it: Business Insider, IFLScience, Huffington Post and mom.me — and I’m sure you can find many more as the photo of this kid’s paper has gone viral.
You might also remember the photo of a check that went viral not that long ago — a man filled out a check to his son’s school by attempting to write the check amount in ten frames. There have been countless other similar photos with accompanying derision that have gone viral via social media and email (more on these examples in a moment).
In the national discussion on America’s perceived educational woes, the Common Core Standards have become a bit of a unifying punching bag, especially with respect to elementary school math. Everyone seems to love a photo of a test question, homework problem or corrected work that vilifies the Common Core. You know the type — the question asks the students to show a seemingly straightforward elementary math topic, but it requires the answer to be given in what seems to be an overly complicated way. We look at it and say, “Why can’t they just do it the normal way?!?” We are alarmed at the representation of something that we see as so basic and elementary in a new and unfamiliar arrangement, and we are outraged when we see a student’s work marked down when it appears to be correct. The vast majority of the comments and coverage of these viral images and stories has been highly critical of the Common Core. Here’s the thing though — all of these criticisms boil down to a fundamental misunderstanding of the Common Core State Standards (CCSS). Virtually every one of these attacks on the Common Core falls into one of two categories:

1. The people who spread the example (and trash it) missed the point of the Common Core Standard in question.
2. The educator responsible for the example missed the point of the Common Core Standard in question.

Consider the ten frames check (which falls into the first category) — the father was frustrated by a representation of numbers with which he wasn’t familiar, and it fit nicely into his preconceived notion that the Common Core is terrible, serving only to confuse students and parents. Here is an article that does a more elaborate job of skewering his response, but in short, this father is upset because he doesn’t immediately recognize and understand a concept being taught to his second grader. Rather than try to make sense of it and understand the purpose, he ridicules it, and other similarly frustrated parents jump on board. In fact, ten frames are a way of visually modeling our counting system that helps kids to better understand it. They were never meant to replace our current way of writing numbers — they are designed as a supplemental aid to assist in deeper understanding. It can be frustrating, to be sure, to be initially stumped by your kids’ homework, especially when they’re in the earliest grades. Certainly, there are teachers out there who don’t always hit the mark with an assignment, or who fail to provide resources for parents to understand something that may be new to them, but in the end, let’s not forget that we’re all looking for the best educational outcomes for our kids. And let’s be honest, the way we’ve been teaching math for generations in America has not worked for everyone, which is why we have a very sizeable segment of our population that simply says, “I can’t do math.” So then why are we closed off to considering new ways of conceptualizing the foundational ideas of mathematics?

Now consider the 5 x 3 question. According to IFLScience (which I love, by the way), Reddit and Imgur commenters expressed outrage at “the overly pedantic ‘by-the-book’ thinking.” The whole thing reads as an indictment of Common Core as stifling mathematical thinking in favor of stringent and arbitrary definitions and algorithms. And yet, this is a completely wrong-headed interpretation of Common Core.
The outrage is warranted; it’s just misplaced — this example is of the second type I mentioned above, in which the educator has misunderstood and misapplied the standards with an overly narrow literal reading of them. The standard in question says, “Interpret products of whole numbers, e.g., interpret 5 × 7 as the total number of objects in 5 groups of 7 objects each.” This teacher obviously read this standard as saying that the only way to view 5 x 7 (or in the case of the paper in question, 5 x 3) is as 5 groups of 7 objects each. So for 5 groups of 3 objects, that might look like 3 + 3 + 3 + 3 + 3. And yet, “e.g.” means “for example,” not “This is the only valid interpretation.” A reasonable reading of the standard by a mathematically literate person should allow for interpreting 5 x 3 as 5+5+5, or 3 groups of 5 objects each, especially when you consider that four standards down on the list is the one about the commutative property (along with other properties) of multiplication, e.g., 5 x 3 is equal to 3 x 5 (note, the e.g. I just used means that this is just one example; the property applies to infinitely many other pairs of numbers as well — see how that works?). The ultimate goal of these standards is to help our children to develop their foundational understandings of our number system and of basic arithmetic, and so if a student intuitively knows that 5 x 3 is equal to 3 x 5 and that they can both be represented as 3 rows of 5 items or 5 rows of 3 items or 3 stacks of 5 pennies or 5 piles of 3 apples or … well, you get the picture, then we’ve accomplished our goal! Interpreting multiplication in the way described above is not remotely new; rather, it’s pretty standard fare for understanding what multiplication is. Perhaps the idea of making students show the example on paper is more of a new phenomenon, and yes, the Common Core definitely advocates that educators encourage students to interact with ways of modeling the mathematical concepts they are learning so as to better master them. It does not, however, require stringent adherence to narrow, arbitrarily chosen interpretations of these models, and educators who focus their teaching in that way are doing it wrong. For another example, consider this image, which I first received in a forwarded email (this particular version of it was apparently taken from the web site of David Van Sant, a recent Republican candidate for the Georgia State House, who lost to a fellow Republican), about a math problem that has gone viral, helping to fire up people against Common Core. At first glance, the diagram of the Common Core approach might appear unnecessarily complicated, especially when compared to the setup of the standard subtraction algorithm that we all grew up with (not to mention that the image shows the setup of the standard algorithm but doesn’t actually show the process, which actually isn’t quite so simple with the borrowing that will be required). Most will look at the diagram of the number line, at a glance see a bunch of steps that don’t seem to make a lot of sense, and accept it as further evidence to support an already burgeoning outrage toward Common Core, thanks in part to a healthy side dish of confirmation bias.
A closer look at the method, however, reveals that the number line (an important visual tool in arithmetic and algebra) gets at a different way of thinking about addition and subtraction and their relation to each other — a vital way of thinking for any student we would like to understand arithmetic deeply enough to make the learning of higher levels of mathematics meaningful (which should be essentially all students). If you haven’t yet made sense of the second diagram, think about the way that people used to give change at the store (perhaps a bit of a lost art these days). Suppose you purchased something that cost $8.27 and paid with a $20. The clerk would start at the value of the item purchased (in this case $8.27), then count up with the change, bringing you first to $8.30, then to the 50-cent level ($8.50), then to an even dollar amount ($9.00), then a ten-dollar amount ($10.00), and so forth, until the value was brought up to the $20 you paid with:
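To see the counting-up idea in one self-contained place, here is a minimal sketch in Python; the intermediate landmark amounts are an assumption, just one plausible path a clerk might take:

# Counting up from the price to the amount paid, the way a clerk once
# made change: each hop lands on a "friendly" number.
price, paid = 8.27, 20.00
landmarks = [8.30, 8.50, 9.00, 10.00, paid]  # assumed stops; any friendly ones work

position, total_change = price, 0.0
for landmark in landmarks:
    hop = round(landmark - position, 2)  # the change handed back at this step
    total_change = round(total_change + hop, 2)
    print(f"${position:.2f} -> ${landmark:.2f}: hand back ${hop:.2f}")
    position = landmark

print(f"Total change: ${total_change:.2f}")  # 11.73, which is 20.00 minus 8.27

Adding the hops (3 cents, 20 cents, 50 cents, one dollar, ten dollars) gives $11.73, exactly 20.00 minus 8.27. The number line diagram encodes the same computation as a series of easy jumps, and that understanding is what the model is after.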
This problem is a bit cherry-picked in that it requires no “borrowing” in the standard algorithm, so this one is much more easily completed by that traditional method. And yet, the parent’s humor aside, the point of the number line model is not to teach the most efficient algorithm for subtraction; it’s to help students understand what subtraction is. I’m certainly not suggesting that we shouldn’t teach the standard subtraction algorithm that we all grew up learning, and neither is the Common Core. In the CCSS, it’s actually referred to as the “standard algorithm” for addition and subtraction, and the CCSS require that students completely master it for multi-digit numbers by the end of fourth grade. I am also not suggesting that the primary goal of math classes should be to enable students to perform complicated mental math without using pencil and paper. The goal of math classes should be to foster a deep-level understanding of the mechanisms that we teach, and that’s where compelling students to learn a variety of techniques for subtraction, for example, can allow them to approach a concept from a variety of different directions, using a variety of different tools, and tying it to other concepts they learn. This is, in a nutshell, what teaching mathematical understandings at a deep level should be doing. And this variety of approaches to build deep level understandings is exactly what the Common Core seeks to do. For all the griping about the CCSS, and in particular the math standards that people love to hate with these images of unfamiliar methods and models or of work that is inexplicably marked down, they’re actually pretty good. The idea behind them — in an attempt to improve on many of the learning standards that were out before — is to encourage depth of understanding, and they are definitely geared toward that. They’re not perfect — as a high school math teacher, I have some issues with how much is crammed into certain courses and certain topics that may be overemphasized. But these things can be adjusted as time goes on, without scrapping them entirely. Or even if they’re not improved, a capable and competent teacher should find the CCSS to absolutely be a workable set of standards — an improvement over what we had before. What we need, in order to do the best job of teaching with these standards, is the best group of teachers possible, as well as a stronger national emphasis on the value of education. The last thing we need right now, in my opinion as an educator, is to start over again with a new set of standards just as we’re getting used to the Common Core.
You might also remember the photo of a check that went viral not that long ago — a man filled out a check to his son’s school by attempting to write the check amount in ten frames. There have been countless other similar photos with accompanying derision that have gone viral via social media and email (more on these examples in a moment).
In the national discussion on America’s perceived educational woes, the Common Core Standards have become a bit of a unifying punching bag, especially with respect to elementary school math. Everyone seems to love a photo of a test question, homework problem or corrected work that vilifies the Common Core. You know the type — the question asks the students to show a seemingly straightforward elementary math topic, but it requires the answer to be given in what seems to be an overly complicated way. We look at it and say, “Why can’t they just do it the normal way?!?” We are alarmed at the representation of something that we see as so basic and elementary in a new and unfamiliar arrangement, and we are outraged when we see a student’s work marked down when it appears to be correct. The vast majority of the comments and coverage of these viral images and stories has been highly critical of the Common Core. Here’s the thing though — all of these criticisms boil down to a fundamental misunderstanding of the Common Core State Standards (CCSS). Virtually every example of one of these attacks on the Common Core fall into one of two categories: The people who spread the example (and trash it) missed the point of the Common Core Standard in question The educator responsible for the example missed the point of the Common Core Standard in question Consider the ten frames check (which falls into the first category) — the father was frustrated by a representation of numbers with which he wasn’t familiar and it fit nicely into his preconceived notion that the Common Core is terrible, serving only to confuse students and parents. Here is an article that does a more elaborate job of skewering his response, but in short, this father is upset because he doesn’t immediately recognize and understand a concept being taught to his second grader. Rather than try to make sense of it and understand the purpose, he ridicules it, and other similarly frustrated parents jump on board. In fact, ten frames are a way of visually modeling our counting system that help kids to better understand it. They were never meant to replace our current way of writing numbers — they are designed as a supplemental aid to assist in deeper understanding. It can be frustrating to parents, to be sure, to be initially stumped by your kids’ homework, especially when they’re in the earliest grades. Certainly, there are teachers out there who don’t always hit the mark with an assignment, or who fail to provide resources for parents to understand something that may be new to them, but in the end, let’s not forget that we’re all looking for the best educational outcomes for our kids. And let’s be honest, the way we’ve been teaching math for generations in America has not worked for everyone, which is why we have a very sizeable segment of our population who simply says, “I can’t do math.” So then why are we closed off to considering new ways of conceptualizing the foundational ideas of mathematics? Now consider the 5 x 3 question. According to IFLScience (which I love, by the way), Reddit and Imgur commenters expressed outrage at “the overly pedantic ‘by-the-book’ thinking.” The whole thing reads as an incrimination of Common Core as stifling mathematical thinking in favor of stringent and arbitrary definitions and algorithms. And yet, this is a completely wrong-headed interpretation of Common Core. 
The outrage is warranted, it’s just misplaced — this example is of the second type I mentioned above, in which the educator has misunderstood and misapplied the standards with an overly narrow literal reading of them. The standard in question says, “Interpret products of whole numbers, e.g., interpret 5 × 7 as the total number of objects in 5 groups of 7 objects each.” This teacher obviously read this standard as saying that the only way to view 5 x 7 (or in the case of the paper in question, 5 x 3) is as 5 groups of 7 objects each. So for 5 groups of 3 objects, that might look like 3 + 3 + 3 + 3 + 3. And yet, “e.g.” means “for example,” not “This is the only valid interpretation.” A reasonable reading of the standard by a mathematically literate person should allow for interpreting 5 x 3 as 5+5+5, or 3 groups of 5 objects each, especially when you consider that four standards down on the list is the one about commutation (along with other properties) in multiplication, e.g., 5 x 3 is equal to 3 x 5 (note, the e.g. I just used means that this is just one example; the property applies to infinitely many other pairs of numbers as well — see how that works?). The ultimate goal of these standards is to help our children to develop their foundational understandings of our number system and of basic arithmetic, and so if a student intuitively knows that 5 x 3 is equal to 3 x 5 and that they can both be represented as 3 rows of 5 items or 5 rows of 3 items or 3 stacks of 5 pennies or 5 piles of 3 apples or … well, you get the picture, then we’ve accomplished our goal! Interpreting multiplication in the way described above is not remotely new; rather, it’s pretty standard fare for understanding what multiplication is. Perhaps the idea of making students show the example on paper is more of a new phenomenon, and yes, the Common Core definitely advocates that educators encourage students to interact with ways of modeling the mathematical concepts they are learning so as to better master them. It does not, however, require stringent adherence to narrow, arbitrarily chosen interpretations of these models, and educators who focus their teaching in that way are doing it wrong. For another example, consider this image, which I first received in a forwarded email (this particular version of it was apparently taken from the web site of David Van Sant, a recent Republican candidate for the Georgia State House, who lost to a fellow Republican), about a math problem that has gone viral, helping to fire up people against Common Core. At first glance, the diagram of the Common Core approach might appear unnecessarily complicated, especially when compared to the setup of the standard subtraction algorithm that we all grew up with (not to mention that the image shows the setup of the standard algorithm but doesn’t actually show the process, which actually isn’t quite so simple with the borrowing that will be required). Most will look at the diagram of the number line, at a glance see a bunch of steps that don’t seem to make a lot of sense, and accept it as further evidence to support an already burgeoning outrage toward Common Core, thanks in part to a healthy side dish of confirmation bias. 
For another example, consider this image, which I first received in a forwarded email (this particular version of it was apparently taken from the website of David Van Sant, a recent Republican candidate for the Georgia State House, who lost to a fellow Republican), about a math problem that has gone viral and helped fire people up against the Common Core. At first glance, the diagram of the Common Core approach might appear unnecessarily complicated, especially when compared to the setup of the standard subtraction algorithm that we all grew up with (never mind that the image shows only the setup of the standard algorithm, not the process itself, which isn’t quite so simple given the borrowing that will be required). Most people will glance at the number-line diagram, see a bunch of steps that don’t seem to make a lot of sense, and accept it as further evidence for an already burgeoning outrage toward the Common Core, helped along by a healthy side dish of confirmation bias.

A closer look at the method, however, reveals that the number line (an important visual tool in arithmetic and algebra) gets at a different way of thinking about addition and subtraction and their relation to each other — a vital way of thinking for students whom we would like to understand arithmetic deeply enough to support meaningful learning of higher mathematics (which should be essentially all students). If you haven’t yet made sense of the second diagram, think about the way people used to give change at the store (perhaps a bit of a lost art these days). Suppose you purchased something that cost $8.27 and paid with a $20 bill. The clerk would start at the price of the item ($8.27 in this case) and count up with the change, bringing you first to $8.30, then to the 50-cent level, then to an even dollar amount, then a ten-dollar amount, and so forth, until the total reached the $20 you paid with:
“Okay, $8.27, 30 cents, and 20 more is 50 cents, and two quarters makes nine, and ten, and ten more makes twenty.”

The approach is a perfectly sensible way for humans to give change: it works in round figures that we can add and subtract more easily, and it focuses on the true essence of subtraction, the difference between the two referenced amounts. In the case of the change, that’s the difference between what you were supposed to pay and what you did pay (in other words, your change). The vertical subtraction algorithm we learned in school doesn’t make this clear. It is a memorized procedure that can be done efficiently with pencil and paper by someone who has practiced it, and it certainly can be made to make sense through study of our base-ten number system, by emphasizing the places of the various digits and the concept of borrowing where needed. But when it comes to doing subtraction problems like this in your head, I suspect that most people who excel at this type of mental math use a method much like the number-line diagram shown (the supposedly laughable Common Core example). Visualizing and breaking up the problem lets you keep track of the values more easily and produce the correct result more consistently and efficiently without putting pencil to paper.
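For the programmers in the room, here is a small sketch of that counting-up idea. The code and the function name are my own, invented for illustration (nothing like this appears in the standards), and it works in whole cents so the arithmetic stays exact:

    # "Counting up" to make change: step from the price to the next round
    # figure at each scale (dimes, half-dollars, dollars, tens of dollars)
    # until reaching the amount paid.
    def count_up_change(price, paid):
        steps = []
        current = price
        for unit in (10, 50, 100, 1000):  # all amounts are in cents
            target = -(-current // unit) * unit  # round up to the next multiple
            if target > paid:
                break
            if target > current:
                steps.append((target - current, target))
                current = target
        if current < paid:
            steps.append((paid - current, paid))
        return steps

    # $8.27 paid with a $20 bill:
    for handed_over, running_total in count_up_change(827, 2000):
        print(f"${handed_over / 100:.2f} brings us to ${running_total / 100:.2f}")
    # $0.03 brings us to $8.30
    # $0.20 brings us to $8.50
    # $0.50 brings us to $9.00
    # $1.00 brings us to $10.00
    # $10.00 brings us to $20.00

Each step is a small, round move, which is exactly why the method is comfortable to do in your head.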
Here’s a hilarious example from a parent lampooning the Common Core (I found this in a Google image search, but I believe I remember seeing it make the rounds through email or Facebook):

This problem is a bit cherry-picked in that it requires no “borrowing” in the standard algorithm, so this particular one really is more easily completed by the traditional method. And yet, the parent’s humor aside, the point of the number-line model is not to teach the most efficient algorithm for performing subtraction; it’s to help students understand what subtraction is. I’m certainly not suggesting that we shouldn’t teach the standard subtraction algorithm we all grew up learning, and neither is the Common Core. The CCSS actually refer to it as the “standard algorithm” for addition and subtraction, and they require students to master it fully for multi-digit numbers by the end of fourth grade. Nor am I suggesting that the primary goal of math classes should be to enable students to perform complicated mental math without pencil and paper. The goal of math classes should be to foster a deep understanding of the mechanisms we teach, and that’s where asking students to learn a variety of techniques for subtraction, for example, pays off: students get to approach a concept from different directions, with different tools, and tie it to other concepts they learn. That, in a nutshell, is what teaching mathematics for deep understanding should look like, and this variety of approaches is exactly what the Common Core seeks.

For all the griping about the CCSS, and in particular about the math standards that people love to hate with these images of unfamiliar methods and models or of work that is inexplicably marked down, they’re actually pretty good. The idea behind them, an attempt to improve on many of the learning standards that came before, is to encourage depth of understanding, and they are squarely aimed at that. They’re not perfect: as a high school math teacher, I have some issues with how much is crammed into certain courses and with certain topics that may be overemphasized. But these things can be adjusted over time without scrapping the standards entirely. And even as they stand, a capable and competent teacher should find the CCSS an absolutely workable set of standards — an improvement over what we had before. What we need in order to teach these standards well is the best group of teachers possible, along with a stronger national emphasis on the value of education. The last thing we need right now, in my opinion as an educator, is to start over with yet another new set of standards just as we’re getting used to the Common Core.







Published on November 28, 2015 15:30
Moving the dial on what’s possible: When it comes to extremely premature infants, what we can and what we should do are not always the same
Changes in what we can do always lead to new questions about what we should do — questions about what is prudent or loving or wise, about what serves human well-being or even that of the broader web of life. Recent medical advances around resuscitation and life support for extremely premature infants are no exception, and new options have opened a set of difficult conversations that many would rather avoid.

Earlier Viability

This fall, two groups of experts lowered their bar for neonatal resuscitation from 23 weeks gestation to 22 weeks, because new medical technologies allow an increasing number of infants delivered at that stage to survive by completing their gestation outside the womb. For couples who are yearning for a baby but faced with a tenuous pregnancy, this news offers new hope. There is good reason to believe that sometime in the future it will be possible to incubate a healthy human infant outside the womb from the time of conception, and medical practice is moving inexorably toward that point. If we ignore, for the moment, the question of healthy development after the fact and focus simply on survival, the statistics are already impressive: about 72 percent of infants born at 25 weeks gestation survive, followed by 55 percent of those born at 24 weeks, 26 percent of those born at 23 weeks, and 6 percent of those born at 22 weeks. In the past, the second-trimester miscarriage of a wanted pregnancy led to loss and grief. Now things are more complicated. The potential for survival means that doctors and families confronting a late miscarriage also face a complicated decision: whether or not to initiate life support that includes incubation, intubation, a chemical bath to stimulate lung development, and possibly repeated surgeries and transfusions over the coming months or even years.

Life and Quality of Life

Were this simply a matter of life and death, the questions faced by families and medical ethicists might be much clearer. Unfortunately, at this point in history, most extremely premature infants grow into children who experience a lifetime of cognitive or physical disability, sometimes subtle and sometimes severe, and doctors are unable to predict in advance which few will go on to lead healthy, normal lives. In one study of 357 live births at 22 weeks, published in the New England Journal of Medicine, active treatment was started in 79 cases and 18 infants survived. Of those, 11 were moderately or severely impaired as toddlers, presaging a lifetime of special needs and intensive support services. The extent of subtler mental and health impairments in the other seven remains to be seen. The challenges come in part because, at this stage in history, even the best state-of-the-art care outside the womb fails to provide a perfect incubation environment for a developing fetus. The process of transitioning a fetus from womb to external incubation is imperfect, too: interruptions in the flow of oxygen, temperature fluctuations and other aspects of the transition can have lifelong consequences, whether that life is short or long.

Nature’s Imperfect Wisdom

But another set of challenges comes from the fact that early miscarriages are pregnancies that nature herself is rejecting, often (though not always) because the mother’s body has not been able to provide a healthy gestational environment or because the fetus is defective.
In other words, these are budding lives with the odds stacked against survival and subsequent health even before any question of imperfect medical technology or care comes into play. Miscarriage — or, in medical terms, spontaneous abortion — is one of nature’s mechanisms for stacking the odds in favor of healthy children. Sexual reproduction, which combines DNA from two individuals of a species, is a vastly imperfect process, and nature optimizes for healthy offspring by rejecting most combinations at some point along the path. Very defective eggs or sperm may fail to form an embryo. Most embryos fail to implant or else spontaneously abort. As pregnancy progresses, spontaneous abortion becomes less likely, but a high death rate from imperfect reproduction continues clear into infancy for most species, as it has historically for ours. In human beings, an estimated 60 to 80 percent of fertilized eggs fail to reach the live-birth stage, even without therapeutic abortion as part of the mix. In humans as in other species, this failure rate is, to put it in tech terms, a feature, not a bug: it allows the mother’s body to put energy into those offspring most likely to survive, thrive and go on to have healthy children of their own. But this process, too, is imperfect. Sometimes a healthy fetus gets rejected from a healthy mother; sometimes horrible defects slip through, even defects that are incompatible with any form of life outside the womb.

With Great Power, Great Responsibility

Acknowledging this brings us face to face with a sobering set of questions. We have more and more ability to override nature’s mechanism for increasing healthy births, to sustain some of the budding lives that nature rejects. We also now have the ability to augment nature’s winnowing process, both by preventing high-risk pregnancies and by inducing abortion of those conceived under adverse circumstances or known to be faulty. As medical technologies move the dial on what’s possible, viability becomes an increasingly poor guide to human flourishing — in other words, to what we should do. Our public conversations about this are heated, but also spotty. Women or couples who choose to abort ill-conceived pregnancies are shamed and called irresponsible or selfish or even murderers — from the pulpit, the sidewalk and the halls of Congress. More quietly, they also are honored for prudence and wisdom, for following through on their commitments to schooling or community service or the children they already have, or to the children they hope to bring into the world when the time and partnership are right. Our debate about therapeutic abortion may be ugly and polarized, but it is vigorous. When it comes to the other side of the equation, though, silence mostly prevails; and despite the fact that early life-support decisions are enormously far-reaching, doctors may be sanctioned for offering honest opinions about an infant’s prospects. Incubation outside the womb has the potential to produce a healthy child, one who is desperately loved and wanted. This is morally consequential. But when we are talking about extreme prematurity, it has even more potential, at this point in history, to set in motion a trajectory of ill health and constrained development, a trajectory that affects not only the family in question but a whole community.
Parents who decide to pursue external incubation of extremely premature infants rather than letting go and starting over — or hospital ethicists who sometimes override the wishes of parents — are committing deeply to an act of love and hope. They are also committing not only that family but the whole community to pivot from other endeavors, to invest instead in a prospective life with the odds stacked against much of what we think of as human flourishing. The joys can be enormous, but the cost in suffering can be enormous, too, as can the cost of resources diverted. The financial price tag for neonatal care, which can run to a million dollars or more per case, is merely a crude indicator of the enormous diversion of energy and resources such an undertaking requires, but it presses us to consider the broader opportunity costs: the services and even alternate lives forgone. Only the most hardened narcissist thinks that it doesn’t matter, that the ways in which our decisions ripple through the lives of others are inconsequential. Only the most naive socialist thinks we can have it all. There is no right answer to these questions, nor is a broad consensus likely any time soon. What we can do is in motion, which means what we should do is in motion as well. As in all areas of scientific and technological discovery, advances in medical practice present us with an evolving flow of hard decisions that pit our deepest values against each other, forcing us to prioritize one over another in situations where the outcomes are obscure even if the risks are clear, and where the cost-benefit equation itself is constantly changing. The best we can hope for is a vigorous conversation, one that is guided by scientific information and our deepest values, and that isn’t short-circuited by wishful thinking, by our strong desire to avoid difficult topics, or by our even stronger tendency to fall back on outdated agreements reached by ancestors who faced similar struggles under very different social and technological conditions. Above all, we must remember that all of us are guided by a deep yearning to live the lives of our choosing within thriving communities and to give our children the very best of our love.







Published on November 28, 2015 14:30
Growing up in Sonic Youth








Published on November 28, 2015 13:30
We still think like ancient Romans: “The clash between the demands of homeland security and rights of the individual? We haven’t solved that”
Cambridge University classics scholar, outspoken feminist, slayer of Internet trolls — Mary Beard is a genuine heroine in the United Kingdom. Her new book, “SPQR: A History of Ancient Rome,” begins with a dusty village on the Tiber, debunks Romulus and Remus, looks closely at Cicero, moves through the period of growth and conquest, spends a chapter on the empire’s early heyday under Augustus, touches down on the weirdness of Nero and Caligula, and quits before things get really bad. The book’s cryptic title — “the senate and people of Rome” — should not scare off readers interested in a smart, accessible, argumentative history written with great clarity. “In some ways to explore ancient Rome from the 21st century is rather like walking on a tightrope, a very careful balancing act,” she writes. “If you look down on one side, everything looks reassuringly familiar: there are conversations going on that we almost join, about the nature of freedom or problems of sex; there are buildings and monuments we understand, with all their troublesome adolescents; and there are jokes we ‘get.’ On the other side, it seems like completely alien territory.” Salon spoke to Beard from her home in Cambridge, England. The interview has been lightly edited for clarity.

You seem especially interested in misconceptions about Rome – scholarly and otherwise. What have people who’ve studied Rome gotten wrong?

I think in some ways they’ve been too trusting of what they read. And I want to leave people with the fun of the Romans – Romulus and Remus, wicked emperors – I don’t want to take that fun away. We can have the fun, but we can still realize that all these stories we know about the Romans are constructed in much more complicated ways than that – and in some ways are much more interesting. In some ways, you have to think hard about what the Romans are telling you. You have to think hard, for example, about what the story of Romulus and Remus is really about. Someone named Romulus never existed. But the story is so important for understanding what Rome thought it was all about that it’s worth looking very hard at. So it is a bit of a balancing act I try to do. I don’t want to come along and say, “Oh, look, don’t believe any of this. This is all written by people who didn’t know…” I want to say, “Look, this is not necessarily literally true, and we can have fun with these stories. But there’s something really much more important: Why these stories are told is really interesting.” I think the same thing about the wickednesses of the emperors, like what Tiberius got up to in the swimming pool… It’s important to understand why those stories were told. They were part of a propaganda machine. We have no idea what Tiberius did in his swimming pool in Capri, and neither do the Romans who tell us about it. But we can also see we can learn a lot about Roman perceptions of power, about the wickedness of autocracy gone wrong, from those stories. In some ways we’ve inherited them. We still have a terrible anxiety… Look at Berlusconi and swimming pools. So we don’t have to believe [any of this] is literally true – but we can think hard about how we still think in Roman terms, how the Romans have given us our sense of what corruption is, what excess is… When we close our eyes and think about decadence, we have a Roman image in mind.
I wonder if this is true in Britain as well: In the States, a lot of us think of the Greeks as having come up with theater, democracy, philosophy, while the Romans were pragmatic, warlike, and – as you say – eventually decadent. Is there anything to these stereotypes?

That is the standard stereotype now… I’m actually quite keen on running water and lavatories and transport systems, so I think we shouldn’t knock those practical things. But I also think that at the heart of the Roman republic was an equally powerful and really important idea, which was liberty. Your founding fathers were well read in Roman views of libertas – they were not all that interested in Greek ideals of democracy. At least, the Athenian idea of democracy – most city-states were not democratic and thought democracy was a very bad thing indeed. What the Romans constructed was a version of civil liberties that we’ve inherited – what are the rights of the citizen against the power of the state? What about the right of the citizen to trial, to no arbitrary punishment? That’s an issue we face with terrorism every day now, as the Romans did: What’s the clash between the demands of homeland security and the rights of the individual? We haven’t solved that, and they hadn’t solved that. The other thing about the Romans: They’re trying to do popular politics on a massive scale. Fifth-century Athens – maximum 40,000 citizens. That’s the size of a college campus. The Romans are thinking about it with 700,000 citizens – that’s very different, and that’s our problem.

You have a lively chapter on Augustus. How similar was the real Octavian to the character in “I, Claudius”?

I still remember the “I, Claudius” television series, with the long Augustus death scene that went on for minutes and minutes. The problem with Augustus: He’s a complete mystery. Here’s the guy who actually manages to establish a system of one-man rule that lasts for almost two centuries. Relatively uncontested. And yet we have the problem we have with some of our political leaders: He starts out life as a nasty, illegitimate thug who raises a private army and is effectively a warlord. He manages a transformation from warlord to elder statesman in a way that no other politician I know has ever done so speedily and so completely. It’s the big mystery of that period of Roman history. It’s a mystery the Romans themselves reflected on. They said that basically he was a chameleon. Here’s a guy who took over the Roman state, defeated all his rivals, tearing out their eyes with his bare hands… And then, fast-forward 10 years, and he’s the father of his country, standing up for good old-fashioned Roman morality and becoming the emperor that everyone else wanted to be! Will the real Augustus please stand up?

A lot of scholars don’t enjoy being out in the larger world. Do you wish more academics would exist in the broader society the way you do?

Yes, I do wish they would – it’s a great privilege to work, and be paid, studying the Romans. It makes you study history, and think about now, quite intensely. And to some extent, because the academy is still relatively protected and people can speak without losing their jobs, there is an obligation to speak – not to rant, not to shout… The connection between the academy and the wider world should be stronger than it is. I don’t mean at all that everyone [needs to be outspoken]. I hope there is still a role for people who spend their lives in the library looking at three lines of Aeschylus, or Homer, or whatever….
Not every academic has to be like me.

Do you still make an effort to chase down bullies on Twitter?

Yeah! They tend to keep off me now – so I have rather less opportunity to do this. I never had a plan or a strategy – I just did what came naturally. When people said awful things about me, I’d say, “That isn’t true. Please apologize, please take that down.” And it’s amazing how often they do.







Published on November 28, 2015 12:30
Right-wing Trump denialism: Conservatives like Bill Kristol are struggling with Trump’s popularity
The Donald Trump 2016 freakshow keeps on gaining momentum as it slides deeper into the pit of human misery and despair. For those who haven’t been following, the proto-fascist and nakedly xenophobic Republican presidential front-runner found a new way to broadcast his utter lack of human emotion: mocking a New York Times reporter for his physical disability. Trump denies he did any such thing, which is just another lie to throw on the pile. And if history is any guide, the whole episode will merely cement his supporters’ affection for him. In one way or another, Republicans are stuck with Trump. The durability of his support means you can’t just brush him off as a non-credible threat to win the nomination. And even if he does collapse at some point, he’s already succeeded in dragging the primary down to his own level. Other candidates in the race are reacting to Trump, shifting further rightward to better align themselves with his extremism, and eschewing direct criticism of the man so that they can position themselves to poach his constituency. Like it or not, the GOP is the party of Trump.

What’s remarkable, though, is how many Republican and conservative elites deny this reality even as it screams “MAKE AMERICA GREAT AGAIN” right in their faces. On Friday morning, the consistently wrong and bafflingly influential William Kristol tweeted that Trump, despite all outward indicators, lacks “genuine staying power.” It’s an amusing take for several reasons. First off, Trump has been the dominant front-runner in state and national polling for four months running, which would seem to indicate that he has some measure of “staying power.” The Atlantic’s Molly Ball went to a Trump rally in South Carolina and came away with the impression that Trump’s people are clear-eyed and determined in their choice of candidate: “Perhaps the people who first glommed on to his celebrity got bored and drifted away. But if so, they didn’t find anybody else they liked. And they came back. And now, they are not leaving.” Also, there’s the inconvenient fact that Kristol has been incorrectly predicting Trump’s collapse for a long time now, going back to his July warning that Trump’s attack on John McCain’s military service would be “the beginning of the end.” In the months since, he’s said that we’ve passed Peak Trump, that Trump’s political stock was poised to crash, that “normal Americans” had grown sick of him, that he’d begun to fade, and that Trump had once again reached “the beginning of the end.”

Lastly, it was Kristol, you may recall, who gave the GOP the gift of Sarah Palin. Palin’s and Trump’s political styles are very similar – policy-light, resentment-heavy, personality-driven – and after the former Alaska governor was vaulted to the top of Republican politics in 2008, she won the adoration of conservative activists and mainstream Republicans alike with her folksy, inane, “you betcha” shtick. She was an early beneficiary of the same conservative backlash against establishment Republicans that Trump is currently profiting from. You could rightly argue that Palin differs from Trump in that she actually held elected office and had something of a political background to undergird her rise, but she remained popular well into the cartoonish, “death panel” phase of her post-government career. So it’s a bit strange that, after he helped make a conservative star out of Palin, Kristol can’t believe the party would also coalesce around Trump.
That gets to the conservative denialism surrounding Trump: The elites of the movement and the Republican Party happily encouraged and nurtured the same forces that have empowered Trump because they offered the promise of short-term political gain. The Trump phenomenon shows how those forces have grown beyond their control. As Brian Beutler writes at the New Republic, some Republicans and conservatives are, at this late hour, recognizing the threat posed to them by Trump and starting to grapple with the fact that they are the authors of their own political misfortune. But there’s still a sizeable contingent of right-wing power brokers who just can’t believe that Donald Trump is the candidate they deserve.







Published on November 28, 2015 11:00