Helen H. Moore's Blog, page 863

February 15, 2016

Our Barbies, ourselves: The new diverse Barbies are progress — but we’re giving them too much power

When I was a little girl I wrote the American Girl company, suggesting that they create a Jewish American doll. At the time, there were just a few American Girl characters—Kirsten, a blond, blue-eyed little girl who represented the Pioneer era; Samantha, a turn-of-the-20th-century city girl with baby bangs and long, dark curls; and Molly, a WWII-era suburban girl with auburn hair in braids and glasses. Each girl was not merely a doll you could play with, but came with her own unique story set within a specific historical time period. I loved learning about Kirsten, Samantha and Molly, and was thrilled when other girls were introduced to the series, like Felicity, a tenacious redhead growing up in Colonial America, and Addy, the first African-American American Girl, who represented the Civil War era. I’m not sure exactly what I expected when I sent my letter. Perhaps I thought that they would issue my own personal doll, with my specific family story, traditions, experiences, with my hair and eye color, and personality. Instead, I received a letter back saying that the doll I dreamed of was not in the works. The letter suggested that I could simply pretend that one of their current dolls was Jewish, since Judaism was a religion that any of the girls could practice. Today, long after I’ve already grown up, the American Girl company has released other girls, ones who reflect other time periods and cultural, ethnic and religious groups. Indeed, today there is a Jewish-American historical doll named Rebecca Rubin, the daughter of Russian-Jewish immigrants in early 20th century New York City. I’m not sure if Rebecca would have quelled my desire to be represented when I was young or whether I would have recognized myself in Rebecca Rubin; her world of New York City life may have seemed equally foreign to a child who grew up eating guava paste and cream cheese for dessert and hearing stories about the Cuban Missile Crisis. I’m sure another part of the American Girl company’s response to my letter was that there just aren’t enough Cuban-Jewish-American girls to make my niche concerns seem important, or certainly economically viable. Corporations aren’t moved by earnest letters from small children; they make changes in order to adapt to new markets. Which is why the new squad of diverse Barbies, for the first time featuring not only a range of body types, but much greater diversity in types of hair and facial features, is both bittersweet and frustrating. Designed to appeal to “millennial moms,” new Barbie is about celebrating difference, rather than simply conforming to narrow beauty standards. Commercials for these new Barbies are carefully calculated feel-good feminist PSAs featuring girls with lots of different body types, from all sorts of ethnic and racial backgrounds, enjoying playing with these new toys. As someone who grew up with Barbie, it’s hard not to feel a little emotional when viewing these ads, which are a far cry from the Barbies I grew up with in the ’80s and ’90s, which boasted hair-crimping technology, fake tattoos and a hatred of math. Unlike previous iterations of “diverse” dolls, the newly updated Barbies offer an impressive seven skin tones, 22 eye colors, and 24 hair styles. They come in sets where girls and boys can imagine Barbie as everything from computer programmer to president. Some critics say that Barbie has too much baggage to be a successful reboot. 
A stirring piece by Mona Awad says that “Barbie will always be Barbie,” and I think it’s true that “traditional” Barbie is still currently the standard by which these new Barbies are compared. Certainly for women of my generation, there is only one iconic Barbie. At the same time, it wouldn’t surprise me if girls today came to look back on a time with only one kind of Barbie to choose from as a relic from a bygone era. Mattel’s reboot of Barbie may successfully do what smaller companies have been trying to do for years—offer more realistic-looking dolls that girls can look up to. For a long time, parents had the option of buying any number of “body positive” dolls from a range of smaller companies. “Tree Change Dolls,” created by Tasmanian artist Sonia Singh, are recycled dolls with the heavy makeup removed, giving dolls a more natural, childlike look. Likewise, the “Lammily” doll, created by Nickolay Lamm, is modeled on the average American woman’s height and weight measurements, and allows girls to customize their dolls with “real” features like freckles and cellulite. Though companies competing with Mattel always showed kids preferring these dolls to more traditional Barbies, it’s also clear that Barbie still sets the standard for dolls today. By shifting the standard Barbie, Mattel also seems to be giving us, the consumers, permission to be more body positive. On the surface this seems to be a sea change in the ways that toys are marketed to young people. Yet the bottom line with many of the progressive initiatives to revamp our toys—from Lego unveiling its first mini figure with a wheelchair to GoldieBlox’s toys for girl engineers—is that our identity is shaped by consumer culture more than ever. Just because companies are noticing the diversity of human experience (and the innumerable untapped markets they’ve yet to explore) doesn’t mean that it is good for us, any more than it means that big name brands hawking these goods have any real interest in our self-esteem. After all, corporations have long used progressive values to sell products—from the “You’ve Come a Long Way, Baby” Virginia Slims ads of the ’60s and ’70s, to the more recent Dove body-positive campaigns funded by the same corporation that sells us Axe’s brand of neo-masculinity. Certainly, offering kids greater space to see themselves in the toys they play with is a good thing, but we also don’t have any tangible evidence that body-positive campaigns have done anything to make women feel better about themselves over the years. If anything, our current fascinations with plastic surgery, thigh gap selfies and waist trainers seem to suggest that women are as obsessed with bodily perfection as ever. We do, however, have evidence that women are interacting directly with consumer culture in new and interesting ways. While my handwritten letter to the American Girl company is now lost somewhere in my parents' attic, in today’s world enthusiastic consumers are constantly demanding that companies create more diverse and inclusive products and sharing company responses on Twitter. Today, companies also have sizable competition from creative and savvy consumers. Nigerian scientist Haneefa Adam, for example, created “Hijarbie,” who wears more modest clothes and a headscarf. 
On her Instagram, Adam describes her excitement at featuring more diverse Barbies of color in the future: “I couldn’t find the different types in Nigeria…I’d have loved to dress up a black doll myself too.” Of course, one of the most exciting aspects of the new Barbie initiative is holding up real-life women as “sheroes” for women of all ages to look up to, starting with the Ava DuVernay doll, complete with her very own director’s chair and comfortable-looking sneakers. The choice to feature an accomplished black female film director is a revolutionary reimagining of what it means to be an American doll icon. In Ariel Levy’s essay “Dolls and Feelings,” Jill Soloway revels in how playing with dolls teaches girls how to be fiercely competent directors: “We all know how to do it. We fucking grew up doing it! It’s dolls. How did men make us think we weren’t good at this? It’s dolls and feelings.” My hope for the world of New Barbie is that the emphasis continues to be on allowing girls to imagine new worlds and new possibilities, both for how they see themselves and how they see the world around them. The demand for greater representation in the toys children see and play with is genuine and necessary, but it’s also important that girls (and boys) learn to see themselves as more than just the sum of the pre-packaged plastic bodies they love to play with. I think we can appreciate these creative steps, while still being skeptical about the extent to which consumer culture can help set us free from stereotypes and poor body image. After all, even if they now come in various shapes and sizes, every single newly purchased Barbie still has to be broken out of her box.

Published on February 15, 2016 12:30

The big money behind best in show: How the Westminster Dog Show’s four-legged TV celebrities fuel the high-dollar puppy business

When the beagle Miss P took Best in Show honors at the 2015 Westminster Kennel Club Dog Show, it was reported as a huge upset within the fancier world. Miss P, who came from a small Canadian kennel, outshined the odds-on favorite, Matisse, a Portuguese water dog who was as close to American royalty as it gets — a cousin of President Obama’s second puppy Sunny, with a big-money co-owner in Milan Lint, managing director and COO for Portfolio Management Group at the global investment firm BlackRock. Matisse, at least in terms of the people around him, was the dog far more like Sky, the wire fox terrier who won Westminster in 2014—another dog in whom wealthy people owned shares like a corporation. In fact, according to multiple sources, Sky had no fewer than five owners whose financial backing originated in South America, Europe and beyond. They included Victor Malzoni Jr., a construction magnate and owner of Hampton Court Kennels in Brazil who sponsors the campaigns for a half-dozen show dogs a year in the United States, plus more at shows in Europe. Another of Sky’s owners was reportedly Torie Steele of Malibu, California, who made her fortune helping Italian fashion designers such as Valentino and Versace enter the U.S. market through her flagship Torie Steele Boutique on Rodeo Drive in Beverly Hills. She was the second wife of billionaire Sam Wyly, who has owned everything from the craft-store chain Michaels to the Maverick Capital hedge fund. People who participate in events like Westminster are loath to talk about how much money is really at stake in the show ring, but the sale of purebred dogs worldwide is a multibillion-dollar industry — not just for the award-winning breeders, but also for the owners of large-scale breeding farms, pet-store puppy distributors, auctioneers and pet-store owners who will ultimately fill the majority of demand for puppies who look like the television stars Miss P and Matisse and Sky, purebred puppies that must be produced en masse after a high-profile win. The American Kennel Club, which sanctions the Westminster show, admits freely, in a Kennel Spotlight magazine article, that driving up business is one of the main purposes of such televised events: “How do these AKC events help breeders? They create preference and demand for purebreds, no matter where the consumer chooses to buy their purebred dog.” Learning how that demand is filled goes far beyond what most of us have seen in depressing “puppy mill” commercials. Following the money to see just how big a market dogs are worldwide today can be absolutely mind-boggling. For starters, there’s the booming business of champion dogs’ frozen semen. That money-maker is old news in the breeding world; the American Kennel Club first recognized a litter of puppies conceived from frozen semen back in 1981. With the ability to create an unnatural number of champion-lineage puppies from the stored stuff, a Westminster winner no longer even has to be alive to keep the money coming in for his owners. Such was the case of a Standard Poodle named Peter, the late Westminster Best in Show winner whose frozen semen was accidentally thawed and destroyed at a Pennsylvania clinic in 2010. His owners sued and in a precedent-setting case were awarded more than $200,000, which the jury determined was the value of the samples. Had the owners sued for the value of the puppies that could have come from each sample, the court award easily could have risen to more than $1 million. 
Then there are the dog auctions, where the owners of large-scale kennels (some would call them puppy mills) gather to outbid one another while purchasing new breeding stock. The largest legal dog auction in America is Southwest Auction Service in Wheaton, Missouri, whose owner, Bob Hughes, once sold an English Bulldog for $12,600. Hughes’ largest auction to date, according to his marketing materials, brought in $514,371. On advertised days inside his barn, anyone wanting to get into the breeding business can walk in off the street, grab a snack at his concession stand, sit in his bleachers and bid on everything from Chihuahuas to Golden Retrievers to Rottweilers to Yorkshire Terriers, sometimes brought out at a pace of more than 300 dogs per auction. “We’ll only know what these dogs are really worth when the American Kennel Club lets me hold an auction just after Westminster one year, using the champion,” he told me. “They do it with horses. Some horses go for a quarter-million dollars apiece. What if Tiger Woods’ wife likes a certain kind of dog and wants to bid up the price to get the best one? We’ll never know the truth on these prices until they let me do an auction of the champion dogs.” There also are the owners of legal, regulated puppy farms, guys like Dave Miller, former president of the Missouri Pet Breeders Association. He was a bird hunter who had raised some hunting dogs and, by a stroke of luck, got in on the Puggle craze that hit America in the early 2000s. In his first year, with just three females, he sold $10,000 worth of Puggle puppies, and he eventually invested about $180,000 to build outdoor pens and kennels that meet state and federal requirements. His farm grew to become just one among about 2,600 commercial puppy operations in the central United States large enough to require licensing by the federal government. Miller has fifty or so adult Newfoundlands, Beagles, Shiba Inus, Corgis and Puggles living outside behind his home. As of 2012, he says, he was grossing about $140,000 per year. And if Miller’s operation sounds big, it’s nothing compared with the Hunte Corporation, which, at its height before the 2007 global recession, was moving some 90,000 puppies a year from farms owned by people like Miller into pet stores all across America, generating annual revenues of $26 million. Today, Hunte’s business is about half that, all coming out of its $10 million facility in Goodman, Missouri, near the corporate headquarters of Walmart and Tyson Foods. Hunte’s kennels can house somewhere in the vicinity of 1,300 puppies at a time, and the inventory of puppies turns over once a week, starting with “buy day” on Tuesdays and ending with distribution to pet stores via semi-trailer trucks that cost about $350,000 to custom-outfit with climate-control systems, LED lighting and, of course, cages to hold the merchandise. If anybody has a problem with any of that, Hunte’s officials say, they’re welcome to try to find a puppy distributor who does things better when demand spikes after televised dog shows like Westminster. “Go up the road to our competitor,” says Michael Stolkey, director of corporate sales. “See if they’ll let you in.” Trying to see the scope of the business beyond the dog show can be like looking through binoculars into a fog. The co-owners of Miss P, the winning beagle from the 2015 Westminster Kennel Club Dog Show, say their kennel in Canada produces just one or two litters of puppies a year. 
The effect of that beagle being beamed onto television sets worldwide seems like a whole lot of nothing—upper-class dog fanciers navel-gazing and self-congratulating on live television—but in reality, they’re setting the top end of a market that is vast, entrenched and highly profitable for thousands of other people. Lest there be any confusion about why the AKC sanctions events like the Westminster show, simply refer back to that article that the American Kennel Club published in Kennel Spotlight magazine for commercial breeders to read. The title? Pure and simple: “AKC Creates Demand for Purebred Dogs.”

Adapted from "The Dog Merchants: Inside the Big Business of Breeders, Pet Stores, and Rescuers," to be published May 2 by Pegasus Books.

Published on February 15, 2016 11:00

Hillary Clinton’s strategy shift: Why she’s wise to run for a “third Obama term”

After Thursday's debate it's clear what Hillary Clinton's closing message is: “Vote for me if you want a third Obama term.” Clinton mentioned Obama 21 times during the debate, and almost every time it was to imply that she's a stronger supporter of the president and, more important, better prepared to continue his legacy. She even dodged questions about her financial ties to Wall Street by noting that Obama accepted more donations from bankers than any Democratic candidate in history. The point wasn't to bash Obama but rather to reject the idea that “if you take donations from Wall Street, you can't be independent.” The most contentious – and revealing – moment of the debate was arguably when Clinton went right after Sanders on his criticisms of the president:
“I want to follow up on something having to do with leadership, because, you know, today Senator Sanders said that President Obama failed the presidential leadership test. And this is not the first time that he has criticized President Obama. In the past he has called him weak. He has called him a disappointment. He wrote a foreword for a book that basically argued voters should have buyers' remorse when it comes to President Obama's leadership and legacy. And I just couldn't disagree more with those kinds of comments. You know, from my perspective, maybe because I understand what President Obama inherited, not only the worst financial crisis but the antipathy of the Republicans in Congress, I don't think he gets the credit he deserves for being a president who got us out of that...The kind of criticism that we've heard from Senator Sanders about our president I expect from Republicans. I do not expect from someone running for the Democratic nomination to succeed President Obama.”
The subtext is hard to miss here: Clinton is suggesting that Sanders is unrealistic, doesn't support Obama's agenda, and can't be relied upon to carry on his work. Sanders, for his part, pushed back: “Madam Secretary, that is a low blow. I have worked with President Obama for the last seven years...But you know what? Last I heard we lived in a democratic society. Last I heard, a United States senator had the right to disagree with the president, including a president who has done such an extraordinary job.” Tactics aside, Clinton's decision to align herself with the president is an interesting one, and it's more than defensible. Clinton's philosophy and message are much closer to Obama's than Sanders'. In a recent conversation with Vox's Ezra Klein, veterans of Obama's 2008 and 2012 campaigns talked about their frustrations with and admiration of Sanders. “Obama got in the race to be president,” said Dan Pfeiffer, Obama's communications director in 2008, “and Sanders got in the race to send a message. And you can see that difference in their approaches to policymaking. Obama wouldn't support a policy unless he felt it was feasible if he was president. Sanders doesn't seem to have that limitation, which gives him more message purity...but is a huge substantive and political problem if he ends up in the White House.” There is a lot of truth to this. Clinton is a pragmatic incrementalist, like Obama. She appears to share several of the same goals as Sanders, but is much more cautious – or cynical, depending on whom you ask – about the prospects of achieving them. Obama, despite his transformational message, wasn't, as David Axelrod says, “the candidate of the left.” Instead, he wanted to change the climate of Washington, make it more congenial and less ideological. Bernie, on the other hand, is a pugilist by nature; he wants to shake everything up. Perhaps the biggest difference between Sanders and Obama is their contrasting approaches to corporate power. Jon Favreau, Obama's speechwriter in 2008, elaborates:
“It's not just that Obama doesn't think that's feasible [purging corporate interests from the process], it's that he doesn't think that's the right way to govern in a pluralistic democracy where everyone gets a voice. Obama believes that there's too many Americans who don't have a voice, and too many Americans don't have opportunity, and that a big reason for that is the power of special interests and big corporations. But he also believes that there's a place for those interests and corporations in our system.”
This is an area in which Obama and Clinton are in lockstep. I'm not sure Sanders wants to completely expunge Big Business from the process, but he certainly wants something close to that. Which of course is a great idea. The problem, though, at least from the perspective of Clinton and Obama, is that it's not possible. Even on an issue like single-payer healthcare, which Obama (and Clinton) would likely prefer, the president couldn't even get his Democratic colleagues to back him on the public option. It's not at all clear how Sanders could overcome this. The point, in any case, is that Clinton and Obama are very much alike. They may hold progressive principles (although Clinton's record invites skepticism), but they govern like calculating centrists. Sanders speaks for those tired of this approach, who regard it as an ethos of capitulation, not pragmatism. This is a difficult position for Clinton, mostly because telling people what they can't do isn't inspiring. But it may be enough to secure her the nomination. We'll find out soon enough.

Published on February 15, 2016 09:00

How does the gun industry live with itself? The psychology behind inhumanity

The gun industry attributes the blame for the escalating lethality of its firearms to public demand. Recall Olberg’s explanation of why Smith & Wesson was selling small lethal pistols that are easy to conceal: “I sell the guns that the market is demanding” (Albright, Alexander, Arvidson, & Eason, 1981). As we have seen, innovative lethality was driven by sagging gun sales and the battle among gun makers for the small-gun market, not by consumer demand. Another form of moral evasion by selective attribution of blame is to disembody the gun from the shooter in a false dichotomy that places the blame entirely on the shooter. This is apparent in the NRA’s exonerative causal slogan “Guns don’t kill people. People kill people.” Removal of the gun from the mix of causal factors absolves guns—and by extension their manufacturers—of any role in gun violence. This causal detachment is analogous to claiming that it is people, not carcinogenic cigarettes, that are a major determinant of lung cancer. Human agency is executed through means. The gun toter is the agent. The gun industry provides lethal means to achieve desired ends. Guns vary in killing power. In a mass shooting with a semiautomatic rifle, the size of the magazine and the ease of reloading determine the scope of the slaughter. For example, in the massacre in Fort Hood in 2009, Nidal Hasan was able to exact a heavy toll by extending the 10-round magazine to 30 rounds with easy reloading: “He was dropping his magazines and reloading in a matter of seconds” (Fernandez, 2013). Because of the semiautomatic feature, mass shootings last less than five minutes (Schmidt, 2014). The make of the gun and the size of the magazine in the hands of the shooter determine how many people die and are maimed. For someone intent on killing a lot of people, a semiautomatic weapon with a large-capacity magazine is the means of choice. Its killing power is tragically displayed in the rash of mass shootings. In accord with modeling theory, the recent years have witnessed a sharp rise in the number of mass shootings and the number of people killed in each tragic assault (Schmidt, 2014). A gun is inanimate, but through its intended firepower it also influences a person’s sense of agency (Selinger, 2012). The agentic transformative power of a gun is captured in an ad from an earlier era for the Colt .45 gun. It was often called the great equalizer that made the little man as big as the largest man in the West. The amount of carnage in a shooting spree is determined by the complex interplay of the psychological makeup of the shooter, the social conditions that drove the shooter to commit mass murder, the nature of the setting, and the lethality of the gun. From a moral standpoint, in addition to the shooter, the gun industry bears some responsibility for developing and marketing guns of ever-greater lethality in the competition for market share. The gun lobby is also a contributor to the causal mix by blocking any gun reforms regarding the lethality of the firearms and bullets being marketed and by staunchly defending easy access to them. Lawmakers beholden to the gun lobby are facilitators as well. In short, mass killing is determined by a complex array of facilitators, rather than solely by the psychological makeup of the assailant. The causal cliché is widely used for deflecting accountability. William Ruger, a gun manufacturer, argued, “Guns are a matter of individual responsibility. 
You keep coming back to the fact that people kill people, not guns” (Ayres, 1994). Professed helplessness to do anything about mass killings is another form of moral evasion for opposing any constraints on the gun industry. One lobbyist used this type of evasion in opposing a community’s effort to ban bullets that can penetrate police body armor. In his argument, in which he euphemistically refers to armor-piercing bullets as “objects,” he claimed that you can’t moderate behavior by controlling objects. Quite the contrary. Fewer police are likely to be slain with armor-piercing bullets if they are banned with strict enforcement than if the bullets are freely available.

Claimed Futility of Gun Regulations

Construing gun regulation as futile is a novel form of moral disengagement in gun violence operating at the effects locus of moral control. In a mass shooting at a movie theater in Aurora, Colorado, one of the weapons was a military-style semiautomatic rifle. As we have seen, this type of assault weapon had been banned for civilian use. However, lawmakers allowed the ban to expire in 2004. Immediately after the mass killing, the governor of Colorado argued against stricter regulation of the gun industry: “If there were no assault weapons available and no this or no that, this guy is going to find something, right? He’s going to know how to create a bomb” (Crummy, 2012). Recall that the killing power of firearms is heightened by enlarged ammunition clips and semiautomatic firing. The Aurora gunman used a semiautomatic rifle equipped with a 100-bullet magazine. Countless lives were spared because his gun jammed partway through his planned massacre. The governor used the alleged inevitability of mass killings by other means as a justification for exempting firearms from regulatory consideration. Little attention is paid to the government’s obligation to protect the public’s right to safety. A gunman in a suburban Wisconsin town killed three women and wounded four others in a spa where his estranged wife worked. The mayor of the town repeated the futility justifications that fend off public demands for gun reforms: “Try as we might, these can’t be avoided” (Yaccino & Davey, 2012). Gun-rights advocates point to the fact that killers obtain guns legally as evidence that regulations won’t stop massacres. They argue that mentally unstable people, not guns, are the problem. A gunman intent on killing as many people as possible obviously can massacre more of them with magazines that hold 30 rounds than with ones that hold 10 bullets. In arguing against a proposed law in Colorado to limit the capacity of ammunition clips, one lawmaker claimed that “it makes no difference to public safety if there are 10 rounds in a magazine, whether there are 15 rounds in a magazine or whether there are 30 rounds” (Frosch, 2013). Gun advocates who oppose the regulation of firearms on the grounds that they cannot be regulated run the risk of hoisting themselves with their own petard. If a society faces the threat of repeated massacres of innocent people by killers using military-style weapons that no amount of regulation can prevent, society has a moral obligation to protect its citizens by banning such weapons. It cannot be a helpless victim of the gun industry. What the claim of the futility of gun regulations ignores is a fundamental question: What are military-style semiautomatic weapons with unlimited ammunition clips doing as merchandisable lethal products in a civil society? 
In keeping with the selective allocation of blame, the solution proposed by the gun industry is to increase the severity of punishment for crimes committed with a gun. This translates into lengthier prison terms: “Harsh sentences for gun criminals,” as Froman (2007) puts it. The massive growth of the prison population is imposing increasing social and economic burdens on society. We turn to the high societal cost of the remedy proposed by the gun industry next.

Types of Public Victimization by Crime

Crimes victimize people in three major ways. The crime itself victimizes them. It also impairs the quality of life in the community at large. A few random shootings can strike fear in an entire community. Fear for their own safety permeates people’s lives. Many people are arming themselves. They live behind bolted doors, avoid most downtown areas, and desert their streets at night. A district attorney describes in concrete terms the life-constricting effects of feared violence: “Gun violence is what makes people afraid to go to the corner store at night” (Ludwig & Cook, 2003). Crime rates have been declining but, paradoxically, fear of criminal victimization is rising. A campaign to arm the populace requires high arousal of public fear. Carrying concealed guns in public places has been legalized by state and local legislatures. The fact that people are walking around with concealed weapons renders more of the public environment threatening. One senator introduced a bill on the Senate floor that would allow individuals from states permitting concealed gun carrying to arm themselves while visiting other states (Collins, 2009). The senator argued, for example, that this interstate gun-carrying license would make Central Park “a much safer place.” The ill-chosen example backfired and probably contributed to the narrow defeat of his bill. Introducing guns into Central Park could not make it safer because there had not been a gun homicide in the park for years. Converting Central Park into a gun-carrying zone could only make it a scary place. The second societal cost of gun violence, which bears on the gun industry’s prescription of longer jail terms, is the heavy drain of prison costs on tax revenues. Lengthier and mandatory prison terms cram the prisons. In California it costs about $47,000 per year to incarcerate an inmate (Legislative Analyst’s Office, 2014). Lengthier prison sentences have a significant impact on how governmental resources are spent. Higher education and prisons compete for money from the same general fund. The public demands that criminals be put away for long stretches, but is unwilling to pay the heavy costs. Indeed, legislators get voted out of office if they raise taxes. As a consequence, prisons are draining funds for higher education. After adjustment for inflation, since 1980 spending has decreased by 13% for higher education but has swelled by 436% for prisons. California now spends more on prisons than on higher education (Sankin, 2012). As university budgets and financial aid shrink, tuition increases are used to cover the shortfalls. The diversion of scarce resources from education to prisons drives out the neediest students from higher education. The irony of this budgetary diversion is that education provides the best escape from crime and poverty. In the third public victimization by crime, the huge cost of operating the prison system detracts from educational and developmental programs during children’s early formative phase of life. 
Such enabling guidance equips children with personal resources for a prosocial life path. School failure, accompanied by association with antisocial peers, forecloses many prosocial options in later life (Patterson, 1986). Dropping out of school increases the likelihood of incarceration and joblessness, which incur high social and economic costs (Sum, Khatiwada, McLaughlin, & Palma, 2009). Once youths get into trouble with the law, they cycle through the prison system, with most coming out worse than when they went in. Investment in developmental programs that cultivate children’s interests, aspirations, competencies, and resilient beliefs in their efficacy to realize their hopes pays large future dividends. Hawkins and his collaborators demonstrate how early efforts to promote academic and social development yield huge long-term benefits (Hawkins, Catalano, Kosterman, Abbott, & Hill, 1999). In this school-based program offered in the elementary grades, teachers were taught how to manage classroom behavior and promote academic development. Parents were taught parenting skills and how to support their children’s academic work. And the students were taught how to manage interpersonal problems and resist peer pressure to engage in transgressive activities. The effects of this early multifaceted effort were assessed in a six-year follow-up when the students were 17 years old and in high school. Compared with children in matched control schools that did not offer the program, those who had the benefit of this early developmental aid were more likely to remain in school, were less likely to repeat grades, had higher academic achievement, and were less likely to commit violent crimes, take up heavy drinking, father a baby, or give birth to one. The children from poor, crime-ridden areas benefitted the most. Society can provide prosocial guidance for children living in disadvantaged conditions or pay dearly later. Youth violence is better reduced by investment in education than investment in incarceration.

Derogation of Opponents

A poorly regulated lethal product that is shielded from civil liability and incurs high economic and social costs predictably draws heavy critical fire. Challenges to the morality and civic responsibility of the gun industry’s troubling practices are met with infuriated reactions by progun advocates. They see themselves as patriotic defenders of freedom, unjustly attacked by an arrogant, elitist minority bent on banning guns from society. During his presidency of the NRA, Charlton Heston launched the most blistering counterattack. He directed his heaviest fire at the press, especially their coverage of publicly alarming shootings (Heston, 1999). In his portrayal, “[t]his harvest of hatred is . . . sold as news, as entertainment, as governmental policy” as “reporters perch like vultures” and “news anchors race to drench their microphones in the tears of victims” (Heston, 1999). The reason for the “screeching hyperbole leveled at the gun owners” is that “their story needs a villain. . . . And we’re often cast as the villain.” Concerning media requests for interviews after a tragic shooting, Heston said, “The countless requests we’ve received for media appearances are in fact summons [sic] to public floggings, where those who hate firearms will predictably don the white hat and hand us the black.” Heston turned his wrath on political advocates of gun regulation as well. 
Members of the Clinton administration, whom he called “Clinton’s cultural shock troops,” were his archenemies (Heston, 1997). With his election to the presidency, Obama drew the heavy fire. Wayne LaPierre likened him to a “South American dictator” bent on eradicating the Second Amendment. In his conspiratorial analysis, LaPierre warned NRA members that Obama was trying to “lull gun owners to sleep to win re-election.” “Lip service to gun owners,” LaPierre warned, is just “part of a massive Obama conspiracy” to deceive voters and hide his true intentions to destroy the Second Amendment during his second term (Markon, 2012). In his rallying cry at an NRA convention, LaPierre (2012b) beseeched his followers to “save America and our freedom.” Other members of the NRA also characterize gun policies in terms of repressive police control. For example, gun regulation is called “political terrorism.” Development of a gun-tracking system is “police control,” and federal agents are “jack-booted government thugs.” Proponents of gun regulation are “loony leftists.” It works both ways, however. One former top lobbyist for the gun industry had some uncharitable things to say about the NRA: “You have a situation where you have a bunch of right-wing wackos at the NRA who are controlling everything” (Butterfield, 2003). And the NRA resents being called a “merchant of death.” The fierce factional dispute is not about guns per se, Heston explained. Rather, it is just one aspect of the larger cultural war, construed by progun advocates as one between arrogant elitists and rank-and-file Americans who love their country and are courageous guardians of America’s cherished values and freedoms (Heston, 1997). The reframing of the nature of this war is larded with widely used oppressive imagery of “thought police,” “lock-step conformity,” “cultural warlords,” “self-appointed social engineers,” “Clinton’s cultural warriors,” and “apologist for criminals.” Within this wrathful declamation, Heston (1999) incongruously presents himself as a judicious conciliator seeking to restore harmony between the warring factions: “I am asking all of us, on both sides, to take one step back from the edge of that cliff. Then another step and another, however many it takes to get back to that place where we’re all Americans again.” The mission of the NRA, in Heston’s clarion call to his constituents, is to defend hard-fought freedoms from zealous gun haters. In the emotive discourse, guns are linked to a list of other types of freedoms:
"Our mission is to remain a steady beacon of strength and support for the Second Amendment, even if it has no other friend on the planet. We cannot let tragedy lay waste to [sic] the most rare and hard-won human right in history. A nation cannot gain safety by giving up freedom. This truth is older than our country. 'Those who would give up essential liberty, to purchase a little temporary safety, deserve neither liberty not safety.' Ben Franklin said that. If you like your freedoms of speech and of religion, freedom from search and seizure, freedom of the press and of privacy, to assemble and to redress grievances, then you’d better give them that eternal bodyguard called the Second Amendment. The individual right to bear arms is freedom’s insurance policy, not just for your children but for infinite generations to come. That is its singular, sacred beauty, and why we preserve it so fiercely.” (Heston, 1999)
Guns are sanctified not only by association with other cherished freedoms but also by linkage to broader sociopolitical matters that resonate strongly with most NRA constituents. This is achieved by establishing one’s moral credentials through past conduct. Having behaved charitably or righteously establishes one as a good person with license to behave prejudicially in the future. This process of self-entitlement to prejudicial conduct is well documented by Monin and his collaborators across diverse areas of functioning (Effron, Cameron, & Monin, 2009; Monin & Miller, 2001). Heston used his march with Martin Luther King Jr. for civil rights as his moral voucher to courageously champion “white pride” in the nation’s founders, who created the constitutional gun right. This moral self-license spilled over into indiscriminate condemnation of entire categories of people who, in Heston’s view, undermine the social and moral order, including feminists, homosexuals, African Americans, and new age religionists:

The Constitution was handed down to guide us by a bunch of those wise old dead white guys who invented this country. Now some flinch when I say that. Why? It’s true . . . they were white guys. So were most of the guys who died in Lincoln’s name opposing slavery in the 1860s. So why should I be ashamed of white guys? . . . Now, Chuck Heston can get away with saying I’m proud of those wise old dead white guys because Jesse Jackson and Louie Farrakhan know I fought in their cultural war. I was one of the first white soldiers in the civil rights movement in 1961, long before it was fashionable in Hollywood, believe me, or in Washington for that matter. . . . Mainstream America is depending on you, counting on you to draw your sword and fight for them. These people have precious little time or resources to battle misguided Cinderella attitudes, the fringe propaganda of the homosexual coalition, the feminists who preach that it’s a divine duty for women to hate men, blacks who raise a militant fist with one hand while they seek preference with the other, and all the New-Age apologists for juvenile crime, who see roving gangs as a means of youthful expression. . . . Freedom is our fortune and honor is our saving grace. (Heston, 1997)

This call to arms also illustrates the moral engagement subfunction in the mechanism of social and moral justification. Fighting gun regulation becomes a source of patriotic honor, moral courage, and self-pride. Each year the National Council of Teachers of English presents its Doublespeak Award to public figures or organizations that employ deceptive, euphemistic, or self-contradictory language. In 1999, the award went to the National Rifle Association, with special recognition of Charlton Heston for his “artful twisting of language to blur issues” and the “invocation of patriotism, reverence, love of freedom, and the opposing use of dread words to color the opposition” (National Council of Teachers of English, 1999). Former mayor of New York Michael Bloomberg, who cofounded a coalition of mayors and supports grassroots activism for gun reform, has been especially targeted by gun enthusiasts. They have branded him a “nanny statist fascist” and an “anti-gun bigot” (Barbaro & Goldstein, 2013). Their intense hatred went beyond words. One man sent letters to him and the director of his advocacy organization that were laced with the poison ricin. The letters asserted that the right to bear arms is a “God-given right” that the sender would protect to his death.
The NRA’s uncompromising opposition to any restriction on firearms gives gun-regulation advocates a lot to be incensed about. Here are some of the restrictions opposed by the NRA on the basis of the slippery-slope scenario: banning semiautomatic assault weapons, armor-piercing bullets, and easily concealable street crime guns; requiring safety trigger locks; limiting purchases to one gun a month; background checks for purchases at gun shows; requiring gun dealers to examine their inventories for lost or stolen guns; implementing a national system for tracing guns used in crimes; imposing civil liability for egregious sales practices; and banning gun carrying in public parks and recreational areas.

Excerpted from "Moral Disengagement: How People Do Harm and Live With Themselves" by Albert Bandura. Published by Worth Publishers. Copyright © 2016 by Worth Publishers. Reprinted with permission of the publisher. All rights reserved.

Published on February 15, 2016 08:59

February 14, 2016

Too real for reality TV — or even memoir: The new novels that dare use fiction to reveal secret truths

My MFA in nonfiction made me a novelist. On the one hand, spending two years practicing the art of sentence-making made me a more ambitious writer. On the other hand, I didn’t agree with the standards by which memoirs and personal essays were judged. As James Wood recently wrote in a review of Lauren Groff's "Fates and Furies," “a novel that can be truly ‘spoiled’ by the summary of its plot is a novel that was already spoiled.” The same could be said of memoir: Any good book is much more than a litany of events. But nonfiction, it seemed from my writing workshops, was judged more by its plot than its language: what mattered was not the way a story was told, but how intriguing the events of the story were and how neatly those events all tied up together in the end. “Brave” was the highest—and most common—compliment a piece of writing could engender, which usually meant that the author had revealed personal information considered shocking or humiliating, followed by a clear-cut moral. We were supposed to be writing true stories, but this formula—confession, epiphany—was not at all true to the way I experience life. By the end of my MFA, I was both weary enough of plot-centered critiques of writing and confident enough in my own use of language to believe I could at least attempt to write a novel. And what I discovered in the course of writing "Wreck and Order" is that I could communicate much more truth in fiction than I had ever been able to through personal essays. This is partly because I was no longer beholden to the often dull or inconvenient details of real life, but mostly because I could imagine truths other than my own. Writing a novel requires close observation of both the inner and outer world, which allowed me to imagine my way into other people’s stories or into other possibilities for my own life. I had to listen to deeper layers of my brain, pay closer attention to what in real life are just passing rages or fancies, imagining what would happen if I acted upon them. As Jeanette Winterson once said, “I’m not happy for words to simply convey meaning. [They] can if it’s journalism and it’s perfectly all right if you’re doing a certain kind of record or memoir, but it’s not all right in fiction.” Rather, as she argues in her essay “Imagination and Reality,” the “true effort” of art is to “open to us dimensions of the spirit and of the self that normally lie smothered under the weight of living.” The distance between reality (record or memoir) and truth (the rich, unseen underlayer of daily life) has widened in recent years, with the rise in reality TV; tell-all blogs; podcasts like "Serial" that turn real crimes into juicy murder mysteries; celebrity memoirs that sell for millions, while author income drops significantly overall. Anyone who’s watched an episode of "The Real Housewives" knows that just because something is real does not mean it is reflective of reality. Such deliciously ridiculous drama does not seek to depict life as it actually exists for most of humanity, but rather to use the titillating external details of strangers’ real lives as a distraction from the viewer’s own inner life. The James Frey debacle revealed just how highly the public values confession, even at the expense of quality. Unable to find a publisher for his novel "A Million Little Pieces," Frey decided to shop the same book around as a memoir. 
Random House purchased it, Oprah Winfrey extolled it, and it found millions of fans—until readers discovered the supposed memoir was largely fabricated and Frey instantly, dramatically fell from grace, with Oprah going so far as to invite him on her show to publicly excoriate him for “betraying millions.” It’s odd that the exact same book—composed of the same sentences, the same voice, the same structure—could be judged so differently depending only on its genre. In his three-page acknowledgment of the lies in his memoir, now included in all editions of the book, Frey writes that he made things up because he wanted his book to have “the tension that all great stories require.” But a story doesn’t need drug rings, suicide and jail time to have tension: "Mrs. Dalloway" is about a woman giving a party; "Notes from Underground" is the misanthropic rant of a retired civil servant. If Frey were capable of writing a great story, he wouldn’t have had to claim his novel was true in the first place. The only reason Frey’s book found readers is that they believed he had actually lived through this unbelievable litany of hardships and depravities. The public’s obsession with confessionals risks lowering our standards for storytelling and making us largely blind to language. A welcome antidote to pop culture’s reality craze is the recent crop of popular novels that are both beautifully written and committed to depicting the parts of real life that cannot be readily seen or explained. Ben Lerner, Edouard Levé, Lydia Davis, Elena Ferrante, Eileen Myles, Janet Frame, Teju Cole, Ben Metcalf and Eimear McBride all reveal specific and often overlooked “dimensions of the spirit and of the self” as much through language as through plot. Ben Lerner’s second novel "10:04" tells the story of a novelist named Ben struggling to balance life and art in the wake of a book deal for the novel we are currently reading. Teju Cole’s "Open City" chronicles “the constant struggle to modulate the internal environment” through his narrator’s observations of city life and conversations with strangers. The epigraph of Eileen Myles’ “nonfiction novel” "Cool for You" is the title of a painting by artist and poet Antonin Artaud: “Jamais réel, toujours vrai.” Myles is not concerned with depicting her life as a series of real events, but rather with communicating how it felt to grow up poor and female and full of an inner power and joy that found no place in the outside world. The emotional power of Elena Ferrante’s novels—particularly "The Days of Abandonment"—is so specific and convincing that the reader hardly doubts the veracity of the events that cause and result from these emotions. Yet it’s entirely possible that most or all of Ferrante’s plots are concocted. By keeping her identity a secret, Ferrante refuses to allow any discussion of “real” versus “make-believe” to impact the way her books are judged. “I very much love,” she has written in justification of her anonymity, “those mysterious volumes, both ancient and modern, that have no definite author but have had and continue to have an intense life of their own.” Writers whose primary concern is a faithful depiction of the human heart and mind have a long tradition to draw upon, from Proust, Knut Hamsun, Sebald, and Henry Miller, to more recent classics such as Alice Munro’s "Lives of Girls and Women" and Norman Rush’s "Mating." 
Although "Mating" is clearly not autobiographical—Rush is male and his narrator is female—the book feels true both to his character’s particular voice and patterns of thought and to the way the mind creates language in general. Munro has said that "Lives of Girls and Women"—which portrays a seemingly average teenage girl getting to know herself as she gains and loses friends, acts in school plays, loses her virginity—is an “autobiography in form but not in fact.” This is perhaps the best way to characterize the difference between the real and the true in writing. Truth in writing—whether fiction or memoir—comes from finding an original form that accurately reflects a particular person’s particular experience, whether or not the details of that experience are based on real life. Edouard Levé’s "Suicide" is written in the second person, addressed to a friend of the narrator’s who committed suicide 15 years earlier. It certainly increases the intrigue of the novel to know that Levé took his own life soon after completing "Suicide," but the novel’s power is self-contained. Levé’s nonlinear stream of recollections about his friend denies the reader the self-forgetful pleasure of the traditional novel: that of entering another world, elaborate yet comprehensible. Levé sees such collective sense-making as an anxious denial of the essential absurdity of our lives. So "Suicide" does not seek to explain how circumstances could make a person desire death, but only to illumine the particulars of that desire. As the narrator’s revelations about his friend’s inner life become increasingly complex, the reader comes to see “tu” as an externalized form that allows the narrator empathic clarity about the most disturbed parts of his own being. The very fact that his friend is dead allows the narrator to have the kind of unchanging closeness with him that one often has with books, which allow one to feel less alone by virtue of feeling more connected with one’s self:
If you were still alive, would we be friends? I was more attached to other boys. But time has seen me drift apart from them without my even noticing. All that would be needed to renew the bond would be a telephone call, but none of us is willing to risk the disillusionment of a reunion. . . . I no longer think of them, with whom I was formerly so close. But you, who used to be so far-off, distant, mysterious, now seem quite close to me. When I am in doubt, I solicit your advice. Your responses satisfy me better than those the others could give me. You accompany me faithfully wherever I may be. It is they who have disappeared. You are the present. You are a book that speaks to me whenever I need it.
Consider, by contrast, the following passage from Catherine Millet’s memoir "Jealousy"—the follow-up to her bestselling "The Sexual Life of Catherine M."—in which Millet agonizes over her husband’s affairs with younger women. Millet reveals, in the course of a page-long paragraph about her passing interest in plastic surgery, that her mother committed suicide when Millet was a child. When she tells her husband, “My mother’s death has broken me,” he responds, “What kind of cliché is that?”
Being caught in flagrante using a ready made formula only increased my feelings of helplessness and humiliation. But over the next few days I had decided that in fact I wouldn’t retract the word ‘broken,’ even if it was so often used incorrectly, hyperbolically, creating the reverse effect of what was originally intended, and making it sound somehow pompous. There is a reason why a commonplace is called common. When we use one, it is not just that we suddenly have a lapse of lucidity or intelligence or even of culture, which would otherwise enable us to make a more refined or appropriate choice of vocabulary, it is also that we need to feel part of something. When suffering from the shock which comes from joy or misfortune, human beings are not fitted for the solitude which extreme emotions often bring upon them, and so they try to share them, which usually means they must ‘relativise them,’ that is to say, play them down.
By the end of this treatise on “appropriate choice of vocabulary” and the human impulse to “relativise” emotions, the reader still has no clear sense of how Millet feels about her mother’s suicide or what, if anything, this suicide has to do with her pained relationship with her husband. We learn more in one sentence of Levé’s about both the precise nature of the narrator’s feelings for his dead friend and the complications of intimacy in general. Let’s look at a second example: In the wake of learning that their husbands are having affairs with younger women, the writers of "Jealousy" and "The Days of Abandonment" have a nearly identical physical reaction: they lose control over their bodily movements. Here is how Millet describes the change: “Firstly, my emotional state was such that I could only move extremely slowly. Sometimes one’s heart rate increases to a point at which it seems as though the heart is banging against the walls of the chest and that what one hears is the sound of it doing so.” And here is Ferrante on the same experience: “I wanted my movements to seem purposeful, but instead I scarcely had control over my body…. In the car I had nothing but trouble: I forgot I was driving. The street was replaced by the most vivid memories of the past or by bitter fantasies, and often I dented fenders, or braked at the last moment, but angrily, as if reality were inappropriate and had intervened to destroy a conjured world that was the only one that at that moment counted for me.” Millet may be describing real events from her real life and Ferrante may be inventing the life of an imaginary woman, but Millet’s awkward, stilted, and oddly generalized language merely allows us to gape at a version of reality we already think we understand. Ferrante, on the other hand, forces us into the unseen layers of that same reality. Just as her narrator’s belief in a coherent world of work and family is destroyed when her husband leaves her, Ferrante’s depiction of reality does not reinforce a common sense of the way the world is, but rather surprises the reader into new understandings. Janet Frame decided to publish her novel "Towards Another Summer" posthumously because she felt it was too personal to be shared in her lifetime, even though her life story was already well known thanks to her bestselling memoir "An Angel at My Table." Little happens in "Towards Another Summer"; the story’s momentum comes from the unusual, moving way the narrator’s mind struggles through ordinary human interaction. Fiction frees the writer to reveal not only the self as it appears to the outside world, but also the self that, as my narrator puts it in "Wreck and Order," “I feel myself to be late at night when I can’t sleep and I’m all alone with the minutes passing, and I’m wide awake with thoughts I want to force the minutes to understand, but they are too fast, they pass and pass, and then pass again.” Writing slows time down enough to force an awareness of secret thoughts from secret selves. A fictional narrator can be the person you fear you are, or the person part of you wants to become and part of you wants to kill, or the person you were once, for 30 seconds, the one and only time you stood up on a surfboard, or the person you are in the gap between consciousness and unconsciousness at the moment you wake up in the morning, before your brain’s belief in your fixed identity starts bossing you around.
When I began writing "Wreck and Order," I was writing the story of the person I feared I would become, and somehow, in the course of writing, it became the story of the person I still hope to be. Only fiction could allow me to bridge that distance, not with miraculous self-improvements and life changes (the drunken sex addict becomes a saintly Buddhist nun!), but by concretizing—through language, through imagined events—the invisible connections between seemingly disparate longings and personalities, times and places. I don’t mean to imply that memoir is inherently limited. There are countless examples of memoirs that are not merely factually accurate, but also manage to reveal complex truths about how it feels to live particular facts—"This Boy’s Life," "Name All the Animals," "Darkness Visible," "Survival in Auschwitz"—just as there are countless examples of novels that provide facile explanations of life solely through plot, without ever credibly depicting characters’ inner lives. Truth, in both memoirs and novels, does not come from relaying concrete events, but from giving language to the invisible, ineffable undercurrent of real life that makes every event much more than it appears to be.

Published on February 14, 2016 16:30

Beyond the “F**k-off Fund”: Even if you’re “bad with money,” you can take control of your finances for the long term

Last month, Paulette Perhach’s essay “A Story of a Fuck Off Fund” at the Billfold, which explored the ways not having any savings can keep us trapped in bad jobs and relationships, while having said fund can mean the freedom to make better decisions, took the Internet by storm. Parents recommended teaching their kids about such funds, and sites like Jezebel and The Frisky lauded the concept. One writer shared how a fuck-off fund helped them to leave abusive situations, while another recounted how challenging it can be to gather even that bare minimum. At Refinery29, Lindsey Stanberry took issue with the questionable financial choices Perhach recounts, asking, “Why does almost everyone insist on talking to women about money like we’re idiots?” It makes sense that it struck such a nerve when you consider that most Americans have less than $1,000 in savings, according to a December 2015 Google Consumer Survey. While Perhach details the practical implications of not having such a fund, there are also emotional costs involved. “I have been in a situation where I haven’t had [one]. It feels terrible,” Perhach told Salon. “You’re always scared when you’re broke. You can’t afford mistakes, you can’t afford surprises. When you have money, you feel more calm.” Perhach, a 33-year-old Seattle writer who detailed the economic struggles that led to her moving back home with her mom for Salon in 2013, not only saved up her fuck-off fund, but also paid off about $20,000 of debt between 2012 and 2015 thanks in large part to Excel—and outside accountability. Inspired by listening to financial author Dave Ramsey’s podcast, she started both an Excel spreadsheet listing all her bank accounts and their totals, and a blog she told a handful of people about. The only one who read it consistently was her best friend’s mom, but that minimum amount of outside accountability was enough to keep her motivated. Perhach has also gotten creative about how she handles her money, knowing that it’s not her strong suit. Despite her impressive savings, she says, “I’m really bad with money,” but has found a way to turn her weakness into a strength. “Taking the shame off that for myself and just saying that and not expecting myself to be magically good with money” has made a world of difference. For instance, for a while, she paid her rent in two checks per month, when she received her paycheck, because she didn’t trust herself to do it differently. Even when her boyfriend asked, “Can’t you just keep the money in your bank account?” she knew that she couldn’t promise that. According to Helaine Olen, Slate columnist and co-author of "The Index Card: Why Personal Finance Doesn’t Have to Be Complicated," the concept of a fuck-off fund isn’t new. “This is a concept that’s been around forever,” she said, better known by the term “fuck-you money.” She said it’s a vital concept for everyone (not just women), especially since surveys have shown that anywhere from 40 percent to 76 percent of Americans live paycheck to paycheck. “It becomes this chicken or egg problem; are they having a hard time because they have access to easy credit or is easy credit out there to help them because our salaries haven’t kept up?” Wherever you’re starting from, Olen emphasizes that even the most minimal efforts toward savings can have a powerful impact and are vital to being able to handle financial contingencies. 
“In 'The Index Card,' we say strive to save 10 to 20% of your income,” Olen said, though she admitted “that’s a long-term goal.” If you can’t do it all at once, don’t give up entirely. “I would rather see you save $10 a week than have you throw up your hands and say ‘I can’t save anything.’ It’s a good habit to get into.” Its effects may extend beyond just your bank account. Olen says having such a fund may lead to more confidence, which can have a ripple effect. “Often when you project confidence, you’re simply treated better,” explained Olen. “I think having this sort of money sometimes helps with that. It allows you to take risks you might not otherwise take; you might take a job that pays lower but you think has better long-term growth, or you decide to move to a different city without a job.” Perhach agrees. “Let’s say you get a dream job offer, but it’s an internship,” she posited. Maybe the New Yorker offers you an internship in Paris, but it doesn’t pay. With the help of your fuck-off fund, you can consider, in Perhach’s words, “If I go there, can I work nights and waitress? It gives you that little buffer. You can really look at what’s best for you and take a little more risk if you want to. It’s definitely a fuck-off fund, but if you have $5,000 in the bank, it can be renamed something else, like a dream job fund.” Yet a fuck-off fund isn’t an excuse to be unrealistic about just how far that money can go. When Perhach was accepted into NYU Paris’ low-residency MFA program, which she wanted to attend “badly” (she was already studying French), she totaled up all the costs and found it would have been $90,000, far outside her budget, so she declined after having what she calls a “hard conversation” with Excel. Nicole Lapin, author of "Rich Bitch: A Simple 12-Step Plan for Getting Your Financial Life Together...Finally," warns that a fuck-you fund is not a career panacea. “Even if you have that fuck-you fund, that’s not going to last forever,” Lapin told Salon. Instead of simply relying on the fuck-off fund as a way to flounce off into the mythical sunset, she urges people to learn about their next step before they actually take it, so if they do quit a bad job, they’re prepared. “Maybe there’s something that you just can’t get out of your head and you’re obsessed with,” Lapin said, such as raising alpacas. Is this simply a fantasy job or a viable career move? “You need to figure out what that actually looks like. Is it just because you want to pet fluffy animals all day long? That’s one thing, or do you realize what it takes to actually start a business around that? You have to shovel poop, you have to deal with marketing, hiring people, contracts, all the stuff that goes into a business. Sometimes people think, I’m going to follow my passion and play with alpacas. Maybe that’s a cool thing to do, but maybe that’s not a business.” In other words, it’s not just about the grand gesture of leaving a job, but figuring out what you’ll be saying yes to next. “Sometimes it’s just as important to realize what you don’t like as realize what you do like,” advised Lapin, especially if you’re thinking of striking out on your own and starting a business. “I think a lot of us have this idea that the proverbial grass is always greener and sometimes that is the case, but sometimes you need to test it out and maybe you just get it out of your system.” Lapin recommends biding your time at the job you’re not in love with while testing out your second job idea on the side. 
By doing it part-time, you’ll see if it’s a good match and better know what you’re getting into before you blow your fuck-you fund on a venture that might leave you back where you started financially. As for the fuck-off fund, both Olen and Lapin recommend first having an emergency fund, something that will cover essentials in case of a true emergency where you don’t so much decide to leave an ill-suited job or home as have the decision made for you. Maybe you’re laid off, you get sick, your car breaks down or you need to help a family member. As a first step, Perhach recommends following Dave Ramsey’s advice: gathering a $1,000 emergency fund as quickly as possible. Lapin suggests saving three to six months’ worth of funds to cover basic life necessities like rent, food, healthcare, etc., if you have a steady job, or nine months' to a year’s worth if you’re a freelancer or your income fluctuates. According to Lapin, a “fuck-you fund” should actually be the final step after your other financial necessities are in order. “First pay off any debt, especially credit card debt, because that’s going to accumulate faster than any of your savings is going to grow,” she advised. Then you can gather your emergency fund, ramp up your retirement fund, and after that, focus on a fuck-you fund. Ultimately, according to Lapin, the key to savings is changing your mind-set, since studies have shown that saving money can lead to happiness and, Lapin suggests, confidence. “[Savings] is not a bill; you’re investing in your future self. I think of it as you’re spending on your amazing future. Focus on all the awesome things that you’re getting from it. Once you change from a sense of deprivation to a place of aspiration, that becomes a big motivating factor.” Additionally, spending just for the sake of spending, even on supposedly fun things, may not yield the intended results if it’s pushing you further into debt. For Perhach, being debt-free and having a fuck-off fund have changed how she feels about spending money on non-necessities like travel. She can enjoy it now in a way she didn’t before. “There’s a lot of things that you do supposedly for fun, but I think the base of having fun is not feeling like you’re in trouble,” she said. “[Being] on vacation and feeling you’re going to get home and get that credit card bill and be in trouble takes away from that. It’s nice to not feel guilty about it, to say, I can have this.” For those who can’t seem to get started saving toward any kind of fund, Lapin says, “There’s always ways to save a little bit. I break down a budget into 70 percent to the essentials, 15 percent for extras, which is the fun stuff, and 15 percent for the end game. The end game is your savings, your investments, your retirement.” She recommends having even a small amount, such as $5, automatically deposited from your paycheck into a savings account, or otherwise automated so you don’t even notice it, then increasing that automated figure as you’re able to. An important caveat as you embark on your saving journey: “Make sure you’re invested in your employer’s retirement fund if it’s offered. Make sure you’re properly insured. If you have a fuck-off fund and you’re not properly insured for healthcare or life insurance, it could all be gone in an instant,” cautioned Olen. Perhach believes that knowing why you’re saving and what it means to you can help you keep that goal paramount the next time you’re invited out for a fun but pricey evening. 
“Part of what I wrote about in [the] fuck-off fund [essay] was spending money to impress other people and live the life that you think other people want you to live. There’s a kind of self-respect in saying ‘that’s not in my budget.’” Staying strong in the face of peer pressure (or even the lure of whatever shiny new purchase catches your eye) can motivate you to keep saving, according to Perhach. “I definitely got a lot of confidence from paying off my debt and knowing I could do that. Saving up for a fuck-off fund is a long-term goal and no one else cares about it but you. You’re the only person that can do that.”
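
For readers who want to see the arithmetic behind the advice above, here is a minimal sketch in Python of Lapin's 70/15/15 budget split and the emergency-fund rules of thumb that Olen, Lapin and Perhach describe. The function names and the example income figure are hypothetical illustrations, not taken from the article or from either author's book.

```python
# Minimal sketch of the budgeting rules of thumb quoted above.
# The figures below are illustrative assumptions, not prescriptions.

def budget_split(monthly_income):
    """Lapin's rule: 70% essentials, 15% extras, 15% 'end game' (savings, investments, retirement)."""
    essentials = 0.70 * monthly_income
    extras = 0.15 * monthly_income
    end_game = 0.15 * monthly_income
    return essentials, extras, end_game

def emergency_fund_target(monthly_essentials, months):
    """Months of basic expenses to set aside: 3-6 with a steady job, 9-12 if income fluctuates."""
    return monthly_essentials * months

if __name__ == "__main__":
    income = 3000.0  # hypothetical monthly take-home pay
    essentials, extras, end_game = budget_split(income)
    print(f"Essentials: ${essentials:,.0f}, extras: ${extras:,.0f}, end game: ${end_game:,.0f}")
    print("Starter emergency fund (Ramsey): $1,000")
    print(f"3-month cushion: ${emergency_fund_target(essentials, 3):,.0f}")
    print(f"6-month cushion: ${emergency_fund_target(essentials, 6):,.0f}")
```

On a hypothetical $3,000 of monthly take-home pay, that works out to $2,100 for essentials, $450 for extras, $450 for the end game, and a three-to-six-month cushion of roughly $6,300 to $12,600.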

Published on February 14, 2016 16:30

Scalia told me a secret about George W. Bush

Eight years ago, I had the once-in-a-lifetime opportunity to ask two questions of the country’s most famous conservative Supreme Court justice. I’ll never forget how he answered. I was a junior in college at Washington University in St. Louis, an English major, but that spring I was spending a semester abroad at Christ Church, Oxford University. I was truly enjoying my tutorials in 19th century British novels, although to be honest, I was even more excited to be taking my meals in the actual Harry Potter dining hall. My one complaint was that the American students were a little segregated from the rest of the student body, so I tried to expand my friend group by joining the Oxford Union Society. Founded in 1823, the Union is probably the world’s most famous debating club. It’s the historic training ground for prime ministers and politicians in the U.K. and throughout the former British Empire. The Union regularly invites distinguished guests to debate or give speeches. When the notice came out that Justice Scalia would be visiting in February, I put my name in the drawing, thinking it would be a long shot. I was stunned by the email notifying me of my selection. Not only did I have a ticket to his speech on the Union floor, but I was one of 12 students selected for a private dinner and drinks with him beforehand! My place setting was to the guest of honor’s left, to the relief of liberal family members when I told the story later. I was very nervous to be in the same room with such a famous jurist, but probably more anxious about the other students. At that point, I’d read very few judicial opinions. I fully expected to be embarrassed that these well-spoken Brits knew more than me about American constitutional law. Mostly, I recall Justice Scalia making light conversation throughout the meal and seeming fully at ease. Of course, he had strong opinions on every subject that came up, whether it was the newly advanced EU constitution or Chelsea vs. Manchester United.  He spoke fondly of Justice Ginsburg as his “best friend on the Court.” He said he hoped C-SPAN would one day record oral arguments because he’d obviously outperform his fellow justices. He agreed with one of the Rhodes scholars in attendance that his “Platonic golf” line was an all-time best. After food came glasses of sherry. I gathered my courage and decided to ask the two questions I’d considered in advance. “Justice Scalia, as we’re coming to the end of Bush’s presidency, I wondered if I could ask your opinion on the president’s leadership qualities.” I recall Justice Scalia leaned back a little and examined my face. He may have thought I was a plant. Nonetheless, he was completely candid. “I have the utmost respect for the Bush family,” he said. “And I’m not a politician or a political figure. But a lot of my fellow Republicans think the other Bush brother is much brighter. That the wrong Bush brother became president.” There in the Gladstone Room of the historic Oxford Union, Justice Scalia indicated that he shared this opinion. Flash forward to today: Jeb’s presidential campaign is on life support. It’s clear he does not have his brother’s political or interpersonal instincts. But I have a strong suspicion that Jeb would have had Justice Scalia’s support. I had practiced my second question several times so that I could get it out easily. “Justice Scalia, if there’s one decision you would like reversed from your tenure on the Court, what would that be?” He didn’t hesitate. “McConnell,” he said, referring to McConnell v. 
FEC. The case upheld the constitutionality of the Bipartisan Campaign Reform Act of 2002. The BCRA, more commonly called the McCain-Feingold Act, had put in place certain campaign finance rules for corporate and union spending in elections. The Court in 2003 upheld those reforms, and Justice Scalia issued a blistering dissent. As we drank sherry in Oxford, Justice Scalia employed one of his classic hyperboles. “If you’re not free to use money in the political process, then the First Amendment is dead.” Justice Scalia would get his reversal just two years later. That case is huge and infamous – Citizens United v. FEC – which effectively wiped out bipartisan campaign finance reform and led directly to the super PACs and money free-for-all of the contemporary election cycle. Justice Scalia described his victories on the Court as “damned few,” but there’s no doubt Citizens United was a fundamental change. We walked with Scalia toward the Union floor, where he was to give his pre-written speech. He told us he was rather disappointed that he wasn’t debating anybody that night. “I would have enjoyed it!” he said with a mischievous grin. But as it turned out, he got his wish anyway. When it came time for Q&A, he flipped every question back into an argument in support of his originalism. For example, when a student asked about slavery and the “three-fifths” clause of the Constitution, Justice Scalia said this was a perfect example of the system working well. That clause was nullified not by activist judges but by the passage of the 13th Amendment after the Civil War. The change had been made to the text itself. Two and a half years after our meeting, I would enroll in law school at my St. Louis alma mater. I’m not a litigator, and don’t exactly subscribe to his strict constitutional interpretation. But I do wonder if Justice Scalia played a practical role in my law school decision. “Wash. U., eh?” Justice Scalia had said, when I told him where I was in college back in the states. “You know, your old law school building used to be gray and boxy. Mudd Hall, think that was its name. Too modern. Very ugly. But I hear the new one is pretty nice.” He was brilliant, he was polemical, he was a man of rigid principle, including in aesthetics. May he rest in peace. Stephen Harrison is a writer and corporate lawyer for a tech company in Dallas.

Published on February 14, 2016 16:00

Trump vs. zombies — who wins?: Binge-watching “The Walking Dead” in an election year

My husband and I joined the Netflix revolution just a few months ago, which is how I became addicted to "The Walking Dead." My conversations with family and friends generally fix on two topics: what we’re watching or reading, and the election. My "TWD" Netflix binge does not impress, because, some of them say, they cannot take seriously a show about zombies — despite its 19 million viewers, according to Nielsen ratings, more than those who watched the Republican and Democratic debates combined.  ("The Walking Dead" returns for the second half of season 6 tonight on AMC.) “It’s not about the zombies,” I tell them. It’s about the human condition, spirituality, diplomacy, security, leadership, empathy…all important topics to consider in an election season. Yes, empathy … even for those symbolic walkers, who did not ask for their deadly condition. Viewers are frequently reminded that they were once human, too. Rick Grimes (played by Andrew Lincoln), who emerges as the leader of a group of survivors, sets the tone for empathy in the first season when he says to a walker, “I’m sorry this happened to you,” just before he puts it down. In another episode when the group must use parts of a walker’s body to disguise themselves for escape, Rick finds a driver’s license in a wallet, and gives the group the walker’s (human) name. He asks them to pause and consider that the walker once lived as they did. Hershel, the farmer (played by Scott Wilson), even shelters and feeds them in his barn. So, in season 3, when we see walkers being used by the Woodbury community at the behest of their corrupt governor (played by David Morrissey) for entertainment, it feels like a moral affront. Then there is the rhetorical element, which gives us insight into characters’ motives. Rick and his group refer to the zombies as “walkers”; the governor and his henchmen call them “biters.” One label reminds us that they were once people, while the other label casts them as nothing but predators. Leaders’ words are suggestive of how they perceive their constituents. Labels matter. Therein is a recurring theme that the group considers when they must do the unthinkable to survive: in the worst of conditions, we cannot lose our sense of humanity. Nevertheless, the humanity theme is a hard sell to skeptics who have no truck with gore. “Zombies,” they sneer in disgust. “You might as well play a video game.” The walkers, I remind them, are symbolic. We are faced with our own mortality every time we watch the news, just as our group faces it every episode. There are far too many headlines about ISIS, cancer, weather catastrophes, deadly viruses, humanitarian crises abroad, and more. And survivors know that worry never goes away. Rick’s group is constantly battling walkers as they endure the same problems that plagued people in their pre-pandemic lives, like domestic abuse, alcoholism, mental illness. We see what can happen when there are suddenly no facilities, no programs and no professionals to help manage them. The group must make agonizing choices as a result, choices that may even have caused some viewers to break up with the series. For others, the show appeals to that part of us that wonders: Could I do that? Where would I go? What skills do I have to survive? Most important, Could our leader endure the toughest of conditions, much less make a difference? 
While "TWD" is set in and around Atlanta in the first four seasons, the hordes of (symbolic) walkers drive survivors to the woods, the mountains and the farms where there may be resources to live, but only if they have the weapons and the skills. The rugged landscape is familiar to those of us who live in rural areas, like my own region of central Appalachia (just north of Atlanta), and particularly those of us only one or two generations removed from relatives who remember the Depression. We have heard stories of a time when money was worthless, and people had two choices: to live off the food they harvested or starve. We have seen industries develop and grow and now we are watching them leave. Older generations may be fatalistic when they talk about the future, but they lived through the past, and they have seen the worst. My great-grandmother, a farmer who could outshoot a man with her own pearl-handled pistol, would not have liked "TWD," but she would have understood it from the perspective of a survivalist. "TWD" includes a few pacifists who want nothing to do with guns, but in their world, most cannot survive without them. Nearly everyone carries some kind of weapon, even Rick’s son, Carl (played by Chandler Riggs). Herds of walkers must be controlled with guns. But guns also pose a problem because walkers are drawn by the noise. Darryl’s (played by Norman Reedus) crossbow and Michonne’s (played by Danai Gurira) sword are much better options in close combat with a few walkers, and without the worry of running out of ammunition. Noise only works in their favor if they are distracting a herd. “But the zombies,” my beloved skeptics say, shaking their heads. “How can you stand the decay, the consumption?Because it’s symbolic. We see the fragility of society in the wake of a pandemic due to the lack of resources and security. We are made uncomfortable by the group’s constant suffering and fear: They are without the conveniences of running water, electricity, processed foods and cellphones. There may be transportation, but only if a car has gas and the roads are clear. With Hershel’s help, Rick and his group manage to cobble together a farm near an abandoned penitentiary in season 3. Rick’s wife has a baby on the way and now they have shelter. Hope springs eternal. Still, the walkers -- and everything they represent -- can be counted on to take advantage of complacency and breach a comfortable life. Problems are never really gone. Without adequate resources, some survivors regress into primitive behavior, even cannibalism. But Rick’s group retains their humanity, something they prize above all, even as their hair and beards lengthen and their clothes become tattered. They may look wild, but they learn that the cleanest, most pristine people may be the ones with the most evil of intentions. The skeptics sigh. “We can’t get past the zombies,” they say. Then let’s talk about leadership. A pre-pandemic police officer, Rick develops an intuition about people that he hones to a deadly accuracy. He can be trusted with his decisions for the group’s well-being. By season 6, however, he has to make hard choices measured more by standards of survival than empathy. No one, no matter how friendly they seem, can be trusted, even as people must rely on each other. “That there,” Joe the Claimer (played by Jeff Kober), “is what you call a paradox.” The governor (a less than tongue-in-check reference) is a man who appears to have all the qualities of leadership. 
The organizer of the seemingly utopian Woodbury, he has a charming personality, complete with an Elvis-coated accent. But there are early signs that he is power-hungry, if not mad. He declares a micro-war to overtake Rick’s neighboring compound, complete with a tank and a band of loyalists. His final, heinous act as Rick tries to negotiate is Hershel’s beheading, a scene reminiscent of ISIS-inspired headlines. It may be no coincidence that season 6 is set near Washington, D.C., in the presumed “safe” zone of Alexandria. Its leader, congressperson Deanna Monroe (Tovah Feldshuh), is an idealist who promises a sustainable community. She prides herself on keeping the borders safe by excluding and exiling the dangerous people, and carefully vetting those who are invited inside. She ensures the group’s safety with a giant wall. Sound familiar? But Rick, a seasoned veteran of the brutality on the outside, chafes under her strong-willed, untested, idealistic leadership style. “People measure you,” he advises her, “by what they can take.” Though she balks when Rick consistently warns her about the community’s weaknesses, she learns that he is right. So, we need look no further than this election year, or the headlines, or the complacency of people who suddenly find themselves gobsmacked by unexpected approval ratings and caucus results to find parallels in "The Walking Dead." We cannot drop our presidential candidates into a post-pandemic scenario to know whether they could emerge as diplomatic leaders of a safe and sustainable United States (though it may prove more reliable than debates or a caucus). It would remain to be seen whether Trump’s or Cruz’s Great Walls would be secure. Neither Rubio’s rehearsed zingers nor Kasich’s budgeting pride would make much of a difference. Sanders would need some hefty taxes to lead, which couldn’t happen until he resurrected an economy. Clinton would have to prove her trustworthiness, which is hard to do as a threat approaches. And no amount of noise from Trump, including bullying, intimidation and name-calling, would serve to make society safer or more sustainable. It didn’t work for the governor or Woodbury. Maybe our candidates could take some notes from "TWD," where the quiet, observant and diplomatic Rick Grimes is still the only one standing. Diplomatic, that is, until his group is threatened, and he unleashes a small army led by Carol, Daryl and Michonne. “How ridiculous,” my skeptics say. “We’re talking about a show with zombies.” “Yes,” I relent, “there are zombies.” And what can we learn from them this election season? How not to become part of a herd that is drawn to noise.

Published on February 14, 2016 15:00

America’s worst affluenza case: The U.S. military is blowing through our tax dollars

The word “affluenza” is much in vogue. Lately, it’s been linked to a Texas teenager, Ethan Couch, who in 2013 killed four people in a car accident while driving drunk. During the trial, a defense witness argued that Couch should not be held responsible for his destructive acts. His parents had showered him with so much money and praise that he was completely self-centered; he was, in other words, a victim of affluenza, overwhelmed by a sense of entitlement that rendered him incapable of distinguishing right from wrong. Indeed, the judge at his trial sentenced him only to probation, not jail, despite the deaths of those four innocents. Experts quickly dismissed “affluenza” as a false diagnosis, a form of quackery, and indeed the condition is not recognized by the American Psychiatric Association. Yet the word caught on big time, perhaps because it speaks to something in the human condition, and it got me to thinking. During Ethan Couch’s destructive lifetime, has there been an American institution similarly showered with money and praise that has been responsible for the deaths of innocents and inadequately called to account? Is there one that suffers from the institutional version of affluenza (however fuzzy or imprecise that word may be) so much that it has had immense difficulty shouldering the blame for its failures and wrongdoing? The answer is hidden in plain sight: the U.S. military. Unlike Couch, however, that military has never faced trial or probation; it hasn’t felt the need to abscond to Mexico or been forcibly returned to the homeland to face the music.

Spoiling the Pentagon

First, a caveat. When I talk about spoiling the Pentagon, I’m not talking about your brother or daughter or best friend who serves honorably. Anyone who’s braving enemy fire while humping mountains in Afghanistan or choking on sand in Iraq is not spoiled. I’m talking about the U.S. military as an institution. Think of the Pentagon and the top brass; think of Dwight Eisenhower’s military-industrial complex; think of the national security state with all its tentacles of power. Focus on those and maybe you’ll come to agree with my affluenza diagnosis. Let’s begin with one aspect of that affliction: unbridled praise. In last month’s State of the Union address, President Obama repeated a phrase that’s become standard in American political discourse, as common as asking God to bless America. The U.S. military, he said, is the “finest fighting force in the history of the world.” Such hyperbole is nothing new. Five years ago, in response to similar presidential statements, I argued that many war-like peoples, including the imperial Roman legions and Genghis Khan’s Mongol horsemen, held far better claims to the “best ever” Warrior Bowl trophy. Nonetheless, the over-the-top claims never cease. Upon being introduced by President Obama as his next nominee for secretary of defense in December 2014, for instance, Ash Carter promptly praised the military he was going to oversee as “the greatest fighting force the world has ever known.” His words echoed those of the president, who had claimed the previous August that it was “the best-led, best-trained, best-equipped military in human history.” Similar hosannas (“the greatest force for human liberation the world has ever known”) had once been sprinkled liberally through George W. Bush’s speeches and comments, as well as those of other politicians since 9/11. 
In fact, from the president to all those citizens who feel obliged in a way Americans never have before to “thank” the troops endlessly for their efforts, no other institution has been so universally applauded since 9/11. No one should be shocked, then, that in polls Americans regularly claim to trust the military leadership above any other crew around, including scientists, doctors, ministers, priests, and -- no surprise -- Congress. Imagine parents endlessly praising their son as “the smartest, handsomest, most athletically gifted boy since God created Adam.” We’d conclude that they were thoroughly obnoxious, if not a bit unhinged. Yet the military remains just this sort of favored son, the country’s golden child.

And to the golden child go the spoils. Along with unbridled praise, consider the “allowance” the American people regularly offer the Pentagon. If this were an “affluenza” family unit, while mom and dad might be happily driving late-model his-and-hers Audis, the favored son would be driving a spanking new Ferrari. Add up what the federal government spends on “defense,” “homeland security,” “overseas contingency operations” (wars), nuclear weapons, and intelligence and surveillance operations, and the Ferraris that belong to the Pentagon and its national security state pals are vrooming along at more than $750 billion annually, or two-thirds of the government’s discretionary spending. That’s quite an allowance for “our boy”! To cite a point of comparison, in 2015, federal funding for the departments of education, interior, and transportation maxed out at $95 billion -- combined!

Not only is the military our favored son by a country mile: it’s our Prodigal Son, and nothing satisfies “him.” He’s still asking for more (and his Republican uncles are clearly ready to turn over to him whatever’s left of the family savings, lock, stock, and barrel). On the other hand, like any spoiled kid, the Defense Department sees even the most modest suggested cuts in its allowance as a form of betrayal. Witness the whining of Pentagon officials and military officers testifying before Congressional committees, and of empathetic committee members themselves. Minimalist cuts to the soaring Pentagon budget are, it seems, defanging the military and recklessly endangering American security vis-a-vis the exaggerated threats of the day: ISIS, China, and Russia. In fact, the real “threat” is clearly that the Pentagon’s congressional “parents” might someday cut down on its privileges and toys, as well as its free rein to do more or less as it pleases.

With respect to those privileges, enormous budgets drive an unimaginably top-heavy bureaucracy at the Pentagon. Since 9/11, the ranks of Congressionally authorized three- and four-star generals and admirals have multiplied twice as fast as those of their one- and two-star colleagues. Too many generals are chasing too few combat billets, contributing to backstabbing and butt-kissing. Indeed, despite indifferent records in combat, generals wear uniforms bursting with badges and ribbons, resembling the ostentatious displays of former Soviet premiers -- or field marshals in the fictional Ruritanian guards.

Meanwhile, the proliferation of brass in turn drives budgets higher. Even with recent modest declines (due to the official end of major combat operations in Iraq and Afghanistan), the U.S. defense budget exceeds the combined military budgets of at least the next seven highest spenders. (President Obama proudly claims that it’s the next eight.)
Four of those countries -- France, Germany, Great Britain, and Saudi Arabia -- are U.S. allies; China and Russia, the only rivals on the list, spend far less than the United States.

With respect to its toys, the military and its enablers in Congress can never get enough of them, or buy them at a high enough price. The most popular of these, at present, is the under-performing new F-35 jet fighter, projected to cost $1.5 trillion (yes, you read that right) over its lifetime, making it the most expensive weapons system in history. Another trillion dollars is projected over the next 30 years for “modernizing” the U.S. nuclear arsenal (this from a president who, as a candidate, spoke of eliminating nuclear weapons). The projected acquisition cost for a new advanced Air Force bomber is already $100 billion (before the cost overruns even begin). The list goes on, but you catch the drift.

A Spoiled Pentagon Means Never Having to Say You’re Sorry

To complete our affluenza diagnosis, let’s add one more factor to boundless praise and a bountiful allowance: a total inability to take responsibility for one’s actions. This is, of course, the most repellent part of the Ethan Couch affluenza defense: the idea that he shouldn’t be held responsible precisely because he was so favored. Think, then, of the Pentagon and the military as Couch writ large. No matter their mistakes, profligate expenditures, even crimes, neither institution is held accountable for anything.

Consider these facts: Iraq, Afghanistan, and Libya are quagmires. The Islamic State is spreading. Foreign armies, trained and equipped at enormous expense by the U.S. military, continue to evaporate. A hospital, clearly identifiable as such, is destroyed “by accident.” Wedding parties are wiped out “by mistake.” Torture (a war crime) is committed in the field. Detainees are abused.

And which senior leaders have been held accountable for any of this in any way? With the notable exception of Brigadier General Janis Karpinski of Abu Ghraib infamy, not a one. After lengthy investigations, the Pentagon will occasionally hold accountable a few individuals who pulled the triggers or dropped the bombs or abused the prisoners. Meanwhile, the generals and the top civilians in the Pentagon who made it all possible are immunized from responsibility or penalty of any sort. This is precisely why Lieutenant Colonel Paul Yingling memorably wrote in 2007 that, in the U.S. military, “a private who loses a rifle suffers far greater consequences than a general who loses a war.” In fact, no matter what that military doesn’t accomplish, no matter how lacking its ultimate performance in the field, it keeps getting more money, more resources, more praise.

When it comes to such subjects, consider the Republican presidential debate in Iowa on January 28th. Jeb Bush led the rhetorical charge by claiming that President Obama was “gutting” the military. Ted Cruz and Marco Rubio eagerly agreed, insisting that a “dramatically degraded” military had to be rebuilt. All the Republican candidates (Rand Paul excepted) piled on, calling for major increases in defense spending as well as looser “rules of engagement” in the field to empower local commanders to take the fight to the enemy. America’s “warfighters,” more than one candidate claimed, are fighting with one arm tied behind their backs, thanks to knots tightened by government lawyers. The final twist that supposedly tied the military up in a giant knot was, so they claimed, applied by that lawyer-in-chief, Barack Obama himself.
Interestingly, there has been no talk of our burgeoning national debt, which former chairman of the Joint Chiefs of Staff Admiral Mike Mullen once identified as the biggest threat facing America. When asked during the debate which specific federal programs he would cut to reduce the deficit, Chris Christie came up with only one, Planned Parenthood, which at $500 million a year is the equivalent of two F-35 jet fighters. (The military wants to buy more than 2,000 of them.)

Throwing yet more money at a spoiled military is precisely the worst thing we as “parents” can do. Instead, we would do well to recall the fiscal wisdom of Army Major General Gerald Sajer, the son of a Pennsylvania coal miner killed in the mines, a Korean War veteran, and a former Adjutant General of Pennsylvania. When his senior commanders pleaded for more money (during the leaner budget years before 9/11) to accomplish the tasks he had assigned them, General Sajer’s retort was simple: “We’re out of money; now we have to think.”

Accountability Is Everything

It’s high time to force the Pentagon to think. Yet when it comes to our relationship with the military, too many of us have acted like Ethan Couch’s mother. Out of a twisted sense of love or loyalty, she sought to shelter her son from his day of reckoning. But we know better. We know her son has to face the music.

Something similar is true of our relationship to the U.S. military. An institutional report card with so many deficits and failures, a record of deportment that has led to death and mayhem, should not be ignored. The military must be called to account. How? By cutting its allowance. (That should make the brass sit up and take notice, perhaps even think.) By holding senior leaders accountable for mistakes. And by cutting the easy praise. Our military commanders know that they are not leading the finest fighting force since the dawn of history, and it’s time our political leaders and the rest of us acknowledged that as well.

Published on February 14, 2016 13:45