Helen H. Moore's Blog, page 897

January 10, 2016

Ricky Gervais kicks off his “nicer” Golden Globes with a biting monologue: “I’ve changed — not as much as Bruce Jenner”

How rude will Ricky Gervais be tonight? That’s still to be determined. He opened with “Shut up,” and managed to insult just about everyone who might get excited about the awards: “You disgusting, pill-popping sexual deviant scum.” It didn’t take him long to refer to Sean Penn’s El Chapo interview, either: “I’m going to do this monologue, then go into hiding. Not even Sean Penn will find me. Snitch.”

At the Globes, Lily Tomlin and Jane Fonda didn’t seem to be enjoying themselves. https://twitter.com/EWJessicaShaw/sta... But online, some are finding him hilarious. Kurt Andersen describes him as having “actual edge.” https://twitter.com/KBAndersen/status...

The cool reception didn’t stop Gervais. “I’ve changed,” he said, in a reference that suggested he was going to be less mean this year. But maybe not. “Though not as much as Bruce Jenner.” Caitlyn broke down stereotypes, but “didn’t do much for women drivers.” And the description of Jeffrey Tambor’s anatomy — don’t ask — may be a new Golden Globes low.
Then he went on to Catholic priest pedophiles in a bit about “Spotlight,” saying that “Roman Polanski called it the best date night ever.” Watch a room full of his colleagues try not to laugh. This will be a long night.

Continue Reading...










Published on January 10, 2016 17:25

Doing the Golden Globes math: Maybe there’s no “last f**kable day” for TV actresses — but film is a different story

I can think of no better confirmation of “Inside Amy Schumer’s” brilliant bit “Last Fuckable Day” than the news item, earlier this week, reporting that Oscar front-runner Leonardo DiCaprio only dates women between the ages of 20 and 25. Looking over his catalogued relationship history, you can see how there’s a point up to which it made perfect sense that all of DiCaprio’s partners were in their early 20s (and blond, and supermodels, but OK)—he was of a similar age, and a handsome young man in Hollywood, and life is only so short, after all. But then DiCaprio ages out of his early 30s, the women stay the same age, and it just keeps going, and going, until Slate notices and writes a piece about it. As weird as it feels from the outside, this is Hollywood—where, over the course of a few years, 1990-born Jennifer Lawrence played opposite romantic leads Bradley Cooper (1975), Christian Bale (1974), and Édgar Ramírez (1977).

In the “Inside Amy Schumer” sketch, Schumer (34) runs into Tina Fey (45), Patricia Arquette (47), and Julia Louis-Dreyfus (54). The three are celebrating Louis-Dreyfus’ “last fuckable day,” the day when the entertainment industry no longer deems a woman fit for romantic interest opposite a man. As the three older women patiently explain to Schumer, there is no equivalent for men, even when they’re “like a hundred, and only white spiders coming out.”

But unsaid in the sketch is that while the women are talking about Hollywood as a whole, they are specifically talking about film, in opposition to television, the medium they’re currently on. Arquette, who had just won an Academy Award for “Boyhood” and devoted her onstage speech to sexism in Hollywood—after an Oscar campaign where an insider speaking to the Hollywood Reporter confessed to voting for her because she pledged not to have plastic surgery for the 12 years the movie was filming—spent the years between 2005 and 2011 paying the bills with “Medium,” an NBC drama where she played a psychic. Julia Louis-Dreyfus has been in just a handful of films, but has won six Emmys over 20 years for her television roles, and is currently the lead on one of HBO’s most acclaimed comedies. And though Tina Fey has become a film actress on other people’s projects, with lightweight comedies like “Sisters,” “Baby Mama” and “Date Night,” she is in charge on television, as a writer and showrunner—first with “Saturday Night Live” and “30 Rock,” and now with “Unbreakable Kimmy Schmidt.”

There is a difference between the “rules” for women in film and women in TV. This year’s Golden Globes—our annual chance to look at both television and film accolades side by side—put that on incredible display. We know that one of the reasons television has drawn very talented mid-career actresses is that they were able to find employment there when they couldn’t in film. What is perhaps more surprising is that nearly a decade after the mid-aughts, when film actresses like Arquette, Kyra Sedgwick, Helen Mirren and more found a place on television after aging out of Hollywood’s romantic-lead standards, the age gap between Hollywood’s men and women in film—and between women in film and women in television—is still astonishingly wide. And this brings me back to Leonardo DiCaprio. Nowhere is this age disparity more apparent, this year, than in the most prestigious category of the Golden Globes—motion picture, drama.
The median age of the nominees for best actor is 41, with DiCaprio, who is nominated for “The Revenant.” The median age for the best actress nominees? Twenty-seven, with Alicia Vikander, for “The Danish Girl.” Saoirse Ronan is just 20; Brie Larson is 26. The next oldest is Rooney Mara, at 30. Cate Blanchett has proven to be a rare gem of a performer, both with her on-screen work like “Carol” and her off-screen glamour. But her greatest feat yet might be making it to the best actress nominees this year at the apparent ripe old age of 46.

Median ages for 2016 Golden Globes nominees, by category:

FILM
Best actress, motion picture, drama: 27, Alicia Vikander
Best actor, motion picture, drama: 41, Leonardo DiCaprio
Best actress, motion picture, comedy: 34, Amy Schumer
Best actor, motion picture, comedy: 48, Mark Ruffalo
Supporting actress, motion picture: 53, Jennifer Jason Leigh
Supporting actor, motion picture: 43, Idris Elba

TV
Best actress, TV, drama: 45, Taraji P. Henson
Best actor, TV, drama: 44, Jon Hamm
Best actress, TV, comedy: 54, Julia Louis-Dreyfus
Best actor, TV, comedy: 51, Rob Lowe
Best actress, limited series/movie: 33, Kirsten Dunst
Best actor, limited series/movie: 42, Patrick Wilson
Supporting actress, TV: 44, Regina King
Supporting actor, TV: 46, Christian Slater and Ben Mendelsohn

If there’s one glaring takeaway from this rudimentary data set, it’s that while film displays a massive gender disparity in age, television is neck-and-neck. Furthermore, in film, the male actors hit about the same age range in each category (drama, comedy, supporting, and even TV’s miniseries/movie category). But the women get progressively older, the less traditionally prestigious the category is. So best actress, drama skews youngest; then best actress, comedy; then supporting actress. Meanwhile, in television’s best actor and best actress categories, the median age for women is higher than in the corresponding male category. The only category in which this isn’t true is limited series/miniseries/TV movie. But given that those productions often pull directly from currently working film talent, à la “True Detective” and “Fargo,” you could lump that category in with the film trends.

But we won’t, for another reason: Though Kirsten Dunst is just 33, she’s a veteran 33—she’s been in show business for the last 20 years. She is not middle-aged, but her career is; and it’s in that mid-career state, maybe 15 or 20 years in, when women disappear from film. In the spread for this year’s nominations, it’s more difficult to find a woman in her 50s recognized for film than it is to find a septua- or octogenarian. Jennifer Jason Leigh (53) is nominated along with Maggie Smith (81), Lily Tomlin (76), Jane Fonda (78) and Helen Mirren (70). It’s almost as if, post-ingenue phase (around 30), women don’t come back as serious contenders in film until they’re late in their career and doing a kind of victory lap, save for a few particularly impressive individuals who either are British-accented magical fairies (Blanchett and Kate Winslet, 40) or very good comedians (Melissa McCarthy, 45, and Schumer—who, incidentally, both came up in television first). There are, of course, a lot of other factors to potentially discuss—such as the fact that this is just the crop of films that were produced this year, and that several nominees do a lot of work in both mediums.
Lily Tomlin, for example, is nominated for “Grace and Frankie” and “Grandma”; Schumer is nominated for her film “Trainwreck” but not for “Inside Amy Schumer.” This is where the data is most muddled; actresses like Tomlin, Schumer and Dame Maggie Smith might be nominated for films but are meeting the bulk of their financial obligations with TV money—or vice versa. Similarly, these performers could be leveraging their success in one medium toward success in another. And of course, there’s the perennial issue that the Golden Globes are actually just the result of public relations bribery, i.e., of Hollywood campaign spending run amok. These nominations are likely illustrative of nothing more than which actresses the studios are willing to spend a ton of money promoting to get votes from the Hollywood Foreign Press—but then again, that is in its own way a very telling metric of the industry’s perception of these women.

Ultimately, the thing to look for in tonight’s broadcast is not any particular age when the nominees take to the red carpet or the winners walk the stage, but rather how the female employees of Hollywood are able to construct a career out of the opportunities they have. Perhaps we are in an equilibrium where young and talented actresses start in film, move to television, and then do both at the end of their career. Or perhaps we have a situation where female performers are still struggling to be even seen after their “last fuckable day,” let alone recognized for their work. Yes, the median age for the five actresses nominated for best actress in a dramatic motion picture is 27. But given that apparently only one is young enough to date Leonardo DiCaprio, my suspicion is that the situation is a bit closer to the latter.

Continue Reading...










Published on January 10, 2016 17:00

Why we love the Golden Globes: The wacky upsets and weirdo nominations that make it the most fun ceremony of the season

Award season is in full swing. And just before the Oscar nominations are to be announced this week, the Hollywood Foreign Press Association will hand out the Golden Globes. This year’s nominations range from the sublime—Cate Blanchett in “Carol”—to the ridiculous—Al Pacino in “Danny Collins.” To be fair, shouty Al was actually quite good as a troubled crooner, but award-worthy? Really? At least the nomination for Sylvester Stallone in “Creed” makes sense! If there is any sense to what the HFPA likes, it is comedy. Melissa McCarthy and Amy Schumer both received nods this year for “Spy” and “Trainwreck,” respectively.

However, it seems as if the Globes and the finicky HFPA—90 Southern California-based international journalists who are members of “a non-profit, philanthropic organization,” according to its website—take it upon themselves to be tastemakers. They strive to nominate and award actors who are names, even if their work that year is not their best. (Is that their philanthropic mission?) They often choose to double nominate actors in film and TV, perhaps to ensure the performer will show up. Publicists woo the HFPA with naked and/or crass For Your Consideration campaigns, such as the one for “The Tourist” in 2011, where critics were junketed to Vegas in the hopes of receiving a nomination that could boost attention to a flabby film. Awards like the Golden Globes generate some buzz, which can translate into box office receipts or Oscar attention. It may be how some dark horses gain traction to enter the Big Race.

But looking over the Globe nominations over the years, there are patterns—actors who will be nominated for anything—and head-scratchers—talent that one might expect to be nominated just to get viewers to tune in. What is strange this year is how those two circles of the Venn diagram don’t overlap. Sure, dividing the best picture and leading acting categories into “Drama” and “Musical or Comedy” allows the latter category to include both Mark Ruffalo for “Infinitely Polar Bear” (rather than the more deserving drama “Spotlight”) and Maggie Smith for “The Lady in the Van.” But these nominations seem to be grasping at straws. No disrespect to either performer, but they must be well liked by the HFPA, who have supported their work in the past. Ruffalo was double nominated for “Foxcatcher” and “The Normal Heart” last year; Smith was last recognized in 2013 for both the film “Quartet” and the TV series “Downton Abbey.” In fact, it is almost surprising that Smith and/or her costar Judi Dench were not nominated for the twaddle that was “The Second Best Exotic Marigold Hotel” this year, especially since Dench was in the running for a Golden Globe for the first “Best Exotic Marigold Hotel” in 2012. Perhaps there is a tendency by the HFPA to recognize British actresses of a certain age.

Helen Mirren got a nod this year for “Trumbo,” her 14th, and one at least deserving after the bewildering nomination for the cheesy “The Hundred-Foot Journey” last year and the underwhelming “Hitchcock” in 2012. Oddly, Mirren’s supremely silly 2010 film “RED” was nominated for a best musical or comedy Golden Globe, but her performance was not. Alas, the HFPA filled the possible Mirren slot for best actress that year with the forgettable Anne Hathaway in “Love & Other Drugs,” the memorable Emma Stone in “Easy A,” and Angelina Jolie for the aforementioned debacle “The Tourist.”

The HFPA may play favorites, but, curiously, “It Boy” Ryan Gosling was passed over this year for his sly performance in “The Big Short” while his co-stars Christian Bale and Steve Carell were both nominated. Gosling was a double nominee in 2012 for two far less notable roles, in “Crazy, Stupid, Love” and “The Ides of March”—and was also nominated, justly, in 2011 for “Blue Valentine,” and before that for “Lars and the Real Girl.” Maybe the HFPA is punishing Gosling for directing “Lost River”? Gosling is not alone in sitting this year out. His “Big Short” co-star Brad Pitt was also shorted. Surely Pitt, who has been a Golden Globe best actor nominee five times in the past, starting in 1995 with “Legends of the Fall” through “Moneyball” in 2012, would be honored for this ensemble comedy. Even his unfairly overlooked dramatic turn in “By the Sea,” directed by his wife, six-time Golden Globe nominee and three-time winner Angelina Jolie, might merit him a place at the table. After all, if she could snag a nomination for her performance in the ridiculous film “The Tourist,” how could she be ignored for her far more ambitious work in “By the Sea”? Jolie’s “Tourist” co-star Johnny Depp was also snubbed by the Globes this year. He has 10 previous nominations (and one win, for “Sweeney Todd” in 2008). Depp’s turn in the criminally underseen “Black Mass” has been generating Oscar buzz, so why is it an also-ran for the Globes? Was the HFPA trying to make folks forget about the “Tourist” scandal?

Perhaps the HFPA is more focused on distinguishing itself from the Oscars by honoring the best actors, not the most acting. So Paul Giamatti wins a Globe for “Barney’s Version” in 2011, and Colin Farrell bags a trophy in 2009 for “In Bruges.” These are two fine performances that practically came out of nowhere, proving that Golden Globe nominations might have merit. That said, Gary Oldman famously spoke out against the Globes in 2012, deeming them “meaningless.” Still, nominating a deserving actor for a worthy performance seems to be the exception, not the rule. For every decent surprise nomination—consider Golden Globe winner (for “Chicago”) Richard Gere’s slick performance in “Arbitrage” and prior Golden Globe nominee (for “School of Rock”) Jack Black’s underappreciated turn in “Bernie” in 2013—there is Ewan McGregor, a past Golden Globe nominee (for “Moulin Rouge”), being acknowledged that same year for “Salmon Fishing in the Yemen.” Did anyone other than the HFPA even see “Salmon Fishing in the Yemen”? And if they did see it, did they like it that much? Or was the junket in Yemen fabulous? Never mind.

There have been other egregious nominations over the years: In 2006, Sarah Jessica Parker was nominated for “The Family Stone,” an amusing dysfunctional family comedy for sure, but c’mon, was this nomination necessary because “Sex and the City” had ended its run and the HFPA wanted to see her again? In 2008, John C. Reilly, a terrific actor who effortlessly does well in comic and dramatic roles, was a double nominee for “Walk Hard,” as best actor and for best song. Were these two nominations contingent on one another? That same year, Jodie Foster was nominated for “The Brave One,” which was as curious as her nomination a few years later for “Carnage.” Good performances by a great actress, but those films barely made a blip on anyone’s radar. Maybe this was just the HFPA’s way of charming Foster so they could award her the Cecil B. DeMille Award in 2013, and she could give that memorable and strange coming-out speech.

The 1990s had their share of wacky upsets: In 1991, Gerard Depardieu not only gets nominated for the middling film “Green Card,” but actually wins best actor in a musical or comedy, defeating Johnny Depp in “Edward Scissorhands,” Richard Gere in “Pretty Woman,” Patrick Swayze in “Ghost” and Macaulay Culkin in “Home Alone.” OK, maybe the competition wasn’t that stiff in that category. This was the same year “The Godfather, Part III” received seven Golden Globe nominations, though it won none, a rare feat only achieved by “Who’s Afraid of Virginia Woolf?” back in 1967. In 1996, Sharon Stone’s performance in “Casino” is awarded a Golden Globe over Susan Sarandon in “Dead Man Walking,” Meryl Streep in “The Bridges of Madison County,” Emma Thompson in “Sense and Sensibility” and Elisabeth Shue in “Leaving Las Vegas.” In 1997, Madonna in “Evita” bests Frances McDormand in “Fargo” and Glenn Close in “101 Dalmatians.” Madonna surely agreed to attend. Also in 1997, Eddie Murphy’s turn in “The Nutty Professor” gets nominated. Go figure. In 1998, Judi Dench’s first (of 11) Golden Globe nominations, for “Mrs. Brown,” wins the prize, leaving Kate Winslet in “Titanic” at sea. Dench may have given a better performance, but most folks were surprised the more popular film lost.

Other weird nominations over the years: In 1995, Arnold Schwarzenegger, a Golden Globe-winning actor (in 1977, for “Stay Hungry”), was nominated again for his tremendous performance in “Junior.” His co-star in the film, Emma Thompson, got nominated, too. In 1999, the HFPA nominated perennial favorite Robin Williams and his film “Patch Adams,” proving that no performance is too shameless, and no film too lousy, to be nominated. Unless it’s “Jakob the Liar” or “What Dreams May Come.” In 2001, two-time Golden Globe winner (and six-time nominee, including a nod for “Liar Liar”) Jim Carrey’s performance in “How the Grinch Stole Christmas” competed against Robert De Niro’s in “Meet the Parents” and Mel Gibson’s in “What Women Want.” One might think the HFPA was scraping the bottom of the barrel in 2001, but the following year, Hugh Jackman was nominated for “Kate & Leopold,” Cameron Diaz got a nod for “Vanilla Sky,” Kevin Spacey was recognized for “The Shipping News,” and Cate Blanchett and Billy Bob Thornton were both honored for “Bandits.” These nominations all confirm that star power in mediocre films is the HFPA way. As for the wooden Hayden Christensen’s nomination for “Life as a House” that same year, there is simply no explanation. In 2009, Dustin Hoffman and his co-star Emma Thompson were nominated for “Last Chance Harvey,” a likable film, but, like Al Pacino’s bid for “Danny Collins,” mostly notable for being a less hammy late-career part for Hoffman, and hardly a career high. Then again, in 2009, James Franco was nominated for “Pineapple Express,” a WTF nomination if ever there was one.

In 2010, Julia Roberts’ nomination for “Duplicity” must have been an HFPA effort to have the actress in the audience, and also have someone compete against Meryl Streep, who was double nominated in the same category for her work in both “Julie & Julia” and “It’s Complicated.” La Streep tends to be nominated for almost every forgettable film she makes, as “She-Devil,” “Death Becomes Her,” “The Manchurian Candidate” and “Hope Springs” all prove. Also in 2010, Joseph Gordon-Levitt, who was snubbed for “The Walk” this year, was nominated for the mild “(500) Days of Summer.” Two years later, he would get another surprise nomination for his turn in “50/50.” In 2011, Halle Berry received a nomination for “Frankie and Alice,” a film that was made in 2010 but that no one knew about because it sat on the shelf until 2014. Maybe the HFPA was ahead of the curve on this one; but maybe it had ulterior motives in having Halle Berry at its ceremony? The 2012 nomination for Brendan Gleeson in “The Guard” was the actor’s third, after “In Bruges” in 2009 and the 2010 TV movie “Into the Storm.” It may be a token of respect for the underrated character actor from abroad, but “The Guard” was not nearly as strong as his ignored performance in “Calvary” in 2014 (or even “The General” in 1998). So even when the HFPA does a kindness, it is not without question.

2014 holds two of the most peculiar nominations in Golden Globe history. Kate Winslet was honored for her dramatic turn in the execrable “Labor Day,” while Greta Gerwig was recognized for her performance in “Frances Ha.” Winslet certainly has Hollywood cachet, which might make her attendance appealing, but was the HFPA just trying to be cool by nominating hipster darling Gerwig? Also in 2014, Julia Louis-Dreyfus was nominated for her work in “Enough Said,” a respectable film perhaps most notable for being James Gandolfini’s last leading role—where was his nomination?! One might surmise Louis-Dreyfus was included because the Golden Globe-winning actress (for “Seinfeld”) was also up for her TV work in “Veep” that same year.

Which brings the conversation full circle, and back to the trend to double nominate actors. This may account for why Miss Congeniality herself, Sandra Bullock, got a leading actress nomination for her part in the amusing comedy “The Proposal” the same year as her dramatic acting nomination for “The Blind Side.” Or why Annette Bening received a comedy nomination for “Running With Scissors,” a film that was at the top of Roger Ebert’s “Worst of the Year” list, in 2007, when she was also acknowledged for the TV movie “Mrs. Harris.” Leonardo DiCaprio was also a double nominee in 2007, for his strong work in “The Departed” and for the weak film “Blood Diamond.” Last year, Julianne Moore, Bill Murray and the aforementioned Mark Ruffalo were all double nominees. This year, four actors—Idris Elba, Mark Rylance, Lily Tomlin and Alicia Vikander—received two nominations each, which makes their chances of a win greater and their attendance at the ceremony more likely. Still, there is the even rarer feat of a triple Globe nomination, which not even Meryl Streep herself has achieved.
But Jamie Foxx did it in 2005, getting Globe nominations for “Ray,” “Collateral” and the TV movie “Redemption.” Which also brings us to the famous, unbelievable “three-way tie” in 1989, when Jodie Foster shared her best actress Globe for “The Accused” with Sigourney Weaver in “Gorillas in the Mist” and Shirley MacLaine for “Madame Sousatzka.” What is hard to believe is that the losers that year were Christine Lahti in “Running on Empty” and Meryl Streep in “A Cry in the Dark.” Ultimately, the value of the Golden Globes is debatable. But as the saying goes, “It’s an honor to be nominated.” Even if you’re Al Pacino in “Danny Collins.”

Continue Reading...










Published on January 10, 2016 17:00

The Internet gets its wish — its boyfriend Oscar Isaac wins Golden Globe for best actor in TV miniseries

The Internet is quite pleased with Oscar Isaac winning best actor in a TV miniseries or movie at the 2016 Golden Globes for his performance in "Show Me a Hero" -- actually it's just quite pleased with him generally:

https://twitter.com/JohnWascavage/sta...
https://twitter.com/julietsburke/stat...
https://twitter.com/camivequ/status/6...
https://twitter.com/danie_bagel/statu...
https://twitter.com/Chris_Topher_11/s...

And because everything at the moment is, of course, about Star Wars:

https://twitter.com/icanpictureit/sta...
https://twitter.com/jeremynewberger/s...

Watch his acceptance speech below via Twitter:

https://twitter.com/SuperheroFeed/sta...

Continue Reading...










Published on January 10, 2016 16:56

January 9, 2016

I survived mass Internet rage

When I graduated from Columbia’s journalism school in May, I had no idea that my first brush with Internet (in)fam(y) would come just 24 hours later.

As I left the commencement festivities, still wearing my graduation gown, my father and I ran into Humans of New York photographer Brandon Stanton. As soon as he introduced himself, I immediately turned to my father, and said: “Dad, please don’t say anything stupid.”

My concern was not unwarranted: My father has, to put it mildly, the sometimes hilarious, sometimes humiliating combination of a quick wit, an off-color sense of humor and a lack of filter. (Dad’s response, much later: “But stupid things are all I know how to say!”) He was also completely unfamiliar with Humans of New York and its immense audience. The conversation that followed with Brandon was brief and, frankly, embarrassing, with my father and me both caught off-guard and tongue-tied, offering up the most vapid of clichés. After a few questions, Stanton clearly realized the futility of his efforts, and said he’d like to use my opening words to my father, which he found “really cute” and which my family would agree pretty much sum up my father’s and my playful relationship. He then took our photograph, and we parted ways. (Once he was out of earshot, my dad remarked, “He must have thought we were a couple of blithering idiots.” C’est la vie.)

The following day, when the photograph and accompanying “Dad, please don’t say anything stupid” ran, the comments were overwhelmingly negative. Commenters accused me of being disrespectful and not appreciating my father; they assumed I was a petulant teenager; they predicted how much I’d regret my words following his eventual death. One commenter even falsely claimed to know me and my student debt situation. My parents and I got a good laugh out of these musings—my personal favorites took shots at my graduation gown, which some commenters apparently assumed was some sort of peculiar, cornflower-blue, muumuu-esque fashion statement—but the response stuck with me; it affirmed my technophobic belief that the Internet was a big, bad, scary place, even if it was one that I, as a journalist, was poised to inhabit for the duration of my career. Shortly after the post went up, I emailed Stanton to express gratitude for the experience, apologies for our mostly unquotable quotes and surprise at the strength and nature of the Internet’s response. Stanton, in his reply, agreed and summed it up nicely: “HONYs comment section is one of the most positive on the internet.  But if there is a razors edge chance of anything being interpreted negatively, there will always be people in a crowd of 15 million that will find a way to do it.  And unfortunately those people are often the most vocal with their opinions."

I kept this in mind when I offered myself up to the ire of the anonymous masses for my essay “I hate your kids. And I’m not sorry.” Though a negative response here was not at all unanticipated—in fact, my editor warned me ahead of time that I might want to refrain from reading the comments, though curiosity ultimately got the better of me—the pointed viciousness of the commenters was. Here I was accused of being a psychopath (I am not), of thinking no one should have children (I do not think this), of actively being cruel to them (I am not) or wishing them harm (which I state in the piece I do not). A Mormon forum dedicated to “protect[ing] our children from…Alanna Weissman” sprang up. An article on the Belfast Telegraph’s site claimed I “let down the sisterhood.” People latched onto and read into the word “hate,” which I use in the colloquial sense, just as I profess that I basely, emphatically, viscerally hate onions (the hatred of which, thankfully, bears no social stigma). Even as the piece got thousands of likes and shares—which I can only hope means it resonated with some people—I am, as I suspect is human nature, preoccupied with the unfavorable responses, even if out of fascination. I could not comprehend how my words could prompt people to wish upon me a lack of livelihood, to veer into threat territory, to project psychological conditions onto me.

These experiences, on Humans of New York and Salon, underscore the stark differences between our real-world and Internet selves, not just on the part of the authors, but on the commenters as well. Those who comment on online articles, I assume, would never be so unkind in real life, without the veil of anonymity and the safety of a computer screen. Yet, when responding to one’s public persona—whether that be an author’s writings, an artist’s work, a celebrity’s personal life (all of which are, unlike most online comments, tied to real names and lives)—we immediately jump to conjecture and invective. The conclusions we draw are, at best, frequently ill-informed and incorrect—and, at worst, they can be devastating.

Justine Sacco, who lost her job after a poorly thought-out tweet, has become the poster girl for Internet ire, and she’s not the only one; in fact, the mass rage of Internet commenters and the life-altering consequences that follow have become commonplace enough that there’s now an entire book (which, full disclosure, I have not yet read) devoted to the subject. Which should lead us to wonder: Do we really want to potentially ruin strangers’ lives with our assumptions? Is our anger over someone’s joke or opinion so great that we want it to cost them everything they’ve worked for, and then some?

This is not to say that I regret any of the instances in which I have released my (evidently unpopular) opinion into cyberspace. Far from it—the experiences have forced my intensely private self far outside my comfort zone, and have helped me to develop a journalist’s requisite thick skin. Nor do I take offense at the commenters’ opinions. But I and those who know me in real life can’t help but notice the stark contrast between my actual self and the self conjured by commenters. Whether author or reader, we could all stand to self-reflect a bit more before hitting “publish”—but especially when projecting our presumptions and biases onto those we don’t actually know, but only think we do.

Even under the pall of strangers’ backlash, my most pleasant memory from commencement day is not graduating, nor is it the heartwarming support from my family, friends and professors, nor is it meeting Mr. Stanton; it is of the perfect stranger who handed me a purple-wrapped bouquet of flowers through the turnstile of the 116th Street subway station as I departed for the evening. Though the bouquet was sparse and the petals browning and wilted, I was touched by the generosity of the gesture. It was an act of small, spontaneous, judgment-free kindness. And, whether on the Internet or in real life, we could all use some more of that.

Published on January 09, 2016 16:30

I want to eat animals again: How the collapse of my marriage led me back to meat

I was vegetarian for 20 years, from 1993 to 2013. Prior to my eliminating animal sources of protein (except for dairy) I didn’t just eat meat—I ate animals whose names I knew. I grew up in one of Washington State’s most abundant agricultural regions, Skagit Valley, and most families I knew raised animals for food. My family maintained about a dozen sheep, a few pigs, occasionally some chickens. Every spring I looked forward to the births of the lambs. They were adorable, bouncing around on their spindly legs, chasing each other around the field. After a few months, after they’d plumped up, we’d hire a company called Del Fox to come out and shoot them in the head, hoist them up by their hind legs, slit their bellies, leave their steaming entrails on the ground, and then haul their corpses away to a meat packing plant. A few days later Del Fox would deliver a box of white paper bundles containing lamb chops, which my mother would fry and serve for dinner. We speculated aloud during these meals about which lamb we were eating. Fluffy? Julie? Then I moved to Olympia to attend Evergreen, a hippie college in the Pacific Northwest, where the pressure to go veg was ever-present. For the first couple years at Evergreen I continued to eat meat, of the most revolting processed variety, Taco Bell. Then I moved in with some vegetarian roommates and accidentally succumbed to their diet. Buckling to peer pressure, I decided to try not eating meat for a month. After that month I’d lost weight and felt a new energy pulsing in my body. I decided to keep my vegetarian diet going for awhile and started having nightmares about eating roast beef sandwiches. I backslid once, eating a McDonald’s chicken sandwich, and the thing just tasted glued together, processed, almost plastic. For the next two decades I would eat meat only by accident, with the exception of the anchovies in Caesar salad dressing, because I simply couldn’t give that up. My girlfriend, who became my wife, was vegetarian, and so it was easy to stick to grains and legumes. I scanned menus and didn’t see the meat. But I wasn’t exactly filling up on salads. Those 20 years were filled with cheese sandwiches, fries, pasta. The benefits of cutting meat out of my diet were probably erased by the amount of carbs I was putting down. I settled into the Louis CK/Jack Black physique of my adulthood and stayed there, but with the self-righteousness of not eating animals. After my marriage ended in 2012 I started dating a prime rib- and fried chicken-loving woman from Texas. In the process of reexamining and reconstructing my life, I realized that I actually did have a choice about whether or not to eat meat. The kicker was my dad’s grilled salmon. Salmon is a spiritual food in the Pacific Northwest. The first contact that Chief Sealth had with Europeans occurred when his tribe was celebrating the return of the salmon to their spawning grounds. Sealth assured the white men that they had no cause for worry about the exuberance of his peoples’ celebrations. “Guys, guys,” he said, “Relax, we’re just throwing down this epic party because--holy shit, would you look at how many fucking salmon there are?!” I paraphrase. When I was a kid my dad volunteered every year at the Kiwanis Club salmon barbecue, held at the county fairgrounds. He’d spend the day flipping racks of Coho and Chinook, and come home wearing the aroma of smoke and having singed off all of his arm hair. My father’s barbecued salmon is not to be missed in our family’s circle of friends. 
During family gatherings in my 20 years of tofu and Garden Burgers, I would salivate at the aroma of salmon wafting from my dad’s barbecue, but didn’t deviate from my self-imposed dietary restrictions. Finally, one summer afternoon in 2013, I held out my plate and asked my dad to serve me up a piece of pink, migratory fish marinated in soy sauce, lemon and maple syrup. I was curious if I’d experience a Proustian rush of memories, a slideshow of images like out of a Terrence Malick film. What happened when I took my first bite was a memory, for sure, but not at all the kind I expected. My mind didn’t open up into a reservoir of sensations. Rather, my whole body remembered what it felt like when I was 14. I didn’t see anything in my mind’s eye, but felt my younger self almost wearing my heavier, wearier 40-year-old body. My eyes welled up. The meat created a bridge over that 20-year period, back to my teenage self. Much of my post-divorce process has entailed reestablishing relationships to people and activities that had fallen into neglect. Liberated from vegetarianism, I embarked on a rediscovery of carnivorousness. One night after a movie I remembered the existence of pepperoni pizza, so I ordered one from Big Mario’s on Capitol Hill and ate it, in a state of rapture, in the street. One day I sat bolt upright in bed, waking from a nap with the realization that I could get fried chicken at Seattle’s legendary Ezell’s. Clam chowder, barbecued pork sandwiches, phad Thai with chicken, and oh my god Ivar’s fish and chips; my new diet is a heaven of animals. If only I knew their names.

Published on January 09, 2016 15:30

Ammon Bundy is not a terrorist: The authorities are waiting out the militia — just as they should do with Black Lives Matter protesters

On Saturday, Jan. 2, citizens of Burns, Oregon, held a rally protesting the sentencing of Oregon ranchers Dwight and Steven Hammond. The local demonstration was co-opted by a militia, led by Nevada native Ammon Bundy, now calling itself Citizens for Constitutional Freedom. Following its participation in the planned protest, the militia seized and continues to illegally occupy the nearby Malheur National Wildlife Refuge—vowing to remain there unless and until the Hammonds are granted clemency. Many have been eager to brand Bundy and his militia as terrorists, referring to them as “Yee-haw-dists” or “Vanilla ISIS.” And to be sure, there are similarities with Islamic militant groups. For instance, while Bundy’s “resistance movement” is essentially driven by socio-political issues, chiefly land rights and perceived overreach by the federal government, their campaign is also religiously framed and motivated. This same dynamic holds true for ISIS, al-Qaida and related groups. Moreover, Bundy and his associates hold views that most would consider extreme. In fact, they share ISIS’ admiration for slavery—with Cliven Bundy (Ammon’s father, and the head of the Bundy clan) having suggested that blacks may be better off today if they were still in chains; others from the militia are known members of designated hate groups and extremist organizations. Finally, as with al-Qaida, militants who drew inspiration from the Bundys have carried out atrocities that the family itself had to disavow. However, any similarities between the Citizens for Constitutional Freedom and Islamic terrorists are vastly outweighed by the differences between them.   For instance, the militia is not threatening violence, nor even to deface or destroy federal property, if their demands are not met; the only consequence is continued occupation. In the interim, they are being careful not to damage the refuge or its facilities, they have allowed the public to come and go largely unrestricted, and have even claimed that they will vacate the premises if it seems clear that the local population wants them to go. And so, while it is illegal for the militia to be occupying the Malheur Refuge, their actions would be better understood as an act of civil disobedience than an act of terror. Granted, Bundy and his supporters are heavily armed. However, there is no evidence that their weapons were illegally obtained, are unlicensed, or are otherwise unlawful—and in the United States there is a constitutional right to bear arms. Clearly, the purpose of the guns is to deter the authorities from raiding them, and consistent with previous standoffs, Bundy has threatened violence if there is any attempt to forcibly dismantle or dislodge their demonstration. However, there is no evidence that they are seeking out this kind of escalation. And so the authorities have decided to wait them out. Although they are contemplating cutting off power to the site to render the occupation less comfortable, they are otherwise content with monitoring the situation and keeping the lines of communication as open as possible in order to bring the demonstration to a peaceful end. This is exactly what they should be doing. Black Guns Matter? Of course, critics are quick to point out that the authorities would not have the same kind of respect for the protesters’ rights, nor exercised the same level of restraint, were federal buildings being occupied by armed minorities—for instance, blacks or Muslims. Recent history suggests they are absolutely correct. 
As a matter of fact, the open-carry movement was not started by white conservatives, but by the Black Panthers. Much like the Citizens for Constitutional Freedom, the Panthers legally purchased and licensed their guns, and carried them to public demonstrations in order to deter the authorities from infringing on their freedom of speech, their right to assemble, or denying them due process if accused of wrongdoing. These were very real concerns: At the time, civil rights activists were routinely harassed, intimidated, brutalized or killed by police officers or white mobs. Like Bundy, the Panthers emphasized that their weapons were strictly for defensive purposes—they were not seeking out confrontation, but if others initiated violence, they would respond in kind. Did conservative lawmakers celebrate black people affirming these constitutional rights? Far from it: Strict gun control laws were drafted, explicitly to disarm the Panthers, receiving wide bipartisan support—including from Ronald Reagan and the NRA. Despite the widespread erosion of gun restrictions in the intervening decades, it remains extremely difficult for black people to open-carry: African-Americans have been killed just for walking around with toy weapons or pellet guns. To brandish loaded military-grade ordnance, as white activists frequently do, would not deter authorities—it would spook them into responding with immediate and overwhelming force. Even peaceful protests by unarmed African-Americans have been overwhelmingly regarded as dangerous in right-leaning media—despite the fact that black demonstrators have the same basic demand as the Citizens for Constitutional Freedom, namely that the government respect and protect its citizens and their rights in accordance with the U.S. Constitution. Given this response when black people protest, it almost goes without saying that were a group of Muslim Americans to stockpile military weaponry, form a militia and then seize a government facility—perhaps demanding justice for the torture of U.S. citizens, the indefinite detention of U.S. citizens without trial or the execution of U.S. citizens without due process (all unconstitutional practices carried out predominantly against Muslims)—such a militia would immediately be branded as, and treated like, a terrorist organization. Indeed, authorities have attempted to entrap Muslim-Americans for terrorism just because they publicly expressed criticism of U.S. policies in the Middle East. Faux Terror, Real Danger Clearly, there is a double standard. But here’s the takeaway: Muslims and other minority groups should be empowered to engage in the political sphere just as robustly, dynamically or even confrontationally as their white counterparts—and with the expectation that authorities will respect their rights, exercise the same level of restraint, and extend the same benefit of the doubt that the Citizens for Constitutional Freedom have received. In other words, the goal should be to have minorities treated more like white people--not to have Bundy and his militia treated more like Muslims or black people, as many have urged. No U.S. citizens should be treated as terrorists for engaging in civil disobedience. It is a profound threat to our democracy that so many are willing, even eager, to call for an authoritarian clampdown on their ideological opponents under the pretext of fighting terrorism.
It is troubling when Occupy Wall Street or Black Lives Matter is branded this way--and it would be just as disturbing to inflate the threat posed by Ammon Bundy in order to justify the use of force against his militia. Certainly, there are right-wing groups that are trying to secede from or overthrow the government. While the Bundys may be popular with these groups, they should not be counted among them: Despite incessant railing against federal overreach, the Bundys are heavily dependent on government programs to subsidize their enterprises. Similarly, there are right-leaning militias who have idiosyncratic and parochial views of what the United States is, or should be. Many of these are perfectly willing to engage in violence, coercion and intimidation in order to realize or protect their ideal. These groups need to be monitored, protected against or even confronted much more aggressively than they have been. However, as of now, the Citizens for Constitutional Freedom do not appear to be this kind of threat. They could become far more radical, both in terms of their ideology and their methods, should the government respond to them in a needlessly authoritarian fashion (moreover, such a clampdown would likely bolster, rather than undermine, public sympathy for their cause). If we haven't learned this from the last 15 years of exacerbating Islamic terrorism in the name of fighting it, we should take heed of the worst attack on U.S. soil prior to 9/11: the Oklahoma City bombing. Its perpetrator, Timothy McVeigh, was attempting to retaliate against the federal government for the overly aggressive ATF and FBI-led raids at Ruby Ridge and Waco. The lesson: Overreacting to faux threats tends to produce real ones. Ammon Bundy is not a terrorist. He should not be called a terrorist, nor should he be treated like one.

Published on January 09, 2016 14:30

R.I.P., the list pop song: Missing the lost art of rock ‘n’ roll lyrical overload

Don’t get me wrong, I really enjoy and find grace in the simplest of rock lyrics. While I don’t condone the activity, the Ramones’ “beat on the brat with a baseball bat,” is among my favorites. So is German '80s era group Trio’s “Da Da Da” (“I don’t love you you don’t love me…da, da, da…”). And I enjoy a good instrumental. I even like songs with lyrics I can’t make out, like “Louie Louie” or that Sly and the Family Stone song "There’s a Riot Going On," where he sounds like he’s falling asleep on the mic. At heart, though, I’m a word geek, and the more words an artist can stuff into three to five minutes the better. Chuck Berry, Bob Dylan, Patti Smith, Leonard Cohen, Elvis Costello (listen to “Beyond Belief” lately?), John Cooper Clarke: these are my people, my rock and roll poets, but where have they gone in the 2010s? Every so often, over the past five decades, we could count on at least one brain-twisting single bursting forth from all the pop radio and inspiring us to want to re-learn to read big books. Some intrepid songwriters said, Fuck four chords and a rhyme, I’m going to Dickens the shit out of these four chords! I was listening to one of my Spotify playlists the other day and the song “88 Lines About 44 Women,” a 1984 New Wave hit by The Nails, happened to come on (and when I say "happened" I mean pretty randomly, as the playlist is 28 hours long and entitled "Quality New Wave" … look it up). I’d always loved this song (44 couplets about 44 women… a sort of faux-Casanova’s version of Esquire’s eternally gross “Women We Love” feature, populated with gals like “Debbie Ray,” who came from a “perfect Norman Rockwell home,” to “Tonya,” who was “Turkish and liked to fuck while wearing leather biker boots,” or Brenda, who had a thing for “certain vegetables and fruits.” The song, expertly rhyming, is anchored to a very simple, almost novelty keyboard riff, with a baritone hummed hook propelling it; once you hear it, you never forget it. https://www.youtube.com/watch?v=LwUot... “'88 Lines...' was informed by that cheesy sounding Casio,” Marc Campbell, the song’s writer and vocalist, told me via email. “We recorded about five minutes of our playing and I went home and wrote the lyrics. There was enough music to accommodate 44 couplets.” The Nails, one of the decade’s great cult bands, were list-song happy. They’d also cover Los Hombres' “Let It All Hang Out,” and later issued “The Things You Left Behind,” which is an itemization of what an ex-boyfriend finds in his gone girl’s flat. But the main inspiration for “88” came from the poet Jim Carroll’s 1980 hit “People Who Died.” Another tabulation (appropriately over a Berry-style riff) of the late Basketball Diarist’s unfortunate pals who all met their Maker too young. “Teddy,” who fell off a roof on East 29th Street, “Bobby,” who got leukemia… https://www.youtube.com/watch?v=sjYWW... “When I wrote lyrics for ‘88 Lines...’ I consciously used the repetitive style of ‘People Who Died,'” Campbell said. “I love that song. The specificity of it. The detail. And the accumulated power of mantric repetition. I also drew inspiration from poet Joe Brainard's book 'I Remember,' in which Brainard simply lists things he remembered.” By 1980, the year “People Who Died” appeared on Carroll and band’s debut "Catholic Boy," the art and craft of packing song space with hyper-verbosity was commonplace to hip hop and various modes of reggae, especially songs featuring toasting, but had fallen off a bit in rock and roll since the ’60s heyday of Dylan and Reed. 
Even the Monkees had a hand at it: https://www.youtube.com/watch?v=xnzrG... Disco and punk songs were simpler, though often no less sublime. And with the exception of Cooper Clarke, who dressed like Dylan anyway, attention spans seemed to fade. For every “Walk This Way” and “Life Is a Rock," there was a “Fly Robin Fly” (as in “up up to the sky”). https://www.youtube.com/watch?v=16kh-... There’s an element of jive and humor to the list song that may be too subtle for the disco dancer and the run-of-the-mill pogo-er with the safety pin earring. It throws back to hep jazzers and swingers like Cab Calloway and later Mose Allison, a sort of fleet-tongued hustle to beat the band. “Humor is definitely part of what makes '88 Lines' enjoyable,” Campbell said. “There are some funny lines in the song. Some witty wordplay. I wrote it fast and generally that's the best way to capture the energy of the moment. And like many songs that could be seen as being full of sexual bravado, I wanted to bring a Rudy Ray Moore vibe to the ceremonies,” he says, invoking the disco-era film icon. As the ’80s progressed, the indie rock era became a bit too self-serious. Hip hop seemed to grow even more lyrically expansive with Public Enemy, De La Soul, Big Daddy Kane and others offering rapidity and dexterity over their beats, and the dancehall genre grew incredibly popular. But in the rock realm, early R.E.M. lyrics might as well have been “Louie Louie,” inscrutable and mumbled. That is, until 1987, when they surprised everyone with the fluke hit “It’s the End of the World as We Know It,” one of the greatest list songs ever recorded. https://www.youtube.com/watch?v=Z0GFR... Probably the most popular — if not critically, certainly commercially — list song is from Billy the Kid. In 1989, Mr. Joel, recipient of every prestigious award this country can bestow outside of combat honors, ventured into the land of the goofers and hepsters, although it seems as if he didn’t really know it. Billy was being sincere when he gave us “We Didn’t Start the Fire,” and we rewarded him for his possible cluelessness with a number one hit (his last on the pop charts, as it turns out). https://www.youtube.com/watch?v=eFTLK... Campbell remembers the first time he heard the hit: "Blasting from a record store on 2nd Ave. in Manhattan, I thought it was from a new Joe Strummer solo album. I didn't hear it clearly, obviously, but imagine Strummer singing ‘Begin, Reagan, Palestine, Terror on the airline / Ayatollah's in Iran, Russians in Afghanistan.’" You kind of can, can’t you? Rap’s lyrical tradition not only stands — through artists like multiple Grammy nominee Kendrick Lamar, Run the Jewels and yes, Eminem — it’s flourished where compound rhymes and intrepidly long storytelling are concerned. But in pop and rock, we're in a bit of a drought.
Australian singer-songwriter Courtney Barnett was another of this year's stop-and-listen highlights, but let’s face it — we are not going back to the time when a band like The Nails could get on the radio with a rock and roll song that has more than a certain number of verses. Attention spans are shriveled. “Don’t bore us, get to the chorus” reigns, and if a hunger is there, it’s being fed by technology like a French goose. There’s just too much information to savor, so the savvy, chart-minded pop star goes for the blips and pings instead. “People are listening," said Campbell. "The problem is too many pop artists aren't giving music fans much to listen to. It's still possible to grab people's attention. Miley Cyrus is proof of that. But when it comes to the deep shit, people prefer the shallow end of the lyrical pool. No one wants to think about shit.”

Published on January 09, 2016 13:30

The big home ownership lie: Greed, fear and how the big banks exploited a human need

The Home Ownership Meme Can Be Very Powerful Most of us are familiar with the concept of the “internet meme.” Someone does something, say “planking” on a fast-food counter, and uploads a video of him- or herself doing this to a public or social media site. Other people see the video, emulate the activity, and post themselves doing this to even more sites. Before you know it, the activity has spread like an epidemic throughout the world. Another internet meme has been born. Evolutionary biologist Richard Dawkins coined the term meme decades ago, but it really did not become a widespread meme itself until the internet age.15 Dawkins’s concept of meme was that it is an analog to gene—a replicator that is subject to natural selection and other evolutionary forces just as genes are, but in the cultural rather than the biological environment. Although memes spread through the cultural environment, competing with other memes and undergoing changes or mutations that make them more or less appealing, the ultimate selective environment for them is the human brain. This means that cultural memes that tap into the fundamental cognitive drives and preferences of the mind may be quite powerful and pervasive and important. For example, there has been much investigation of the evolutionary “memeplex” of religious belief systems, which are obviously of critical importance to our species as a whole. I suggest that another very important meme, at least in the United States, the United Kingdom, and some other countries, is “home ownership is good.” This cannot be a particularly ancient meme, because home ownership is a concept that emerges from relatively modern capitalist economies. On the other hand, the notion that people somehow control and possess the structures (or living spaces) they inhabit is probably much more ancient. Indeed, the security and stability that are part of feeling at home likely derive from a sense of possession of a structure, if not ownership in a legal or economic sense. From its founding, what with the emphasis on individual self-determination, it seems only natural that the United States would foster an ideology of home ownership. By owning their own property, farmers on the expanding frontier could be free of the feudal oversight of landlords and control their own destinies. Of course, farms are both home and business, which further encourages ownership on the part of their occupants. As historian Lawrence Vale has pointed out, by the second half of the nineteenth century, more and more Americans were living in towns and cities, and fewer were living where they worked. Nonetheless, the frontier mentality still held, and the idea of home ownership as a good, American thing remained common. Although only 7 percent of Bostonians were homeowners in 1880, that figure rose to 25 percent by 1900 and 35 percent by 1910. This increase was made possible by the development of new neighborhoods and suburbs away from the original city center. At the beginning of the twentieth century, non-farm home ownership became a driving force in housing policy in the United States, at several different levels of government. Zoning laws and tax codes increasingly encouraged home ownership, and realtor and builder trade organizations began to promote home ownership as the apotheosis of American values. 
As Vale writes, “the home could be lauded as the superiority of individualism to anarchosyndicalism or other socialist or communist movements.” This all came together in an extraordinary promotional campaign launched in the early 1920s by the National Association of Real Estate Boards (NAREB), working with the US Department of Labor and other groups. The “Own Your Own Home” campaign made home ownership practically a patriotic duty. Not only that, home ownership was equated with manliness and power, the rugged frontiersmen of the nineteenth century reborn in the suburban bungalows of the twentieth. This campaign, along with other policy initiatives, consistently denigrated renting or tenancy, and offered little to address the plight of the slum-dwelling poor (who were in fact vilified). The government’s role in housing was seen to be only as a facilitator, not as a primary provider. Undoubtedly, the “Own Your Own Home” campaign both shaped and reflected public opinion. We see that the ideology of home ownership present in the US housing bubble of the 2000s had deep roots, both in the culture and in decades-old government policy. In the United Kingdom, home ownership in the twentieth century has been seen as a foundation of democracy. Margaret Thatcher in fact hailed the “property-owning democracy,” and under her direction, two million government-owned residences were sold to private buyers. Since her time in the 1980s, successive British governments strongly encouraged home ownership. These policies had the dual effect of temporarily expanding home ownership while supporting an extraordinary rise in prices. For many years, this was not seen as bad news. As Faisal Islam says, “Housing is the only basic human need for which rapid price rises are met with celebration rather than protest.” The power of the home ownership meme helped make some people exceptionally vulnerable to predatory lenders during the housing boom. There is nothing new about lenders who prey on the desperation of their clients: loan sharking has long been a cornerstone of organized crime, payday lenders and check-cashing services make their money on people who need their money sooner rather than later, and pawnbrokers have been around forever. But lenders for buying houses are different, or at least they used to be. To paraphrase one British banking executive, during the sub-prime lending boom, it was as if mortgage lenders changed from being like doctors, with the best interests of the client (at least in terms of the ability to actually pay for a loan) at the forefront, to being like bartenders, plying customers with more and more whether or not they could handle it. Statistics during the housing boom in the United States showed that subprime loans were most common in the most overheated housing markets of California, Florida, Nevada, and Arizona, where they accounted for between five and ten new mortgages per one hundred housing units in 2005. Subprime loans were also disproportionately concentrated in zip codes with larger African American and Hispanic populations. To be in a position to obtain a mortgage, even a subprime one, buyers usually have some income and are almost certainly not homeless. They are far from hitting economic rock bottom; although they may be poorer or have worse credit than average, they are not conventionally financially desperate. During the boom, the vulnerability of subprime borrowers to their lenders was not based on an immediate need of financial rescue or resolution. 
Instead, they were seduced by cheap credit, or at least money that looked cheap (remember money illusion), and the possibility of substantial financial gains in a booming housing market. Prey become vulnerable to predators for all sorts of reasons. It would be easy to blame greed for why borrowers took on more than they could handle, but I suspect that fear was also an important factor. For poorer first-time buyers and those whose bad credit had kept them out of their own house, the booming real estate market would make the prospect of joining or rejoining the ranks of homeowners seem ever more distant. We have seen that the home ownership meme is very powerful and pervasive—to not be a homeowner is to not be a full participant in democratic society. The fear of not ever being able to own a home, to be permanently priced out of a market, is a great motivator to buy now. Some regions of the United States, such as the large urban centers of the Northeast, have long accepted renting as an acceptable and realistic alternative to home ownership. It is perhaps not surprising that the subprime crisis was not much of a crisis in these regions. Instead, it hit hardest in the Sun Belt states, with geographically expansive real estate markets and an abundance of new home construction. These areas also have more migration into them—they make up some of the new frontiers of American life. The home ownership meme may be more powerful in these areas, and the failure to live up to it more acutely felt. The idea of home ownership is culturally constructed, but the power of feelings related to home runs much deeper. They affect our emotions, which in turn affect our ability to make decisions, financial or otherwise. Excerpted with permission from "Home: How Habitat Made Us Human" by John S. Allen. Available from Basic Books, a member of The Perseus Books Group. Copyright © 2015. All rights reserved.The Home Ownership Meme Can Be Very Powerful Most of us are familiar with the concept of the “internet meme.” Someone does something, say “planking” on a fast-food counter, and uploads a video of him- or herself doing this to a public or social media site. Other people see the video, emulate the activity, and post themselves doing this to even more sites. Before you know it, the activity has spread like an epidemic throughout the world. Another internet meme has been born. Evolutionary biologist Richard Dawkins coined the term meme decades ago, but it really did not become a widespread meme itself until the internet age.15 Dawkins’s concept of meme was that it is an analog to gene—a replicator that is subject to natural selection and other evolutionary forces just as genes are, but in the cultural rather than the biological environment. Although memes spread through the cultural environment, competing with other memes and undergoing changes or mutations that make them more or less appealing, the ultimate selective environment for them is the human brain. This means that cultural memes that tap into the fundamental cognitive drives and preferences of the mind may be quite powerful and pervasive and important. For example, there has been much investigation of the evolutionary “memeplex” of religious belief systems, which are obviously of critical importance to our species as a whole. 
I suggest that another very important meme, at least in the United States, the United Kingdom, and some other countries, is “home ownership is good.” This cannot be a particularly ancient meme, because home ownership is a concept that emerges from relatively modern capitalist economies. On the other hand, the notion that people somehow control and possess the structures (or living spaces) they inhabit is probably much more ancient. Indeed, the security and stability that are part of feeling at home likely derive from a sense of possession of a structure, if not ownership in a legal or economic sense.

Given its founding emphasis on individual self-determination, it seems only natural that the United States would foster an ideology of home ownership. By owning their own property, farmers on the expanding frontier could be free of the feudal oversight of landlords and control their own destinies. Of course, farms are both home and business, which further encourages ownership on the part of their occupants. As historian Lawrence Vale has pointed out, by the second half of the nineteenth century, more and more Americans were living in towns and cities, and fewer were living where they worked. Nonetheless, the frontier mentality still held, and the idea of home ownership as a good, American thing remained common. Although only 7 percent of Bostonians were homeowners in 1880, that figure rose to 25 percent by 1900 and 35 percent by 1910. This increase was made possible by the development of new neighborhoods and suburbs away from the original city center.

At the beginning of the twentieth century, non-farm home ownership became a driving force in housing policy in the United States, at several different levels of government. Zoning laws and tax codes increasingly encouraged home ownership, and realtor and builder trade organizations began to promote home ownership as the apotheosis of American values. As Vale writes, “the home could be lauded as the superiority of individualism to anarchosyndicalism or other socialist or communist movements.” This all came together in an extraordinary promotional campaign launched in the early 1920s by the National Association of Real Estate Boards (NAREB), working with the US Department of Labor and other groups. The “Own Your Own Home” campaign made home ownership practically a patriotic duty. Not only that, home ownership was equated with manliness and power, the rugged frontiersmen of the nineteenth century reborn in the suburban bungalows of the twentieth. This campaign, along with other policy initiatives, consistently denigrated renting or tenancy, and offered little to address the plight of the slum-dwelling poor (who were in fact vilified). The government’s role in housing was seen to be only as a facilitator, not as a primary provider. Undoubtedly, the “Own Your Own Home” campaign both shaped and reflected public opinion. We see that the ideology of home ownership present in the US housing bubble of the 2000s had deep roots, both in the culture and in decades-old government policy.

In the United Kingdom, home ownership in the twentieth century has been seen as a foundation of democracy. Margaret Thatcher in fact hailed the “property-owning democracy,” and under her direction, two million government-owned residences were sold to private buyers. Since her time in office in the 1980s, successive British governments have strongly encouraged home ownership.
These policies had the dual effect of temporarily expanding home ownership while supporting an extraordinary rise in prices. For many years, this was not seen as bad news. As Faisal Islam says, “Housing is the only basic human need for which rapid price rises are met with celebration rather than protest.”

The power of the home ownership meme helped make some people exceptionally vulnerable to predatory lenders during the housing boom. There is nothing new about lenders who prey on the desperation of their clients: loan sharking has long been a cornerstone of organized crime, payday lenders and check-cashing services make their money on people who need their money sooner rather than later, and pawnbrokers have been around forever. But lenders for buying houses are different, or at least they used to be. To paraphrase one British banking executive, during the subprime lending boom it was as if mortgage lenders changed from being like doctors, with the best interests of the client (at least in terms of the ability to actually pay for a loan) at the forefront, to being like bartenders, plying customers with more and more whether or not they could handle it.

Statistics during the housing boom in the United States showed that subprime loans were most common in the most overheated housing markets of California, Florida, Nevada, and Arizona, where they accounted for between five and ten new mortgages per one hundred housing units in 2005. Subprime loans were also disproportionately concentrated in zip codes with larger African American and Hispanic populations. To be in a position to obtain a mortgage, even a subprime one, buyers usually have some income and are almost certainly not homeless. They are far from hitting economic rock bottom; although they may be poorer or have worse credit than average, they are not conventionally financially desperate. During the boom, the vulnerability of subprime borrowers to their lenders was not based on an immediate need of financial rescue or resolution. Instead, they were seduced by cheap credit, or at least money that looked cheap (remember money illusion), and the possibility of substantial financial gains in a booming housing market.

Prey become vulnerable to predators for all sorts of reasons. It would be easy to blame greed for why borrowers took on more than they could handle, but I suspect that fear was also an important factor. For poorer first-time buyers and those whose bad credit had kept them out of their own house, the booming real estate market would make the prospect of joining or rejoining the ranks of homeowners seem ever more distant. We have seen that the home ownership meme is very powerful and pervasive—to not be a homeowner is to not be a full participant in democratic society. The fear of not ever being able to own a home, to be permanently priced out of a market, is a great motivator to buy now.

Some regions of the United States, such as the large urban centers of the Northeast, have long regarded renting as an acceptable and realistic alternative to home ownership. It is perhaps not surprising that the subprime crisis was not much of a crisis in these regions. Instead, it hit hardest in the Sun Belt states, with geographically expansive real estate markets and an abundance of new home construction. These areas also have more migration into them—they make up some of the new frontiers of American life. The home ownership meme may be more powerful in these areas, and the failure to live up to it more acutely felt.
The idea of home ownership is culturally constructed, but the power of feelings related to home runs much deeper. They affect our emotions, which in turn affect our ability to make decisions, financial or otherwise.

Excerpted with permission from "Home: How Habitat Made Us Human" by John S. Allen. Available from Basic Books, a member of The Perseus Books Group. Copyright © 2015. All rights reserved.

Published on January 09, 2016 12:30