Atlantic Monthly Contributors' Blog

November 28, 2015

On Pandering and Julia Child: The Week in Pop-Culture Writing


On Pandering
Claire Vaye Watkins | Tin House
“I wrote Battleborn for white men, toward them. If you hold the book to a certain light, you’ll see it as an exercise in self-hazing, a product of working-class madness, the female strain. So, natural then that Battleborn was well-received by the white-male lit establishment: It was written for them. The whole book’s a pander.”

Thanksgiving, the Julia Child Way
Julia Moskin | The New York Times
“No matter how busy, Mrs. Child would hand off whatever kitchen task she was doing, take the phone and talk the nervous cook down from the ledge … But Mrs. Child refused to unlist her number or turn off the phone; instead, she embraced the role of national Thanksgiving commander in chief.”

The Most Famous American Dog on Instagram
David Shapiro | The New Yorker
“Braha was annoyed with these e-mails. She was upset that Marnie was referred to as merely an ‘influencer.’ She insisted, ‘Marnie’s not an influencer—she’s a celebrity. When you’re stopped constantly on the street, you’re not an influencer—you’re a celebrity.’ She shook her head. She clicked through a few more. ‘Hillary’s campaign reached out for a photo. I need to follow up on that ...’”

In Praise of Mega Man X
Levi Rubeck | Kill Screen
“Going fast is easy—the challenge is in reacting to the unwritten near-future while maintaining environmental awareness to avoid running into shit. For all the risks to life and limb, the human brain and body craves the thrill of speed. As such, even relatively primitive virtualized acceleration titillates.”

The Art of the Strange Writing Exercise
Nick Ripatrazone | The Millions
“We might be romantic and say that teacher and student need to create art through imagination, but in education, form is function. We need to shake things up in the creative-writing classroom. We need to remember that writing is a messy, fractured, intensely personal pursuit that must not be neutered by the institutional needs of our classrooms.”

In Conversation with DeRay Mckesson
Rembert Browne | New York
“To some degree, what you are doing is simultaneously selfless and a privilege, in the sense that there are people who want to be out there every day, but just have a nine-to-five and can’t. So it’s as if, at times, you’re speaking on behalf of those people who want to be there. But it’s also a privilege to be able to do it.”

Why Marvel’s Jessica Jones Survivor Narrative Is So Powerful
Pilot Viruet | Flavorwire
“It’s all heavy, heavy subject matter for any kind of art, let alone a comic book adaptation, but it’s ultimately brilliant and necessary. Comic books provide an escape for many readers, solace in finding a powerful narrative that helps them in their real life; Marvel’s Jessica Jones goes a step further and provides a realistic, cathartic narrative for survivors.”

The Mournful Sci-Fi Masterpiece Behind Amazon’s Splashy Man in the High Castle
Laura Miller | Slate
“The series’ creators have tried to pump up its premise into something that can sustain a 10-episode season (or more) by giving Dick’s dystopia an element that it utterly lacks in the book: an insurgency dedicated to fighting the twin fascist regimes that control the former United States. The people in Dick’s novel never consider resistance. They’re not heroes, and that, paradoxically, is exactly what makes them so arresting.”











Published on November 28, 2015 05:00

Early Black Friday Sales Numbers Are Bunk


Every year, shortly after Black Friday, the National Retail Federation issues its estimate of all the retail spending over the first weekend of the holiday season. And every year, media outlets repeat that estimate in earnest: Last year, citing NRF data, The New York Times ran the headline “Black Friday Fatigue? Thanksgiving Weekend Sales Slide 11 Percent.” The Wall Street Journal went with “‘Black Friday’ Fades as Weekend Retail Sales Sink.” Such headlines are often taken as indicators of holiday sales in general, and even consumers’ confidence in the economy.

But there’s another, much less visible, component of this annual tradition: Every year, a man named Barry Ritholtz, an asset manager and contributor to The Washington Post and Bloomberg View, complains about the quality of the NRF’s data and the media’s mindless repetition of it. In last year’s installment of his ongoing rant, he wrote, “I have become a curmudgeon on this.”

Ritholtz is mad about two things. First, he takes issue with the NRF’s methodology. As he explained to the podcast On the Media last year:

What the survey that the National Retail Federation does every year is they ask a group of shoppers, “Hey, what did you spend on holiday shopping last year?” and they get some number, and then say, “Well, what are you going to spend on this year?” And they get a second number and then the difference between those two numbers is how they come up with [it].

In other words, the NRF data that major newspapers rely on is self-reported, and self-reported data on spending is notoriously weak. It’s just not very useful to know what shoppers say they expect their future spending to be. They are going to be wrong. (ShopperTrak, an analytics firm whose data was used by the Times this year to pronounce a “slip” in sales, uses a methodology that Ritholtz finds similarly suspicious.)

The second thing that irritates Ritholtz is how this (already questionable) data is used to make claims about consumer sentiment at large. In 2005, the year Ritholtz first aired his grievances (to a reporter at The Wall Street Journal), the NRF estimated that sales the weekend after Thanksgiving were up 22 percent, but holiday spending that season ended up increasing only 1 percent over the previous year. The most egregious instance was in 2009, when the NRF’s 43 percent estimated decrease was rendered ridiculous by the 3 percent increase revealed after an actual tally.

The NRF, for its part, has claimed that it doesn’t present its estimates as definitive, and further, says that it has confidence in the validity of its survey methodology.

Last year on On the Media, one host, Bob Garfield, posed an important question to Ritholtz. “Why is it in the NRF's interest to be trading these silly numbers year after year, especially when the numbers point downwards?” he asked. To which Ritholtz replied: “It keeps the shopping season front and center.”









Published on November 28, 2015 04:00

November 27, 2015

A Planned Parenthood Clinic Shooting in Colorado


Local police responded to reports of an active shooter at a Planned Parenthood clinic in Colorado on Friday. At least four officers and “multiple civilians” were injured in what the Denver Post characterized as an “ongoing active shooter situation.”

Details about the shooting remain fluid. A 911 call placed from the Colorado Springs clinic, located about 70 miles south of Denver, first reported a gunman at about 11:38 a.m. local time. A Colorado Springs Police Department spokesperson told The Gazette, a local newspaper, that police were “actively engaged” with the gunman inside the Planned Parenthood facility.

White House officials said that President Obama had been briefed about the shooting. Planned Parenthood’s national organization also tweeted that they were monitoring the situation.

We’re monitoring the situation at the #ColoradoSprings PP health center. Right now our concern is for the safety of patients, staff & police

— Planned Parenthood (@PPact) November 27, 2015

This article will be updated as the story develops.











Published on November 27, 2015 14:27

Breakfast Cereal's Last Gasp


Last year, General Mills launched a new product aimed at health-conscious customers: Cheerios Protein, a version of its popular cereal made with whole-grain oats and lentils. Early reviews were favorable. The cereal, Huffington Post reported, tasted mostly like regular Cheerios, although “it seemed like they were sweetened and flavored a little more aggressively.” Meanwhile, ads boasted that the cereal would offer “long-lasting energy” as opposed to a sugar crash.

But earlier this month, the Center for Science in the Public Interest sued General Mills, saying that there’s very little extra protein in Cheerios Protein compared to the original brand and an awful lot more sugar—17 times as much, in fact. So why would General Mills try to market a product as containing protein when it’s really a box full of carbs and refined sugar?

The easiest answer: Because history has shown it works. For more than a century, brands have successfully used health-related claims and gimmicky marketing to sell sugar as a breakfast product. The earliest cereal products were indeed intended as a better alternative to heavy morning meals. Around the 1930s, companies started marketing cereal to children, and they found that younger consumers infinitely preferred a sweeter product, while also being swayed by lovable mascots in advertising campaigns. (A 2014 study even found that these characters are designed and placed on the shelf to make eye contact with kids, establishing a sense of trust and connection.) But brands are now finding it more difficult to convince better-informed and rightly skeptical consumers of the health benefits of sugary cereal, which looks to be falling from the perch it enjoyed in American food culture for a century.

John Harvey Kellogg, a doctor and Seventh-day Adventist who ran a wellness retreat called the Battle Creek Sanitarium, is credited with inventing ready-to-eat cereal in 1878. Granula, as it was originally called, was designed to help treat illnesses such as dyspepsia. “The Sanitarium wanted something to give its patients instead of breakfasts with sausages and eggs and bacon,” says Martin Gitlin, the author of The Great American Cereal Book. Kellogg’s brother went on to found what would become the Kellogg Company, still one of the biggest purveyors of breakfast cereal today.

The success of granula (or corn flakes) inspired Grape-Nuts, which contain neither grapes nor nuts and were invented by C.W. Post after a stay in Kellogg’s sanitarium. Early ads claimed Grape-Nuts could do everything from curing the desire for liquor to preventing malaria. A senior brand manager for the product in the 1980s told The Wall Street Journal that Grape-Nuts was “people eating advertising.”

But cereal’s rise as an example of the power of marketing largely took place from the 1950s onward. Companies began to take advantage of the new full-scale commercial broadcasting that began in 1947 to advertise to young Baby Boomers via their television sets. By this time, Kellogg’s and Post’s dreams of cereal as a kind of medicine had already begun to fade. According to Gitlin, a salesman from Philadelphia named Jim Rex saw his kids adding sugar to their cereal and invented the first pre-sweetened cereal, Wheat Honnies, in 1939. Along with the cereal came a mascot to help market it—Ranger Joe. From then on, these friendly characters became a crucial part of selling cereal. “TV advertisements were absolutely huge and had tie-ins in the ’50s and ’60s with cartoons and Westerns,” Gitlin says. The phenomenon went on to include iconic characters like Tony the Tiger (Frosted Flakes) and Snap, Crackle, and Pop (Rice Krispies).

But starting in the ’60s and ’70s, sugar became slightly less of a selling point. Accordingly, the word itself began vanishing from ads and boxes, only to be replaced by subtler terms like “honey” and “golden.” Products that didn’t adapt quickly enough, like Sugaroos, suffered as parents caught up with new science. But, as new products like Cheerios Protein indicate, the products themselves didn’t get any less sweet.

Kids today don’t identify with cereal as much as the older generations once did.

Today, thanks to the shrewd marketing of companies like Post and Kellogg, breakfast cereal still has a nostalgic hold on many. It’s what’s inspired a Cereality Café franchise with stores in places like Texas and Virginia. It’s also what spurred Scott Bruce, the author of Cerealizing America, to try to open a cereal museum called FlakeWorld on the Las Vegas strip. And yet most of these efforts have failed: Bruce raised a few million in funding before the dot-com bubble burst and investors abandoned the project, and most shops in the Cereality Café franchise have shuttered, deemed to be too gimmicky.

Perhaps more importantly, residual nostalgia for cereal hasn’t translated into sales. Cold-cereal consumption has decreased by at least one percent a year over the last decade, according to a report by the market-research firm Stealing Share. Likewise, America’s focus on cutting out sugar has led to declining consumption of full-calorie soda (with a decrease of seven percent between 2010 and 2013 alone), while sales of bottled water and diet sodas have increased. Meanwhile, yogurt, especially Greek yogurt in recent years, has gained favor among Millennials and older consumers, in part because of its ability to diversify with no-added-sugar, probiotic, and on-the-go options.

Research has shown that cereal consumption decreases with age, and Millennial birth rates are declining. According to Mike Van Ausdeln, a senior brand strategist at Stealing Share, kids today don’t identify with cereal as much as the older generations once did. This has led cereal companies to target older demographics, including Baby Boomers and their now-adult children, by making dubious health claims regarding protein. Last year, Kellogg even started running an ad showing an adult couple eating Froot Loops and playing the original Nintendo—as transparent an effort as any to tap into fond cereal memories.

It isn’t the very end for cereal: Sales of hot cereals like oatmeal, which need to be prepared, have increased in recent years as health-conscious consumers conclude that the less-processed grains are high in fiber and protein and therefore worth the time. (Hot cereal is also set to increase sales globally as it targets Asian markets where starting the day with a hot breakfast is the norm.) So what’s the solution for companies looking to make cold breakfast cereal a part of American food culture once again? Cereal companies clearly know what increasingly health-conscious consumers are looking for. The way forward, then, may be actually filling the box with what it claims.











Published on November 27, 2015 05:00

Working at JCPenney on Black Friday


I just wanted to go to Greece. See the Parthenon and drink wine with my buddies in the Religious Studies department at the University of Oklahoma where I was in school. But I needed $1,200 to make that happen. Rather than ask my parents for the money, I applied to be a seasonal employee at JCPenney. They accepted me immediately—I’d just come off a summer stint working at another location. I was to work in the catalog department and my first day would be the day retail employees dread most: Black Friday.

The best thing about working in the catalog department is that you don’t have to deal with long lines of customers waiting to check out. Occasionally, some would trickle over when they realized we could perform the task as easily as any other checker, but mostly we only saw customers if they wanted to order an item or to get a free box. The latter was actually more difficult than the former.  

Customers were allowed one free box for each item they purchased. There were three sizes: small, medium, and large. Small would fit a necktie. Medium would fit an average piece of clothing and large would fit a heavy coat or blanket. As you might imagine, medium boxes were by far the most popular. And soon enough, we ran out.

I was standing (sitting wasn’t allowed) at the desk when a woman approached and asked for medium boxes. “I’m sorry, but we’re all out of medium boxes today,” I said. “What do you mean you’re out of medium boxes?” she replied.  She wasn’t quite irate but she wasn’t happy either. “We just don’t have any more to give out,” I responded. I’d realized throughout our interactions that this woman looked somewhat familiar. Did she attend my church or work at a school I’d gone to? “What’s your name?” I asked. “What do you mean, what’s my name?” she said. “You look familiar. I think I may know you.”

She told me her name and I realized the connection immediately: She was a friend of my mom’s! I told her how I knew her and her entire manner changed. She smiled. She asked after my mother and my family and she stopped caring whether she went home with her medium boxes.

That one interaction changed how I think about class and what it’s like to work in the service industry. When I was a nameless 20-year-old at the JCPenney counter, a customer didn’t care how she treated me. But once I was the daughter of a friend, she was warm and perhaps embarrassed at her previous behavior.

I worked that job until New Year’s Eve. I made my $1,200. But instead of going to Greece as I had planned, I used it to attend the United Nations Commission on the Status of Women instead.











Published on November 27, 2015 04:15

Milli Vanilli, Pop Music's Original Fakes


One of the stranger images in pop culture this year has been the one above, of Drake’s face pasted onto the body of a Milli Vanilli member. It came courtesy of Meek Mill, the rapper who picked a fight on Twitter over the summer by claiming that Drake doesn’t write his own songs. In one of the diss tracks to result, Mill (nickname: “Meek Milli!”) called Drake a “Milli Vanilli-ass n*****.” T-Pain, commenting on the controversy, boiled it down to being a “Milli Vanilli thing.”

Among the many important implications of this headline-making beef is the notion that, despite or perhaps because of the best efforts of some of pop culture’s watchdog forces, Milli Vanilli hasn’t been forgotten. November 27 marks a quarter century since the Grammys revoked the Best New Artist trophy from the act whose songs, it turned out, were sung not by the European models Fab Morvan and Rob Pilatus but by uncredited musicians working with the producer Frank Farian. It’s one of the most important scandals in pop history, especially when viewed in the context of today’s cultural wars over realness and fakeness.

Morvan and Pilatus always maintained that they were suckered by Farian, who recruited them for their looks and presented them with a catchy demo track that they, despite his promises, were never given a chance to rerecord. As that track, “Girl You Know It’s True,” rose to the top of the charts internationally, the two became sensations who gyrated (with great finesse and charisma) for screaming masses. In one spectacularly ill-advised quote, they told a Time reporter that they were more talented than Bob Dylan or Paul McCartney. A backing-track malfunction at a show fed rumors that they were just lip syncing, but it all really began to unravel when a singer named Charles Shaw said that he was the real voice on “Girl You Know It’s True,” after which Farian owned up about what he’d created.

The response from the public and the music industry was furious. Class-action lawsuits were filed, with a resulting court settlement allowing consumers to seek refunds for their Milli Vanilli merch. Said a 9-year-old former fan quoted in the L.A. Times in 1990, “I think they’re dirty scumbuckets.” Said the president of the Recording Academy upon the revocation of the Grammy, “I hope this revocation will make the industry think long and hard before anyone ever tries to pull something like this again.” Rob Pilatus died of an overdose in 1998, shortly after the Behind the Music special about the band aired.

As far as the public knows, no one has quite tried to “pull something like this again,” if “this” means “drastically lying in liner notes and publicity about the provenance of hit songs.” But Milli Vanilli’s influence is a bit counterintuitive: Their fall from stardom presaged more artifice in pop, not less. In his recent book The Song Machine, John Seabrook details the complex, well-funded apparatus of songwriters and producers and executives who mint most pop songs these days, an apparatus that isn’t hidden from listeners but is also not fully understood by most of them. The system in part has its roots in the work of Lou Pearlman, the now-incarcerated entrepreneur who assembled the Backstreet Boys after witnessing the backlashes to Milli Vanilli and New Kids on the Block—backlashes that were both rooted in the idea that talentless people were lying about being talented. According to Seabrook, “Pearlman wondered what an urban-sounding group of five white boys who really could sing might do in the marketplace.”

130 million records sold later, we have an answer. It’s not often appreciated that a core part of the appeal of the Backstreet Boys—and the other boy bands that followed, some also managed by Pearlman—was that its members really could sing. These teenagers, brought together by newspaper ads to wiggle in CGI videos to songs written by Swedish studio pros, were positioned as a more “authentic” alternative to the likes of Milli Vanilli. Which puts a pretty fine point on what a shell game authenticity can be. Today, autotune is everywhere, producers spend hours “comping” vocals (i.e. piecing together the very best syllables out of dozens of takes), and the Backstreet Boys songwriter Max Martin has nearly as many No. 1 hits as the Beatles.

The Backstreet Boys were positioned as a more authentic alternative to Milli Vanilli, which shows just what a shell game authenticity is.

None of this is a secret. For members of the public still subscribing to the idea that musical merit should be connected to natural musical ability, vocal prowess is often the saving grace, the exonerating factor, that lets them enjoy many of the slickest acts working today. It is essential to the myth of One Direction, for example, that they have verified pipes, that they were generated from a talent contest. Lady Gaga, whose shtick parodies and embraces the insane fakeness of pop, is often praised because she “really can sing.” Or think about what sets Adele apart: She uses many of the same songwriters as the rest of Top 40—Max Martin’s on 25—but she sells more in large part because of how she belts. Morvan and Pilatus didn’t have the fig leaf of vocal talent to protect them once the provenance of their music became clear. But the songs they fronted were as undeniably catchy—if saccharine, over-earnest, cheesy—as they were before the ruse was up.

As for Drake, he seems to have weathered the accusations of Milli-Vanilli-ness just fine. He’s a rapper and a singer, and his delivery—sometimes cocky, sometimes smooth, often sounding unlike much else on the radio—is undeniably his. Plus he can make the case that he’s been honest about the authorship of his work. Quentin Miller, the man Mill said wrote some of Drake’s best recent bars, is credited on a few Drake songs (but not all of the ones at issue). And hip-hop has been collaborative since the days when Eazy-E was recording Ice Cube lyrics. “If I have to be the vessel for this conversation to be brought up—you know, God forbid we start talking about writing and references and who takes what from where—I’m OK with it being me,” Drake said recently about Mill’s ghostwriting accusations. The conversation that he was talking about has included incensed radio rants and profane diss tracks, but it’s still been less brutal than the reckoning that faced Milli Vanilli.











Published on November 27, 2015 04:00

Europeans Are Flying Across the Atlantic to Participate in Black Friday


My pharmacist here in Copenhagen boards a plane on Thanksgiving Day and flies eight hours to New York just to shop the Black Friday sales; she does this every year. So does one of my neighbors.

When I first heard about this, four years ago, I put it down to the crazy prices that prevail in Copenhagen, where everything costs three to five times as much as it does in the U.S.  Even taking into account the cost of a round-trip plane ticket and a few days in a hotel, people could still save quite a bit by arbitraging the dramatic price differences between the U.S. and Denmark on clothing, cosmetics, and especially electronics (though they would have to buy a lot). Go with an empty suitcase and come back ahead in terms of total expenses.

Then I found out that this phenomenon isn’t unique to the Danes. The travel industry has built a niche market around this event: There is a whole suite of package tours catering to foreigners who want to participate in the Black Friday bacchanal. Over the past few years, tourists from all over Europe, Asia, and Latin America have paid to fly to the U.S., get up in the middle of the night, and wait in long lines to shop.

The first shopping day after Thanksgiving has evolved from a practical matter of snagging great deals into a spectacle that outsiders find exotic and entertaining—the capitalist equivalent of Spain’s Running of the Bulls. In Pamplona, you get frenzied crowds, violence and death; a Black Friday trip gives you all that, plus the chance to score a plasma television.

Surprisingly, the Black Friday custom seems to be migrating to other countries. The U.K. and Brazil, for example, have made the day their own (for a little history of how the holiday came to Britain, The Economist has you covered). Now that the dollar is strong against many world currencies, there is likely to be more of this in the future. Black Friday may join Halloween as another American export embraced around the world.

But what happens to Black Friday when it is severed from the Thanksgiving context? What do foreign countries get when they extract the rabid consumerism from a four-day holiday ritual, leaving behind the ceremony of gratitude and the time with family and friends? From an anthropological point of view, one might say these new adopters of Black Friday are getting the profane without the balance of the sacred.

The economist and moral philosopher Albert Hirschman once wrote that there are basically two arguments about the impact of capitalism on social life: One school of thought has it that commerce barbarizes human relations, while the other view holds that exchange makes individuals behave in a more civilized manner toward each other. For the civilizing impact of commerce to work, people have to be mindful of their interdependency with others and their self-interest in maintaining those relationships. Thanksgiving functions in part as a ceremony of gratitude for precisely those interdependencies and relationships.

In the U.S., there’s been a recent backlash against Black Friday sales that encroach on the sacred space of Thanksgiving. Stores that used to open one minute after midnight on Friday are now returning to normal business hours for the day after Thanksgiving; others, like REI, have even opted to forgo Black Friday entirely, urging customers to spend the day in non-commercial pursuits.

Some have attributed this to the easy availability of online shopping. But that doesn’t square with the rise of Black Friday outside the U.S.: People in Europe, Asia, and Latin America have access to Internet retail, but that has not prevented the adoption of American-style shopping at its most promiscuous. This may be because those societies lack the cultural context that Thanksgiving provides in the U.S.: If Black Friday is an expression of the capitalist Id, Thanksgiving acts as the Ego, reminding Americans that there are better things to do than shop.











Published on November 27, 2015 04:00

November 26, 2015

The Rise of Anti-Black Friday Branding


It starts with a scene of touch football in the yard. Next, a woman and a girl, cooking together in the kitchen. “Imagine a world,” a soothing voice intones, “where the only thing you have to wrestle for on Thanksgiving is the last piece of pumpkin pie, and the only place we camped out was in front of a fire, and not the parking lot of a store.” And, then, more scenes: a man, cuddling with kids on a couch. An older woman, rolling pie dough on the counter. A fire, crackling in the fireplace. Warmth. Wine. Togetherness. Laughter.

It’s an ad, unsurprisingly, but it’s an ad with a strange objective: to tell you not to buy stuff. Or, at least, to spend a day not buying stuff. “At T.J. Maxx, Marshalls, and HomeGoods, we’re closed on Thanksgiving,” the spot’s velvet-voiced narrator informs us, “because family time comes first.” And then: more music. More scenes of familiar/familial delights. More laughter. More pie. The whole thing concludes: “Let’s put more value on what really matters. This season, bring back the holidays—with T.J. Maxx, Marshalls, and HomeGoods.”

It’s a great ad, and not just because, as with most great ads, it ends up making you kind of hungry for pie. The spot also accomplishes a canny rhetorical trick: It features brands invoking—on the surface, at least—a boycott against themselves. It is marketing in the guise of anti-marketing. But while T.J. Maxx, Marshalls, and HomeGoods may be particularly vocal in denouncing the hyper-commercialism of the Thanksgiving weekend, they are definitely not alone in it. This year Apple, Costco, Crate & Barrel, IKEA, Nordstrom, Sam’s Club, Staples, and many other retailers—in response to the discount creep that has led to Black “Friday” sales commencing on Thursday—have announced that they, too, will be closed on Thanksgiving. (Also known, in this context, as “Black Friday Eve” and “Black Thursday” and, slightly more poetically, “Gray Friday.”)

Many of the companies offer, in this, the same family-first logic that the Maxximalist brands gave in their ad: the secular sacredness of Thanksgiving, for customers and employees alike. As the discount shoe chain DSW put it in its own ad/explanation for its Thanksgiving Day closure, “Family time is extremely important to us, and we want our associates to enjoy the holiday with their loved ones.”

Marketing via moralism! And this year the Black Friday backlash—the day’s taking of its rightful place within the hype cycle—was brought to a logical extreme when the outdoor goods retailer REI announced that it would be closed not just on Thanksgiving, but also on the high holy day of American consumerism—Black Friday itself.

Black Friday is no longer just a day; it is a season. And also a state of mind.

REI framed this decision, just as T.J. Maxx and DSW did, as a matter of principle—a brave stand against the miasmic encroachments of the commercialized holiday season. “We think that Black Friday has gotten out of hand,” Jerry Stritzke, the president and CEO of REI, summed it up. He further explained that the idea for a commerce-free Black Friday stemmed from his employees’ “anxiety” about the longer store hours REI had been keeping, as per Black Friday custom, on that day. And so: REI made Black Friday a paid holiday for its employees. It also encouraged them to spend the day outdoors, exploring and adventuring and otherwise anti-Black Friday-ing.

Which is wonderful, and humane, and also a reasonable response to Black Friday’s creep and Black Friday’s death and—a combination of those two—Black Friday’s dissolution into the broader commercial phenomenon that is The Holiday Season. While an estimated 135.8 million people are expected to shop over this Thanksgiving weekend, both in stores and online, more are expected to do so on Cyber Monday than on any of those preceding four days. And a survey from the National Retail Federation estimates that 46 percent of holiday shopping will take place online this year—up from 44 percent last year (and the highest figure since the organization started tracking those numbers in 2006). The stuff of IRL Black Friday—the 4 a.m. lines, the “door buster” deals, the fights, the crowds, the generalized chaos that leads to articles with titles like “The Worst Black Friday Injuries and Deaths of All Time”—is diffusing, like so much else, into the cloud. Black Friday is no longer just a day; it is a season. And also a state of mind.

So while the retailers’ opposition to Black Friday—let’s put more value on what really matters—may be (may be) principled, it is also simply practical. Which is also to say: self-serving. The companies’ “brave stands” on behalf of the sanctity of the stuffed turkey are also, in another sense, capitulations. A study conducted by the research firm MarketLive found that roughly 65 percent of consumers “hate or dislike” the trend of retailers opening stores on Thanksgiving Day. (Only 12 percent of Americans surveyed said they “like it or love it.”) A recent article in Time helpfully pointed its readers to the retailers who “are the worst offenders pushing Thanksgiving Day store hours,” adding, to make the point extra-clear, “If you hate Thanksgiving shopping, here’s where to focus your anger.” The Facebook group Boycott Black Thursday encourages people to “#keepfamilyfirst and boycott retailers that extend Black Friday sales into Thanksgiving Day.”

In that sense, REI and T.J. Maxx and DSW and all the other retailers that will prioritize gut-busting over door-busting this week are simply tapping into public frustration—not with commercialism itself, because come on, but with the particular spectacle that Black Friday has become. Black Friday (almost entirely, of course, because of the dedicated efforts of retailers like REI and T.J. Maxx and DSW), has at this point taken on a Darwinian tone. It is competitive. It is zero-sum. It is hungry and wild-eyed and sweaty and pushy and desperate. It runs counter to deeply held cultural mythologies about what the holiday season is supposed to represent: “community,” “generosity,” “warmth,” “love.”

Black Friday is hungry and wild-eyed and sweaty and pushy and desperate. It is competitive. It is Darwinian.

The brands that are embracing the Black Friday backlash are offering themselves up, in general, as orderly alternatives to this madness. Shopping should be civilized, they insist. It should know its place, they scold. The retailers here aren’t saying “don’t shop with us”; they’re simply saying “shop with us literally any other day of the holiday season.” And they aren’t limiting that messaging to ads alone. REI cloaked its flagship store in a banner reiterating the news of its Black Friday closure. It is making ample use of the hashtag #OptOutside, encouraging people to ditch the malls in favor of nature (with the help, ostensibly, of the coats and packs and tents sold by REI). And its anti-Black Friday stance has won the brand media coverage that doubles, conveniently, as free advertising. Stritzke has taken full advantage of all that, telling ABC News about his employees’ “tears” of gratitude at the announcement of REI’s anti-Black Friday, and then noting that he expects few retailers to follow REI’s principled lead—because, after all, “it’s hard to leave money on the table.”

Which: It is! But it’s also an open question whether money will be lost or, actually, earned through all this Black Friday backlashing—and an open question, more broadly, how consumers will react to retailers’ embrace of a commerce-free introduction to the holiday season. With “commerce-free,” of course, being a relative term. The let’s reclaim Thanksgiving spots from T.J. Maxx and its fellow Maxxinista brands (they come in a series, each more treacly than the last) aren’t just tributes to the small satisfactions of the family dinner; they are also—via stylishly furnished homes, via stylishly set tables, via stylishly dressed actors—ads for the goods that are sold at … yeah. Look at all those people, getting maxx style at maxx savings! If only there were a way to be like them!

And all these ads encouraging their viewers to bring back the holidays and put more value on what really matters and the like fail to include an important logistical detail: that T.J. Maxx, Marshalls, and HomeGoods, like DSW and Apple and Nordstrom and Sam’s Club and pretty much every other retail giant out there, will be open for business bright and early on Friday morning. Because while family time may come first, shopping time, apparently, comes in a close second. And because some things, after all, really are sacred.











Published on November 26, 2015 08:00

The 16th-Century Origins of Food Porn


Thanksgiving is as good a time as any to remember that, for humans, eating is never just about the food. It’s also about ritual, be it the perfectly brewed cup of morning coffee or the annual appearance of grandma’s pumpkin pie. Presentation is what separates us from other animals. We all must eat, but only humans have gone beyond sustenance to make napkin origami and Mayflower centerpieces—and then post pictures of them on the Internet.

Though the contemporary phenomenon of food porn may feel like an Internet-era excess, there’s a long history of different cultures taking part in obnoxious public displays of meals. The Edible Monument: The Art of Food for Festivals, currently on display at the Getty Research Institute, considers the history of table decoration and food display in early modern Europe. The underlying message of these centuries-old examples feels echoed in contemporary TV cooking contests like Cake Wars or The Great British Baking Show: So much of eating is about spectatorship, about consuming feats of gastronomy with the eyes more so than the mouth. So lavish Pinterest planning and meticulous Instagram filtering of Thanksgiving dinner isn’t a corruption of the ages-old communal joys of eating—it’s a natural extension of them.

When it comes to party food especially, the sense of sight has always trumped the sense of taste. For Voltaire and other philosophes of the 18th century, taste was not a single sense but the act of discrimination in general, whether applied to painting or pastry. Its opposite was bad taste, or tastelessness. The meat mountains, fruit pyramids, and marzipan castles that graced princely and aristocratic tables from the 16th century onward may have pleased the palate, but they were primarily intended as feasts for the eyes: visibly expensive, fragile, and time-consuming to create, using hard-to-find ingredients like white sugar or out-of-season produce.

Centerpiece for the feast of Senator Francesco Ratta from 1693 (Getty Research Institute)

Taking the gluttonous feasts of ancient Rome as their models, Renaissance and baroque festivals—whether public or private—invariably featured beautifully displayed foods in vast quantities. One banquet thrown by the Duke of Newcastle at Windsor in the early 1700s included pickled crawfish, pigeon bisque, lamb roasted in blood, wild boar pie, a drawn and quartered pig, green geese and ducklings, a “stack of pies,” and two pyramids of fruit. Civil occasions were celebrated with temporary architecture and sculpture, including tree-like greased poles hung with fowl and other foods, which revelers could climb for sport and a tasty reward.   

These fabulous dishes, decorations, and table settings were meticulously depicted and widely disseminated in mass-produced prints, books, and broadsides, which was at least partly the point of going to all that trouble in the first place. Food was ephemeral, but food porn—the visual culture of food—had a long afterlife in print. As a result, these elaborate entertainments served as propaganda tools. Food festivals were about fostering community spirit, making political statements, displaying wealth, and creating shared memories and histories—the same emotional appetites that drive today’s competitive foodie culture. Though they may depict things like 60-foot-high centerpieces of molded sugar and triumphal arches of bread and cheese, these centuries-old books and images have the same kind of half-instructive, half-smug tone you’d find in the pages of Martha Stewart Living.

Drawing of a table with 100 place settings from 1747 (Getty Research Institute)

Because these monuments were more about spectacle than sustenance, they may have been technically edible, but they weren’t always eaten. Roasted peacocks and swans looked stunning on a banquet table, but they didn’t taste especially good. The nursery rhyme about “four and 20 blackbirds baked in a pie” isn’t just doggerel; a pie cut open to reveal live blackbirds who sing on cue was precisely the sort of culinary performance art that early modern partygoers expected. What the Germans called “Schau-Essen”—literally “show food”—was usually placed on the table as the grand finale to a meal, once everyone was already full, and not intended to be eaten.

The popularity of edible table decorations beginning in the late-15th century was closely linked to the growing availability of sugar, imported to Europe from Africa and the Caribbean. On a state visit to Venice in 1574, Henry III sat down to a lavish “sugar course” and lifted his napkin, only to have it break apart in his hand. The king was amused to discover that not only the napkins but all the crockery and cutlery were crafted from sugar. These sugar courses were not just sweet nothings; properly deployed, they flattered distinguished guests while simultaneously advertising the host’s wealth, munificence, and good taste, in all its connotations.

A 1587 drawing of a table with sugar figures for a marriage feast, by Theodor Graminaeus (The Getty Research Institute)

In the 18th century, sugar courses became less common but even more ornate, covering whole banquet tables in sugar sculptures depicting formal gardens populated by classical temples and statuary, or illustrating stories from literature or the theatre, staged on mirrored trays to reflect light and reveal hidden details. “All columns, cornices and fixtures, all statues and figures ... and everything visible are cast entirely from sugar,” Julius Bernhard von Rohr marveled in 1729. But sugar was no longer the luxury item it had once been; increasingly, even the middle classes sweetened their tea and coffee with it. Edible dessert table decorations—already virtually indigestible—were replaced by a more permanent doppelganger, biscuit porcelain pieces, which resembled sugar sculptures but cost more, and lasted longer.

Credenza of silverware in the Palazzo Vizzani in Bologna in 1693 (Getty Research Institute)

These images of ancien régime extravagance can’t help bringing to mind that most modern of party planning tools, Pinterest. Today, instead of royal birthdays and religious festivals, we stage-manage elaborate entertainment to celebrate big episodes of favorite TV shows, divorces, or baby-gender announcements; instead of larger-than-life sugar sculptures and fountains flowing with wine, we impress our friends with artisanal tailgating and ridiculously over-the-top children’s birthday parties. Having a few people over to watch the big game no longer means chips and dips but an Instagram-worthy spread of mini chicken-and-waffle sliders, Nutter Butter referees, and football-shaped pumpernickel sandwiches, the stitches piped on in mayo (but only at the last minute, lest they soak into the bread). When every meal is on camera, all food is porn.

The spirit of early modern Europe’s wealthiest lives on in the foodie culture of today. It’s there in the audacity of the edible helium balloon served by Chicago’s Alinea restaurant, in the $1000 Golden Opulence Sundae at New York’s Serendipity 3, in the $30 million, gem-encrusted cake made for a socialite on the show Cake Boss. And on a more relatable level, we may cringe at the idea of baking live blackbirds in a pie, but is it really any more offensive (or labor intensive) than a turducken? Whether on television or on the table, festive food is all about the big reveal.











Published on November 26, 2015 05:00

Why We Eat Together


Between us and our ancestors, who tore apart their half-raw, half-burnt meat with their teeth, or the women of Mesopotamia who ground flour to bake bread, food traditions have piled up and up. Food is no longer a matter of survival, nor purely power; it confers the status and identity with which we distinguish ourselves from others and at the same time gives us the sense of community we seek. Those who eat as we do have a connection with us; they are as we are.

“Dinner’s ready!” is the traditional cry with which Western mothers used to call their playing children indoors and grab the attention of their newspaper-reading husbands. “Dinner’s ready!” We’re about to eat, so drop what you’re doing. The call represented the most important moment of the day, a confirmation of family life, of the caring role of the mother and the authority of the father. So it went on for many generations, in many countries.

The table is a place of memory where we, whether because of the Proustian madeleine or not, become aware of who we are and with whom we are. Around the table, all previous meals come together in every meal, in an endless succession of memories and associations. The table is the place where the family gathers, the symbol of solidarity, or indeed the backdrop to family rows and childhood tragedies. At the table the eater is tamed. At the table we relive our youth through the recipes of the past, our hatred of endive or liver, teenage love through that first failed canard à l’orange, the sadness of the unarticulated apology, the tears of loneliness that mixed with the burnt cauliflower, the sensuality of fingers dipped in an airy sauce mousseline.

Eating around a table means both eating and talking, if only to say a few words of praise for what is presented to us. At the table we talk about what we’ve eaten before and what we’re going to eat and everything in between. If we say nothing at the table, then there is always the refusal of food as the final word. With that same sensitive organ, the mouth, we taste and consume, speak and kiss.

How food is experienced has everything to do with the decor, with the rituals surrounding the meal, with the company, and with the experience. Everyone knows the trap of the Vinho Verde: that famous Portuguese wine tastes so much better when drunk in a sun-drenched restaurant garden than at home with central heating and a view of a rainy street. Simple wooden tables and farmyard cutlery appeal to our emotions, just like damask tablecloths and crystal wine glasses. Food is drama, the table the stage, and the cook is the tamer and hero. People eat more if the food is presented festively even if the taste is no different—important to remember if you are trying to encourage an elderly person to eat.

“Dinner’s ready!” calls people to a specific place. The dining table may be round or long, of wood or plastic, covered with a luxuriant cloth and monogrammed table napkins or with reed placemats that have seen better days. Sometimes the table is bare and shows traces of previous meals. On the top of my French farmhouse table, 200 years old, are diagonal scratches from the breadknife of an unknown cook. “Dinner’s ready!” is always a call to a specific place where the meal is being served, whether or not it includes a table: the picnic in the grass on a blanket, or on the open hatchback of a car beside the highway, or the rickety folding table on the balcony. Even in “Le Déjeuner sur l’herbe,” that famous painting by Manet with a nude woman sitting on the grass, something that looks like a scrunched tablecloth lies next to the fruit. (Or could it be her dress, being used for the purpose?)

Food is drama, the table the stage, and the cook is the tamer and hero.

The words “Dinner’s ready!” denote a state of mind determined by the topography of the table. People sitting opposite each other inevitably pass dishes or pans, and are almost forced to look each other in the eye and to converse. People sitting next to each other look at a third person, or out of the window, or at a wall. Does that looking, or indeed not looking, make the exchange of confidences easier? Is the taste of the food influenced by the company and the seating arrangements? These dimensions are beautifully illustrated in Edward Hopper’s painting “Chop Suey,” in which two women sit opposite each other at an almost empty table in a Chinese restaurant in New York. The table is the center of a universe in which we seek our place, revolving like planets around the sun, drawn by the gravity of the regularity of eating and the longing for company.

At the table it’s all about receiving food, or at least the ritual of serving and eating. Every meal arises from a series of specified acts, even if only an improvised picnic or a chocolate cake consumed alone. Something is revealed from a dish, box, or picnic basket; steaming plates are brought in; pan lids are lifted; vinegar and oil are poured; there is stirring and slicing. Even where a lonely diner picks sweets out of a bag with bare fingers, a rudimentary ritual exists, a moment of pleasure, no matter how ambiguous or guilty. This symbolism of the meal applies to politics as well. The table is functional; formal dinners confirm the state of negotiations and at the same time demonstrate the power and opulence of those attending. Every meal, however simple, has a beginning and an end, marked by the unfolding of napkins and the deployment of cutlery, or by a prayer, a speech, or a toast, or a satisfied leaning back in the grass as the last glasses are emptied.

The human is the only animal species that surrounds its food with rituals and takes account of hunger among others who are not direct relatives. The table makes us human. Cooking is the basis for relationships. We distinguish ourselves from the animals not by our use of tools—the stick other primates use to extract honey from a honeycomb could with a bit of a stretch be called a “fork” or “spoon.” No, we distinguish ourselves by the fact that we eat at a table, or at least a specific place intended for a meal, such as a mat on the ground. We don’t eat as soon as we get our hands on food, to stifle hunger; we usually eat together, if less than we used to and at more flexible times. We generally wait—although again less than we used to—until everyone has food on his or her plate, and we don’t regard the meal as over until everyone has eaten enough. In urban families where older children remain at home and everyone goes their own way, people increasingly eat alone, or at any rate no longer with the whole family gathered at a specific time. The rhythm and communality of meals is declining in single-parent families too.

The table is the center of a universe in which we seek our place, revolving like planets around the sun.

The call “Dinner’s ready!” was until recently a sign that a particular time had been reached, a specific part of the day, in harmony with the rhythm of the seasons, determined by cultural preferences: Is the midday meal most important or the evening meal? The sandwich at a quarter past twelve or the Spanish el almuerzo (lunch) at 3 in the afternoon? The call to the table marks the moment in the day at which everything else must give way to communality: Toy bears are abandoned, school textbooks closed, computers put on standby, and work stopped, at least for a while. In Western Europe there used to be three meals every day, but elsewhere too the meals determine the hour, even where breakfast consists of nothing more than the cold leftovers from the previous evening and lunch is carried out to workers in the fields, even where little remains to serve as an evening meal.

How far back does the dining table go? For much of the hundreds of thousands of years of human evolution we did not sit at tables at all. The Roman emperors lay on beds beside low tables, the poor of the Middle Ages had little more than wooden troughs for their food, and in Africa and India people eat crouching down or in the lotus position on the ground. Estimates suggest that a quarter of the world’s population doesn’t eat at a table but around a mat, or standing in the mud of a market with a narrow plank for support. In poor countries, where mothers and children often eat separately from men, usually in or near the kitchen, the cry is not “Dinner’s ready!” but “Come and eat!”—just as the usual greeting in many countries is “Have you eaten yet?” Why should we assume that dining at a table marks a high point in our evolution? Humans are not simply what they eat but how and where they eat. And with whom; in 18th-century Dutch a good friend was called a “table friend.”

The dining table is disappearing. Fewer are being sold now in rich economies, apparently. This says a lot about the times we live in. The table is less and less the center of family life. We eat at the computer, standing in the kitchen, lounging on the sofa in front of the television, in the car, or walking along the street. Best of all we like to graze all day. If we do still eat at the table then it’s no longer a dining table but one where eating shares space with other things, such as a computer, a television, or newspapers. Sales of plates are declining too, and even more so serving dishes and cutlery designed for serving from them. More and more of the food we buy is ready to eat, in throwaway tubs or trays, or designed as finger food to be eaten with one hand and no cutlery. What’s the point of a table if we can devour a microwaved ready meal on our laps?

With the disappearance of the table as the center of existence, a new emotion is coming to the fore. The table exerted a certain discipline; now we feel conscience-stricken because we eat too much, while neglecting to cook and forgetting how. Is the table not becoming the place of sin, of guilt about our desire to eat, now that we no longer dare to enjoy food uninhibitedly? Increasingly we eat alone, and what solitary diner bothers to lay the table?

Something comparable is happening to the kitchen. Avant-garde kitchens are becoming living spaces with a built-in library and bar, the ultimate thus far being Berloni’s Not For Food kitchen in which a desk, sofa, and kitchen are fused into a single whole. Nostalgic farm kitchen or high-tech laboratory: the irony is that the more a kitchen is visible as a symbol of status and identity, the less it is used.

This article has been excerpted from Louise O. Fresco’s recently released book, Hamburgers in Paradise: The Stories Behind the Food We Eat.











Published on November 26, 2015 05:00
