Atlantic Monthly Contributors' Blog

November 24, 2016

Where Are All the Thanksgiving Pop-Culture Classics?

Back in 2012, 30 Rock made the case that Hollywood will drum up an ensemble film about any holiday. In one episode, a lovelorn Emma Stone declares in a trailer for the fake romantic comedy Martin Luther King Day, “In the words of Martin Luther King, I just gotta go for it.” The show was parodying a crop of movies from the late director Garry Marshall that attempted to imbue smaller American holidays with unearned sentimentality: Valentine’s Day in 2010 and New Year’s Eve in 2011 (this year brought...Mother’s Day). The excesses of the Holiday-Movie Industrial Complex are especially apparent when you consider how few classic films have been made about Thanksgiving. In fact, there isn’t much of a Thanksgiving pop-culture canon at all.



Think about it: What are the widely adored Thanksgiving equivalents of Miracle on 34th Street, It’s a Wonderful Life, “Jingle Bells,” How the Grinch Stole Christmas, or Rudolph the Red-Nosed Reindeer? Even Halloween has a bigger pop-culture footprint, with Hocus Pocus, “The Monster Mash,” Halloween, The Craft, and Halloweentown. (The Nightmare Before Christmas neatly bridges the two.) And yet Thanksgiving’s primary offerings include Planes, Trains and Automobiles, Pieces of April, A Charlie Brown Thanksgiving, and Arlo Guthrie’s folk song “Alice’s Restaurant”—all classics in their own right, but still a bizarrely light roster for America’s second-favorite holiday. Like Christmas, Thanksgiving is a day of indulgence, of relaxation, of communal comforts, of cold-weather cheer. So why the disparity?





Robert Thompson, a professor of pop-culture studies at Syracuse University, acknowledges that Thanksgiving has been edged out when it comes to distinct works of entertainment. That’s partly a matter of unfortunate timing. Thanksgiving doesn’t have a fixed date, and it has a fairly small window of opportunity, being sandwiched between two major days of celebration. It doesn’t help that both Christmas and Halloween tend to spill out of their respective months into November. Stores still have jack-o’-lanterns and discounted trick-or-treat candy on shelves after October, and Christmas decorations and music are everywhere before December 1.



But Thanksgiving is still very much a day of cultural rituals. Perhaps the best distillation of Thanksgiving diversion came just over 30 years ago, in 1985: During pregame coverage of the New York Jets game against the Detroit Lions, the sportscaster Ahmad Rashad proposed on the air to the Cosby Show actress Phylicia Ayers-Allen, who was at the Macy’s Thanksgiving Day Parade. That moment, witnessed live by roughly 40 million people, united the three elements that still form the backbone of Thanksgiving recreation today: sitcoms, football, and the Macy’s parade. All of which unfold, of course, on television.



“Thanksgiving now in America is almost completely television-centric,” Thompson told me, noting that the holiday is profoundly domestic. It’s about food and traveling, but most of all it’s about home, a fact borne out in Thanksgiving films like Home for the Holidays and What’s Cooking (neither of which is a household name). And, this being America, a domestic holiday can’t unfold without TV. In terms of Thanksgiving Day programming, most people tune in to one of three things: the Macy’s Thanksgiving Day Parade, a football game, or dog-related specials (NBC’s National Dog Show, Fox’s “Cause for Paws”). It’s worth noting that while these are annual traditions, the specifics of the events change from year to year—unlike, say, Elf, or Home Alone, or Love Actually, films that offer the same delights every Christmas.



Without many movies or characters or songs, it can be harder to get kids as excited about Thanksgiving as Christmas.

When it comes to narrative storytelling, sitcoms do the best job of capturing the essence of Thanksgiving. “Television comedy and Thanksgiving are a match made in heaven,” Thompson said. “It’s an interior holiday, and TV comedy is an interior art form.” There are countless roundups and rankings of the best Thanksgiving TV episodes, many of which come from a single show: Friends. In the popular imagination (and, often, in reality), Thanksgiving is a time associated with family dysfunction, miscommunication, personal squabbles—all elements that happen to make for really good episodic comedy. But because many shows go on winter hiatus in December, it’s common to see Christmas-themed episodes or TV specials air as soon as Thanksgiving is over, if not slightly before. It’s yet another example of Christmas infringing on the limited cultural—and thus emotional—space that belongs to Turkey Day.



But on a deeper level, Thanksgiving and Christmas have different kinds of mythological resonance, which may help explain why there are so many movies about the latter. Christmas has way more themed characters and story types at its disposal. For instance, Thompson said, Christmas tales are often “redemption stories,” “miracle stories,” or stories about the importance of believing—all reliable formulas for timeless, feel-good films. To add even more options, these stories can be either secular or religious. Plus, Christmas has an abundance of iconic, named characters: There’s Santa Claus, Mrs. Claus, the Krampus, Rudolph the Red-Nosed Reindeer, the Grinch, Frosty the Snowman, Ebenezer Scrooge, and Jack Frost. Meanwhile, Thanksgiving’s only real (and fraught) American mythology is the pilgrims and Native Americans, Thompson said. Its most noticeable figure is the turkey—and the poor bird doesn’t even have a specific name or backstory.



Without many movies or characters or songs, it can be harder to get kids as excited about Thanksgiving as about Christmas, Thompson noted, which means those kids turn into adults with less nostalgic attachment to the day. While Thompson doesn’t see Thanksgiving ever quite catching up with Christmas or Halloween on the pop-culture front, he said he believes directors and writers and musicians have plenty of reason to try anyway. From a storytelling standpoint, Thanksgiving is “fertile territory that hasn’t been over-plowed,” he explained. It helps that the holiday’s broader themes—family, harvest, togetherness, homecoming—are timeless in their own way.



Of course, all this isn’t to say that Thanksgiving is somehow a more meager festivity. In between the crazy, hyper-commercialized corridors of Halloween and Christmas, Thanksgiving is wonderful partly because of its looser, lower-key nature. Think of it this way: If every single Thanksgiving-centered film or song somehow vanished, the spirit of the holiday would survive unscathed. Especially if there’s tons of food and good company. And maybe a television playing somewhere in the background.



Can the Dakota Access Pipeline Protests Survive the Winter?

It is cold and growing colder in Cannon Ball, North Dakota, where protesters against the Dakota Access pipeline recently fought one of their most tense battles with the Morton County Sheriff’s Department. On Sunday night, 400 people marched toward a bridge that has become the frontline of this fight, where two burned-out trucks and a stretch of coiled barbed wire divided the Standing Rock Sioux Nation from deputies.



Protesters started fires on the road by the bridge. Deputies fired volleys of tear gas and rubber bullets into the crowd; as the temperature registered 20 degrees Fahrenheit, deputies blasted protesters with a water cannon mounted on an armored vehicle, sending about two dozen people to the hospital with hypothermia. Protesters also say a 21-year-old woman, Sophia Wilansky, was hit in the arm with a concussion grenade, and that the impact mangled her arm so badly it may need to be amputated. But the sheriff’s department told the Los Angeles Times the explosion likely came when protesters were “rigging up their own explosives.” It said protesters had thrown stones and logs at deputies, injuring at least one. The protest has become an “ongoing riot,” the department said, and it has asked border-patrol agents from the Grand Forks Sector to step in.



At issue is the 1,170-mile pipeline that would cross hundreds of waterways, wetlands, and private parcels, and span four states, funneling half a million barrels of crude oil every day between North Dakota and Illinois. In North Dakota, the pipeline is on private property that lies close to tribal land, although it needs permission from the Army Corps of Engineers to pass below the Missouri River. The Standing Rock Sioux Nation says not only that it was never consulted about the project, but that the pipeline imperils the tribe’s only water source. Standing Rock, supported by environmentalists and other Native American tribes, has challenged the project in court and has protested at the site since last spring. Energy Transfer Partners, the company behind the project, is standing firm. The Obama administration has unsuccessfully tried to find a solution.



The pipeline project had ramped up quickly; only two years ago it was still a concept. An early plan had originally sent it north of Bismarck, North Dakota, but the Army Corps rejected that route, in part, because it posed a risk to the town’s water supply. So Energy Transfer Partners altered the pipeline’s course to pass just below Bismarck and a half-mile north of the Standing Rock Sioux Nation, crossing the tribe’s major water supply, the Missouri River, on land that has been taken from the Sioux three times.



The land originally belonged to the Sioux, and the 1868 Treaty of Fort Laramie set aside this land for them all the way down to the Black Hills in South Dakota, as my colleague Robinson Meyer wrote this summer. The U.S. changed its mind after rumors of gold in the area spread. Then, in 1958, the U.S. took more land to build the Oahe Dam, below which the pipeline is set to cross.



The standoff between the protesters and the company has steadily worsened. In the summer, videos showed mounted protesters harassing construction security workers in an attempt to delay the pipeline’s progress. The protest is both an action against increased reliance on carbon-producing energy and a legal fight over the tribe’s right to be consulted.



Standing Rock says the Army Corps never included the tribe in the surveying process, that permitting was rushed, and that, because Energy Transfer Partners relied on old surveys of the land, the pipeline has plowed through sacred ancient sites. The tribe has filed several lawsuits, but most significantly it sued the Army Corps, saying that if crude leaked into the Missouri River it would poison the tribe’s water source. A federal judge in Washington, D.C., halted construction, but in September the court ruled that Energy Transfer Partners could proceed. Almost immediately, the U.S. Department of Justice and the Department of the Interior intervened and ordered construction to pause.



This is where the situation stands: The federal government has intervened, and Energy Transfer Partners has vowed to finish the project. North Dakota Governor Jack Dalrymple is pressuring the Army Corps to let the project continue. And because the change of season this year comes along with a change in the presidential administration, that seems likely to happen. President-elect Trump has promised to ease environmental restrictions and let energy companies have at America’s “treasure trove of untapped energy.”



Soon the snow will come to Standing Rock, as will Canadian winds that push temperatures below zero and carry freezing rain and sleet. The protesters have raised millions of dollars to fund their protest, and the tents and teepees they spent the summer in are giving way to bunkhouses built of two-by-fours and makeshift sheds with solar-panel roofs. They’ve been chopping wood each day, preparing for a long winter.



Gilmore Girls: A Millennial Story Come Full Circle

When it premiered this fall, the new CBS sitcom The Great Indoors came under fire for relying heavily on unimaginative jokes about Millennials: They’re obsessed with social media and political correctness, addicted to technology, sheltered, entitled, and lazy. But the series, which just received a full-season order, at least suggests that portrayals of Generation Y are prevalent enough in the public consciousness to justify a network show dedicated to making fun of them.



The pop-cultural footprint of Millennials is especially apparent in the broader TV landscape, which has seen a boom of stories focused on members of that age group over the past five years. At least a dozen current shows examine the generation’s varied experiences with humor, pathos, and self-awareness, including Master of None, Love, Atlanta, Girls, Crazy Ex-Girlfriend, You’re the Worst, Jane the Virgin, Younger, Insecure, and Broad City. As TV diversifies, and as Millennials—now aged 18 to 35, according to Pew Research Center—climb to higher positions in the industry, these shows are becoming increasingly nuanced and inclusive of different backgrounds. Collectively, they form an intriguing generational narrative that’s more meaningful than what The Great Indoors offers.





This week, joining their ranks is another show, one that partly owes its existence to Millennial nostalgia. The miniseries Gilmore Girls: A Year in the Life premieres on Netflix Friday after nine years of lingering fan investment and dissatisfaction with the show’s conclusion in its seventh and final season. The revival, helmed by the original showrunner and creator Amy Sherman-Palladino, will offer closure for many fans, while also acting as a throwback to one of the generation’s earliest portrayals on TV: The WB dramedy was one of the first character-driven series to trace the transitional experiences of a Millennial protagonist. It’s fitting, then, that the miniseries will have to reckon with the contemporary struggles facing the younger Gilmore girl, Rory (Alexis Bledel), as a single journalist searching for fulfillment in her early 30s. While it might seem regressive to revisit a character from a more homogenous time on TV, Gilmore Girls: A Year in the Life does have something fresh to deliver—the generation’s first full-circle story and, by extension, a case study for how a show can grow up with its audience.



When Gilmore Girls premiered in 2000, the audaciously clever show quickly proved it had little in common with the teen dramas that shared its target audience—Dawson’s Creek and 7th Heaven, and later One Tree Hill, The O.C., and Veronica Mars. Gilmore Girls’ portrayal of the 15-year-old Rory was instead more akin to My So-Called Life (five years prior) and Friday Night Lights (six years later), which stood out for their emotional realism and sophisticated perspective on relationships. Rory was more complicated than many of her onscreen peers. She was bookish and driven, a rare choice for a young female protagonist, but she was also at turns kind and selfish, independent and stunted, and almost always colored by the expectations of those around her.



Today, that description puts Rory in the company of the well-drawn stars of shows like Girls and Master of None that deliberately explore their characters’ flaws, often to make larger sociocultural points. (Behind some of these current programs are Millennials who were avid Gilmore Girls fans.) But Gilmore Girls had a bigger-picture focus: It was at its core a story about the intricacies of family relationships, told with fast-paced wit and through a feminist lens. In the pilot episode, Rory is accepted into the fictional, elite Chilton Preparatory School, forcing her free-spirited single mother Lorelai (the dynamic Lauren Graham) to reach out to her estranged parents for money. Rory’s grandparents agree on the condition of a weekly dinner, and so begins the storyline that drives the series’ rich interpersonal conflicts. The conceit is that Chilton will lead to Harvard, which will lead to a career in journalism, which will lead to a life of possibilities for Rory that Lorelai, who got pregnant at 16 and fled to the small town of Stars Hollow, never had.



Rory’s experiences mirrored what would become the challenges of her upper-middle-class fictional peers a decade later.

In other words, if TV’s modern archetypal Millennial story is about twenty- and thirty-somethings navigating an extended adulthood, Gilmore Girls was its prequel—a broader story about the deep familial history, baggage, and expectations that inform the generation’s coming of age. Gilmore Girls rarely looked at Rory’s life in isolation: Though her storyline occasionally went in its own direction, it was never long before she returned to Stars Hollow for comfort, sought support from her mother, or was roped into her grandparents’ hijinks.



Despite its whimsical hyper-reality, Gilmore Girls was grounded in the idea that its characters were intrinsically and emotionally linked; it emphasized, vividly, how Rory’s decisions affected not just her own immediate future but also those closest to her. When Rory crumbles under the criticism of a newspaper publisher, steals a yacht, and temporarily drops out of Yale, the most profound consequences are the ones that alter her family’s dynamics. (A brilliant, Woody Allen-inspired dinner scene in the season-six episode “Friday Night’s Alright for Fighting” brings this conflict to a head and could easily serve as a thesis statement for the series.) Gilmore Girls’ closest relative on TV at the moment, then, may be the CW’s Jane the Virgin, another three-generational story about smart, complex women and the ways they mold each other.



Today, shows like You’re the Worst are more solipsistic—their narrower focus on their protagonists means they are also particularly masterful at tracing their characters’ internal conflicts. In the original series, Sherman-Palladino largely reserved such psychological deep-dives for Lorelai, the show’s emotional center. (Meanwhile, the most interesting insight viewers had into Rory’s eventual decision to return to Yale, for example, was that it was prompted by a conversation with an ex-boyfriend.) To be sure, Rory’s experiences mirrored, or even foreshadowed, what would become the defining challenges of her upper-middle-class fictional peers a decade later, from handling the privilege of choice to grappling with a false sense of entitlement. But for all its progressiveness about politics, class, and feminism, Gilmore Girls showed little, if any, sensitivity to issues of race, the LGBT community, and sex-positivity—subjects that have been explored on most shows centered on Gen-Y characters today.



Which is all to say that Sherman-Palladino’s depiction of Rory in Gilmore Girls: A Year in the Life will be fascinating to see. When news of the revival broke last fall, The New York Times expressed concern that “it will be a different thing, no matter how much of the original talent returns, because there’s one thing even the best-funded, best-intentioned reboot can’t restore: lost time.” While that’s true, the rare gift of Gilmore Girls is that, like Graham’s recent show Parenthood, its stakes are tied not to the pursuit of success or power or survival so common in prestige television, but to character growth and emotional resolution. That time lost between 2007 and 2016 is then but a part of the characters’ evolution, a layer of Sherman-Palladino’s larger story about the Gilmore family that, in a way, never really ends. That the revival will reflect the death of the actor Edward Herrmann, who played the family patriarch Richard Gilmore, is a poignant testament to this.



Rory’s arc will link her generation’s foundation with its emergence into adulthood in an unprecedented way.

So, viewers won’t get to see how Rory navigated the rest of her 20s after Yale, or how she fared on that fortuitous first job covering Barack Obama on the campaign trail. They won’t get to see the ways in which her relationship with Lorelai inevitably shifted as Rory built a life outside Connecticut. But it seems poetic for Gilmore Girls: A Year in the Life to revisit Rory at 32: the same age Lorelai was when the show began, and an age at which career choices carry a certain gravitas. And it is, importantly, an age when more and more young women are coming up against “late-breaking sexism,” as they simultaneously face gendered expectations about families and limitations in their careers. It would make for a remarkable TV arc if the show linked Rory’s adolescent dreams of success to the modern pressures of being a working woman in her 30s.



At least, it would be gratifying to see the places where Rory’s professional and personal fulfillment have come into conflict, a theme that’s been handled with care and humor on newer shows about the growing pains of twenty- and thirty-somethings. Girls followed the aspiring writer Hannah on a self-destructive stint at the Iowa Writers’ Workshop, while Jane the Virgin’s Jane is learning to balance unexpected motherhood with her dream of becoming a romance novelist. With the creative flexibility afforded by Netflix, Sherman-Palladino has an opportunity to thoughtfully test Rory’s notion of happiness, one that was influenced heavily in the series by her mother and grandparents.



As for those three returning ex-boyfriends, Sherman-Palladino has danced around their relevance to Rory’s arc: “It’s just such a small part of who Rory is,” she recently told Time. “Rory didn’t spend her days thinking, ‘Who am I going to end up with?’ Rory was much more concerned about ‘How do I get that interview at The New York Times?’” Her comments were made in reference to the incessant, often frustrating, public debate over Rory’s love life. Indeed, Kevin Porter, the 27-year-old co-host of the popular Gilmore Guys podcast, tells me it is the most frequent topic raised by listeners. But it’s notable that the same podcast (which corralled the show’s fan base in 2014 and has since featured cast members and writers) has prompted critical discussions about Rory’s merits as a journalist, her inability to recognize privilege, and the various ways her boyfriends have affected the show’s titular relationship. Sherman-Palladino’s greatest challenge may be to match the nuanced perspective with which Millennials themselves have come to dissect their generation’s experiences, romantic and otherwise.



Gilmore Girls: A Year in the Life comes at a time when TV has no shortage of compelling stories about a demographic cohort that will continue to be praised, mocked, and analyzed for years to come. But the return of Rory Gilmore—a textured, early-aughts character who mostly preceded the scrutiny of her generation—will be a fascinating contribution to this developing narrative. Her arc will link her generation’s foundation with its emergence into adulthood in an unprecedented way. In doing so, A Year in the Life could help make the case for seeing other Millennial stories through, from their awkward beginnings to their, hopefully, more enlightened ends.



How the Dallas Cowboys Prop Up the NFL

Two Sundays ago, the Dallas Cowboys beat the Pittsburgh Steelers in a game widely considered the best of this NFL season. The lead changed hands a whopping seven times, with Dallas’s duo of star rookies—the quarterback Dak Prescott and the running back Ezekiel Elliott—trading big plays with a Steelers offense powered by the veteran Ben Roethlisberger. The final eight minutes featured two touchdowns from each team, the last of them a game-winning 32-yard dash from Elliott with nine seconds left.





For the Cowboys, it was a signature win (giving them a league-best 10-1 record that would improve to 11-1 after the next week’s matchup with the Baltimore Ravens), but also a characteristic one. The resurgence of one of the most storied sports franchises has been a key plot point of the 2016 season. Known over the past couple of decades as an opulent but often self-defeating outfit, Dallas has refashioned itself around its two youngsters and into a sturdy force. These Cowboys appeal not only to their legions of boisterous fans, but also to the most discerning football aficionados. Elliott gains yards by the dozens behind a thick-shouldered but mobile offensive line; Prescott displays the calm of someone who has commanded huddles for a decade. Dallas blows out the teams it should and manufactures close wins against its near-equals.



Thursday afternoon, the Cowboys will host their annual Thanksgiving Day game, this one against Washington. In many ways, it is a moment the NFL has been waiting for: an iconic team, freshly ascendant, playing on the holiday synonymous with football. It features the blend that has made the sport America’s most watched and most profitable: tradition and novelty, old customs shined up by new superstars. This Thanksgiving, though, partway through one of the more trying seasons in recent memory, the league will look to the Cowboys not to celebrate its ongoing standing but rather to revive it.



NFL ratings have dropped. For the league, the decline in viewership is as much an ideological problem as a fiscal one. Football is the sport of a certain strain of the American dream, one that operates on a corporate rather than an individual scale, in which success compounds and growth begets more growth, perpetually. It is a game whose championships announce their halftime performers months beforehand, in special news releases, and whose teams charge more for parking than teams in other sports do for the actual tickets. The NFL has long since passed the point where its hugeness might be thought of as a simple result (more people want to watch us play, so we need bigger stadiums). The size, the sheer scope of the country’s appetite for this product, has become a core component of it. The sense that everyone is watching the same game on a Sunday afternoon is an undeniable part of that game’s appeal.



That sense is weakening, and various theories have been offered as to why. Some blame the drawn-out and attention-hoarding presidential election, others the less regimented television habits of a new generation. Critics of the league’s extensive faults—ranging from its ongoing concussion crisis to its mishandling of domestic-violence incidents—suggest that the public may no longer have the stomach for it. The NFL itself attributes the ratings slide partly to bad luck, a run of primetime matchups that looked good on the preseason schedule but have since turned out to be uninspiring. “There are a lot of factors to be considered,” the commissioner Roger Goodell said in October. “We don’t make excuses. We try to figure out what’s changing.”



Perhaps the most worrisome hypothesis, at least to NFL officials, is that the quality of play simply isn’t up to par. Thursday Night Football, a series introduced before the 2012 season, routinely features sloppy games played by two poorly rested teams, but even the usual Sunday and Monday contests have increasingly been lacking. To a greater degree than ever, teams rely on younger and more affordable players, who make mistakes their more seasoned counterparts might not. Players of all ages are also subject to stricter injury protocols, which may help their long-term health but which dilute the talent on the field; the linebacker who may have played through “having his bell rung” five years ago now visits the independent neurologist on the sideline while his backup takes his place. The NFL’s failsafe has always been the expertise of its teams, but more and more, that trait is being chipped away.



If football can still be beautiful, its ugliness waits just outside the lines.

In this context, the Cowboys are an aberration, a reminder of what the league used to produce on a regular basis. They combine competence—expert blocking, tidy routes—with extreme star power, so that they embody football’s X-and-O fundamentals and its physics-bending possibilities simultaneously. Prescott throws a quick, accurate pass, and the locomotive-like receiver Dez Bryant rips it out of the air. The offensive line opens up a seam, the kind that might give an average runner five yards or so, and Elliott burns through it, lowers his shoulder, and rumbles for twenty. They celebrate their sport as much as play it, showing how those rudiments, repeated often and well enough, can turn into spectacle. The Cowboys owner Jerry Jones, who has presided over three Dallas Super Bowl victories, coos over this year’s version, saying, “This is a rare team.”



It is the kind of show that lets its audience overlook the behind-the-scenes trouble, at least for a few hours. Even in its present form, though, Dallas isn’t immune to the problems that plague the league. Prescott only found the field in the first place because the former starter Tony Romo suffered the latest in a string of grisly back injuries, each one a reminder of the harm this game does to its participants. Elliott is the subject of an ongoing league investigation centering on his possible physical abuse of a woman. If football can still be beautiful, at least when certain teams are playing it, its ugliness waits just outside the lines.



The NFL hopes that the former outweighs the latter in its audience’s mind, but the swooning television metrics, and the endless supply of plausible reasons for them, suggest this might not be the case. Each week brings another gruesome injury or heinous accusation, or simply another slate of mediocre games. The Cowboys will draw plenty of viewers this Thanksgiving afternoon and will almost certainly play well enough to reward those who tune in. They are an exception, though. Every team, and every game, can serve as a reminder of the shortcomings of America’s most popular (for now) sport. Only a few are good enough to make the fans forget.



November 23, 2016

How American Cuisine Became a Melting Pot

The first celebrity chef in America was an Indian immigrant named Prince Ranji Smile. Described by an excitable reporter in The New York Letter as having “clear dark skin, brilliant black eyes, smooth black hair, and the whitest of teeth,” Smile was poached from London by the New York restaurateur Louis Sherry to work in his eponymous Fifth Avenue establishment. Smile’s complex curries enthralled the city, and by 1907 he was touring the nation, performing cooking demonstrations at department stores and food halls. Fans, particularly women, flocked to him. But in the 1920s he left the U.S. after a Supreme Court ruling denied citizenship to Indian natives on the grounds that they weren’t white. No further records of his life remain.



Smile’s biography is revealed in Eight Flavors: The Untold Story of American Cuisine, a new book by Sarah Lohman that unpacks the diverse history of a nation’s palate via eight distinct ingredients. Through chapters focusing on black pepper, vanilla, curry powder, chili powder, soy sauce, garlic, MSG, and sriracha, Lohman reveals how a nation founded by immigrants built its national cuisine on tastes from all over the world, and how those tastes continue to evolve. But almost more fascinating than the countless odd facts Lohman reveals—the Vanilloideae orchid is native to four continents, which suggests it was around before those continents divided; ketchup has its origins in an early recipe for soy sauce—are the people whose work had a profound impact on the way Americans eat, but whose biographies have been almost completely forgotten. In that sense, American food, which Lohman describes as “the most complex and diverse cuisine on the planet,” offers a unique and surprising view of American history.





Lohman organizes her chapters chronologically, starting with black pepper—hugely popular in the 18th century—and ending with sriracha, whose literal and metaphorical hotness as a condiment was enshrined when Bon Appetit named it the Ingredient of the Year in 2010. Early on, she establishes her argument that food is much more than nourishment: It’s an intrinsic part of human culture. “The physiological signals of flavor are interpreted in our brain’s frontal lobe,” she writes, “the part of the brain where emotional reactions are processed and personality is formed. Personal experience, our memories, and our emotions all inform the experience.” No Thanksgiving dish is an island; each one carries its own weight of memory and emotional connection before we so much as take a single bite.



That said, as much as we inherit our sense of taste from our parents and grandparents (Lohman points out that a liking for garlic, for instance, is passed from mothers to babies in the womb), American cuisine is an ever-evolving thing. It shifts and expands rapidly alongside changing patterns of immigration, culture, and even politics. Consider black pepper, which was so commonplace in the colonies in 1750 that 50 different recipes in Martha Washington’s wedding gift, A Booke of Cookery, featured it as an ingredient. After the Revolutionary War, it became impossibly scarce, because Americans had imported it by way of Britain, which never revealed where the spice came from. But in 1790, an American captain from Salem arrived in Sumatra, where he learned that Piper nigrum grew on the northwestern coast of the island. He convinced a merchant to send an expedition to source the spice, and the boat returned 18 months later with more than 100,000 pounds of pepper “shoveled right into her hold like gravel.”



After that, black pepper became ubiquitous on American tables, especially once pre-ground pepper did away with the need to grind it by hand. Then 1993 brought the launch of the Food Network, where viewers watched chefs finish dishes with freshly ground black pepper, giving the spice yet another boost. In the two decades since then, Lohman writes, “black pepper consumption has increased by 40 percent. And in the 21st century, we’re buying it whole and grinding it fresh, just like Martha Washington.” Soy sauce has similarly ebbed and flowed in popularity, gaining favor in the 18th century as a British import, then largely disappearing until the Gold Rush in 1848, when Chinese immigrants arrived on the West Coast. It received its biggest boost in 1972, when Kikkoman launched the first Japanese manufacturing plant in the U.S., appealing to American soldiers who’d fought overseas in World War Two and gained a taste for the cuisine.



Sriracha, made in California as a Thai-style sauce by a Vietnamese refugee, has an origin story that’s “more American than apple pie.”

War, Lohman points out, “is a great propagator for new culinary movements.” Mexican cooking was first introduced to American palates in the early 19th century, when soldiers invaded what’s now Texas. Garlic owes its rise in popularity in the U.S. to the First World War, after which American intellectuals flocked to Paris and French cooking became the newest trend (it also helped that James Beard was stationed in Marseille in 1945).



But the key factor that’s defined American cuisine throughout the years is undoubtedly immigration. Chili powder, invented by a German American in 1897 to facilitate making Mexican food in the U.S., is one example of what Lohman describes as the “patchwork quilt” of American food culture. Sriracha, made in Southern California as a Thai-style sauce by a Vietnamese refugee, has an origin story that’s “more American than apple pie.” Fear of immigrants, she argues, is also an age-old American tradition, leading to a decades-long stigma against garlic, which represented Italian immigrants’ supposed refusal to assimilate, and absurd myths that the Chinese eat rats, or that MSG causes headaches (rather than being a chemical additive, Lohman points out, it’s a substance that naturally occurs in everything from tomatoes to cheese).



So it seems appropriate that Lohman dedicates significant portions of her book to the people who helped define American cuisine even while facing discrimination and disdain. There are the “Chili Queens,” impossibly glamorous Mexican women who supported their families by selling chili con carne in the Alamo Plaza in the late 19th century until they were shut down by concerns about sanitation. There’s Edmond Albius, a 12-year-old slave and amateur botanist on Ile de Bourbon who changed the flavor of much of the world’s baked goods when he discovered a way to make vanilla plants pollinate. And there’s William Gebhardt, who found a way to manufacture chili powder as a shortcut for home cooks.



There’s also Prince Ranji Smile, whose celebrity and popular appeal couldn’t save him from clashing with the U.S.’s strict labor and immigration laws. In 1922, a profile of Smile ran in the New York Hotel Review, in which the author noted Smile’s contributions to the diverse character of American cuisine. “America has given no attention to the development of a school of cookery of its own,” Mary Pickett wrote, “but it has imported its cooks from all parts of the world, and when the American culinary school is finally developed it will have embodied in it the good points of the culinary art of the world.” Eight Flavors, a richly researched, intriguing, and elegantly written book, is a testament to how accurate Pickett’s prediction was, and how much American food owes to the people who helped a nation make other traditions its own.



‘Alice’s Restaurant,’ an Undying Thanksgiving Protest Song

In the small canon of Thanksgiving-related popular music, Arlo Guthrie’s “Alice’s Restaurant Massacree” stands out for a few reasons, one of which is that it’s only barely related to Thanksgiving. The other reasons include its 16-minute runtime, and that it’s politically minded art of the sort worth revisiting this particular holiday season.



Guthrie released the story-song in October 1967 as the leadoff to his album Alice’s Restaurant, and it made such an impact that it was turned into a film, some radio stations still play it each Thanksgiving, and Guthrie now has a tradition of performing it once a decade. For last year’s fiftieth anniversary of the real-life incident that inspired the song, Guthrie told Rolling Stone, “I never expected it to even be on a record, let alone get airplay, let alone have it made into a movie. I mean, that was all like a whirlwind of events that were way beyond my control.”





The lyrics’ synopsis: On Thanksgiving 1965, Guthrie and some pals went to throw out garbage from the church where the titular Alice lived, but the dump was closed for the holiday. So he instead tossed the waste at an unsanctioned site, was caught and arrested, and his littering conviction later prevented him from being drafted to fight in the Vietnam War. In the song he tells this story with heapings of humor and twang, and it culminates in him advising would-be draft-dodgers to go into their draft office and sing the chorus of the tune to show themselves unfit for service.



The song has been portrayed as anti-war, a Baby Boomer tale of resistance. But if you actually listen, it’s not quite that. The Guthrie of the song tells the draft psychologist, “Shrink, I want to kill. I mean, I wanna, I wanna kill. Kill. I wanna, I wanna see, I wanna see blood and gore and guts and veins in my teeth. Eat dead, burnt bodies. I mean kill, Kill, KILL, KILL.” This just makes him more likely to be enlisted. The twist of the thing comes when it’s a littering offense, not his supposed sadistic tendencies, that keeps him out of the military—a sign of screwed-up governmental priorities.





Guthrie’s line over the years has been that it’s not so much an anti-war song—though it did endorse resisting the draft—as an anti-stupidity song or, as he told NPR, one that’s “celebrating idiocy.” “Thank God, that the people that run this world are not smart enough to keep running it forever,” he said in the same interview. “You know, everybody gets a handle on it for a little while. They get their 15 minutes of fame, but then, inevitably, they disappear and we have a few brief years of just hanging out and being ourselves.” Comforting words to about half the country right now, no doubt.



Listening to the song today is an easy way to feel beamed back to the mood of the late ’60s counterculture—and to be reminded of its goofy streak. After all, Guthrie’s 16-minute lackadaisical telling of government illogic likely caught on less because of the political sentiment it contained than because of its sheer entertainment value. Today’s socially conscious artists for whom this election may feel like a rejection, who are grappling with the reality that half the country has been either tuning out their efforts or actively rejecting them, might take heed when reflecting on how to reach new constituencies.



Guthrie, in the same NPR interview, was asked what his dad—who died in ’67, before the song’s release—might think of “Alice’s Restaurant Massacree.” Woody Guthrie, of course, is now remembered as perhaps one of the most iconic examples of a protest artist, and his “This machine kills fascists” guitar sticker has been invoked plenty on social media in the wake of the election. “I can imagine the smile on his face is all I can say,” Arlo said, “because I know he would have enjoyed at least the sense of humor.”



Gilmore Girls: A Year in the Life Is a Rare TV Revival That Works

The nostalgic TV revival is a genre still in search of a purpose: Too often, shows like Arrested Development, Full House, and The X-Files have returned for no reason other than to gin up easy viewership, to appeal to those seeking to remember better days. Gilmore Girls: A Year in the Life is the first such TV sequel that really uses the long-delayed circumstances of its existence to its advantage. There’s a smart self-awareness to the show (available on Netflix starting Friday) that goes beyond sly winks to the camera about how long it’s been since its titular mother-daughter team has appeared onscreen. Considering it could merely exist as a cheap cash-in, that A Year in the Life feels so emotionally resonant is somewhat miraculous.



Perhaps it isn’t too surprising, though, given that A Year in the Life represents a redemptive moment for the show. Gilmore Girls saw its original run end rather abruptly in 2007, after an unsatisfying seventh season that aired without its major creative voice, Amy Sherman-Palladino. She created Gilmore Girls but wasn’t able to end it, due to botched negotiations with its network; A Year in the Life was her chance to reclaim her show and finish it on her own terms. Happily, it’s more than that. Over the course of four feature-length episodes (each about 90 minutes long), the show renews the witty spirit that has helped it endure since it went off the air. But the series is also unafraid to grapple with how much time has passed—and the inertia that needed to be overcome to recapture the magic.





A Year in the Life is set nine years after the last episode of Gilmore Girls, but its setting, the fictional Connecticut town of Stars Hollow, has always existed in a quirky bubble. The original show, which mostly aired during the George W. Bush administration, served as a sort of fuzzy security blanket, one only tenuously in touch with the scarier real world around it. The stakes in A Year in the Life remain personal, but in the intervening years, the world has grown a little larger around the mother-daughter unit Lorelai and Rory Gilmore, and, as Sherman-Palladino cleverly shows, that world is a little tougher to get ahead in.



During the show’s original run, Rory (Alexis Bledel) was something of a child prodigy, a voracious reader with an intellect beyond her teenage years. She was blessed with the gumption of her mother Lorelai (Lauren Graham) but without quite the rebellious spirit that led to Lorelai getting pregnant at the age of 16. With the help of Lorelai, the friendly townspeople of Stars Hollow, and her blue-blooded grandparents Richard (Edward Herrmann) and Emily (Kelly Bishop), Rory thrived. She went to Yale and embarked on a promising journalistic career at the end of the show, getting ready to follow the presidential candidate Barack Obama on the campaign trail for an up-and-coming blog.



What a difference nine years makes. Rory is now 32, and her journalistic career still seems on the verge of true success; she’s had freelance pieces in The New Yorker, Slate, and this very publication, and she spends most of her time in London. But there’s a sense that she’s running to stand still—that her life, both personal and professional, has been trapped in amber for nine years. From a plot perspective, her stasis is the only way to plausibly have her hanging out in Stars Hollow all the time (and revisiting all of her former flames). But it seems Sherman-Palladino wants to complicate the show’s return for viewers—A Year in the Life is still a bit of comforting escapism, but one where having things stay the same comes at a cost.



If you grew up fond of the show, it’s hard to imagine that the new season will miss a beat.

That self-awareness helps the first episode feel natural, even as it’s clear the cast is shaking off the cobwebs. There are old rhythms to slip back into, and plenty of exposition that needs to be delivered as organically as possible. Every member of the show’s cast, regular and recurring, pops back up for at least a scene or two, and Stars Hollow remains essentially the same as it ever was, even if everyone’s added a few wrinkles in the intervening decade. Lorelai is still running her Dragonfly Inn, is still living in the same house, and has settled down with her long-time love interest Luke (Scott Patterson). She, like her daughter, seems plagued with an indefinable malaise, perhaps possessed of the thought that after nine years things should look more dramatically different.



There is one major change, of course—the loss of Richard, who was always a comic and dramatic standout. (Herrmann died of brain cancer in 2014.) Rather than glossing over such a massive loss for the show, Sherman-Palladino makes Richard’s death the narrative spine of the series, with Lorelai, Rory, and Emily each struggling to cope in her own way. More broadly, the long runtimes of these new episodes (each named after a season) make it easier for the show to unfold without much attention to plot. There’s room for silly digressions, for long town meetings featuring the hilariously tyrannical Taylor Doose (Michael Winters), even for set-pieces that bring in actors from other shows (Graham gets to meet a few of her Parenthood co-stars, and there are cameos from several stars of Sherman-Palladino’s TV show Bunheads).



The narrative looseness is forgivable; after all, this is Gilmore Girls, which was always more interested in rat-a-tat witty dialogue than in detailed story arcs. If you grew up fond of the show, or recently discovered it in Netflix’s archives, it’s hard to imagine that the new season will miss a beat. A Year in the Life won’t necessarily convert new viewers—like any revival, it’s making a play for a loyal fanbase, which should be more than enough to justify Netflix’s investment in the show. But as a salvage attempt after Gilmore Girls’ original bittersweet ending, it feels wholly justified.



Apolitical Arguments for the Thanksgiving Table

Arguments are time-honored guests at the American Thanksgiving table. But when fights get explicitly political, things very often don’t end well—this year, in particular.



Here, then, are some pleasantly apolitical alternatives: arguments (presented here, for controversy’s sake, as incontrovertible declarations of truth) that will help to ensure Thanksgiving discussions that come with maximal amounts of liveliness—and minimal amounts of existential despair.




Pie is better than cake.



Pecan pie is better than pumpkin.



Apple pie is better than pecan.



Pie is only acceptable when it is served à la mode or with whipped cream.



Cool Whip does not count as whipped cream.



Canned cranberry sauce—the blobby, gelatinous kind that magically keeps the form of the can—is much, much better than fresh.



100 duck-sized horses, obviously.



It would be better to have an unlimited budget to travel the world than an unlimited budget to build a dream house.  



It should be as acceptable to order a martini at a weekend midday meal as it is to order a Bloody Mary.



“Brunch” is a terrible word.



“Moist” is even worse.



Red wine is better than white.



Beer is better than wine.



Whiskey is better than beer.



“Read receipt” is pronounced “REED reh-seet.”



“Gif” is pronounced “jif.”



“Gyro” is pronounced “YEE-roh.”



Macaroni and cheese should have a privileged place upon the Thanksgiving table.



Macaroni and cheese is only good when it includes bread crumbs.



Plain cheese pizza is the best pizza.



Cold pizza is not actually pizza.



Cold pizza is wonderful as its own foodstuff.



Tom Brady totally knew.



Fall is better than winter.



Winter is better than summer.



Spring is better than absolutely everything.



Wealth is more important than fame.



Love is more important than wealth.



Hot dogs are best when consumed with ketchup, mustard, and nothing else.



Ketchup is better than mustard.



Sriracha is better than ketchup.



Dogs are better than cats.



Mannequinning is better than planking.



Frappuccinos are better than coffee.



Mint chip is better than chocolate chip.



Milk chocolate is better than dark.



Hot dogs are sandwiches. (No, they’re not.)



Quesadillas are sandwiches.



Burritos are wraps.



Wraps are nonsense.



Bond is better than Bourne.



Star Wars is better than Star Trek.



Crushed ice is better than cubed.



Sausage is better than bacon.



Link sausage is better than patty.



Fried eggs are better than scrambled.



Poached eggs are better than fried.



Hash browns are better than home fries.



French fries are better than hash browns.



“Steak fries” are just lazy.



Thanksgiving is the best holiday.



Actually, Arbor Day is the best holiday.



There is no place for the word “actually” in contemporary civic discourse.



Ghosting is better than a drawn-out goodbye.



Moana Is a Big, Beautiful Disney Smash

“If you wear a dress and have an animal sidekick, you’re a princess.” Thus does a Polynesian demigod chastise the daughter of a Pacific island chieftain who has maintained that she is nothing of the kind.



Of course, for all intents and purposes, he is right and she is wrong. “Chieftain’s daughter” is merely “princess” by another name. But with this cunning wink, Disney’s Moana inoculates itself against the charge that it is yet another of the studio’s unwoke princess movies.





Better still are the substantive upgrades: The 16-year-old titular heroine is proportioned like an actual adolescent female, rather than a saucer-eyed, wasp-waisted Barbie. And you can scan the ocean horizon in every direction without spotting anything that remotely resembles a love interest.



Such political advances, however, are secondary to the sheer virtuosity of Moana. The movie is an absolute delight, a lush, exuberant quest fable full of big musical numbers and featuring perhaps the most stunning visuals of any Disney film to date.



As the story opens, the chieftain’s daughter, Moana (played by young Hawaiian actress Auli‘i Cravalho), is perpetually vexed that her father (Temuera Morrison) will not allow her to venture beyond the reef encircling their island home of Motunui. But the island and the ocean around it are slowly dying, because long ago a capricious demigod named Maui (Dwayne Johnson) stole—and subsequently lost—the precious-stone “heart” of the fertile goddess Te Fiti. When the sea itself entrusts that heart to young Moana, she knows that she must set sail beyond the reef, find Maui, and with his help restore Te Fiti’s heart.



Though the narrative is linear, there are inevitably perils to be met: a horde of pirate raiders that seems to have snuck in from Mad Max: Fury Road, except for the fact that they are all…no, I won’t spoil it; a treasure-hoarding monster crab (voiced by Jemaine Clement), who puts Smaug to shame; and the smoldering lava spirit Te Ka, who also has designs on the Heart of Te Fiti.



Johnson’s charming, witty vocal performance is perhaps Moana’s greatest pleasure.

But the principal obstacle for Moana to overcome is her demigod partner in adventure, Maui. Vain, selfish, and utterly uncommitted to her mission, he is also in the midst of a crisis of confidence, having lost his magical fishhook and with it most of his demigodliness. Once a shapeshifter of uncanny ability, he’s now hard-pressed to turn himself into anything more impressive than a half-shark—a transformation that is precisely as useful as it sounds.



Indeed, even as Moana—the princess who defies her father to venture across the sea—cannot help but recall Ariel of The Little Mermaid, the problematic and polymorphous Maui bears a distinct resemblance to Aladdin’s genie. Nor does this intra-Disney cross-pollination seem entirely accidental: Back in the day, Moana directors Ron Clements and John Musker were also responsible for both Mermaid and Aladdin. And like Robin Williams’s show-stealing turn in the latter picture, Johnson’s charming, witty vocal performance here is perhaps Moana’s greatest pleasure. There is a particularly delicious irony in the fact that an actor who first arrived onscreen thanks largely to his physique (Johnson was formerly the professional wrestler known as The Rock) has now done the best work of his career without the use of it.



Disney Animation is currently in the midst of one of its periodic streaks of greatness—the first since the Mermaid-Aladdin-Lion King run in which Clements and Musker played such a central role twenty-odd years ago. But what is notable this time around is the sheer variety of the studio’s offerings, from the high-concept premise of Wreck-It Ralph to the classic virtues of Frozen to the Asian-inflected tenderness of Big Hero 6 to the ingenious mammalian noir of Zootopia.



Moana definitely resides at the Frozen end of this cinematic spectrum, a conventional story featuring just enough innovation to feel current and relying principally on its dazzling execution. The musical numbers by Opetaia Foa’i, Mark Mancina, and (yes) Lin-Manuel Miranda may not be quite Hamiltonian, but they will soon be on your children’s lips and perhaps your own: the ensemble introduction “Where You Are,” Clement’s hilarious “Shiny,” and the anthemic “How Far I’ll Go,” which, for better and worse, may rival Frozen’s “Let It Go” in sheer catchiness. These serve as accompaniments to the film’s flat-out gorgeous CGI cinematography—lush greens, sunlit golds, and a deep blue sea that doubles as a principal supporting character.



Is Moana a princess movie? Sure it is. But it’s a great one.



What's Really Going On in North Carolina's Gubernatorial Race?

DURHAM, N.C.—It’s been two weeks since Election Day, which means that in 49 states, residents know who their governor will be next year. North Carolinians, however, are still waiting to see.



At the end of balloting on November 8, Democrat Roy Cooper, the current state attorney general, had a lead of a few thousand votes over Governor Pat McCrory, a Republican. Cooper promptly declared victory, but McCrory has not conceded the race, and the state has not been able to declare a winner. On Monday, Cooper announced a transition team, while McCrory accused him of “circumventing the electoral process.” Cooper’s lead fluctuated throughout the day on Tuesday as more counties returned their final canvass, but as of this writing it was a little more than 6,000 votes out of roughly 4,700,000 cast.



It’s an unusually acrimonious race, but it follows an unusually acrimonious few years in North Carolina politics. The mention of an obscure state law that could theoretically allow the Republican-dominated state legislature to declare a winner in the race has set off something of a frenzy at the national level. It’s unclear how likely it really is that the General Assembly would invoke the law and step in, though doing so would represent a breathtaking attempt to seize power.



The stakes are high in part because North Carolina has been through a series of significant political shifts over the last few years. After decades in which Democrats mostly dominated state politics, the GOP won control of the General Assembly in 2010 and then of the governor’s mansion two years later, with McCrory’s election. Republicans embarked on an aggressive program of conservative reforms. The most nationally noticed of those was HB 2, the so-called bathroom bill that, among other things, required transgender people to use bathrooms corresponding to the sex on their birth certificates. McCrory’s deficit appears to be partly the result of backlash to that law and to its economic repercussions, as businesses canceled expansions, entertainers boycotted concerts, and sports tournaments abandoned the state.



But the battle over the results of the election is more directly tied to—and really ought to be seen as simply the latest battle in—a long war over voting rights. In 2013, as soon as the Supreme Court struck down a key provision of the Voting Rights Act, North Carolina Republicans passed a new law regulating voting in the state. Among other things, the law required voters to show a photo ID when voting, ended same-day registration, and shortened the early-voting period. The North Carolina law was described by some analysts as the most sweeping in the nation, but it fits with a wave of such laws that have been passed or proposed around the country, mostly by conservative politicians who argue they’re necessary to safeguard the sanctity of elections. But there is very little evidence of widespread voter fraud, and even less evidence that such measures prevent what fraud there is.



A coalition of groups, including the North Carolina NAACP and the Department of Justice, sued the state over the law and, after losing at the district-court level, scored a resounding victory at the Fourth Circuit Court of Appeals, which struck down most of the law, finding that it was specifically intended to suppress the votes of minority voters, who vote overwhelmingly Democratic. That meant voting in November was conducted—for the most part—under the pre-2013 laws.



Now McCrory and his allies are challenging the tally, suggesting that fraud is responsible for Cooper’s advantage. In essence, the fight over the McCrory-Cooper race is the latest front in that larger war. If Cooper wins, it will put a Democrat in the governor’s mansion—albeit one still facing strong Republican majorities in the legislature—and slow the conservative revolution. But if McCrory can somehow come back and win, it would not only preserve those conservative changes but also offer his allies an opportunity to argue that voter fraud is real.



One of the more readily apparent problems with the McCrory team’s claims of widespread fraud is that several other statewide Republican candidates won solid victories, including Donald Trump, Senator Richard Burr, and Lieutenant Governor Dan Forest. But McCrory and his allies have several avenues by which they might try to secure his reelection. Here’s a quick rundown.



Formal Protests and Challenges: Republicans have tried to argue that there are examples of problems with balloting around the state. In one case, a McCrory ally requested a recount in Durham County, a Democratic stronghold where votes came in particularly late on election night; some Republicans took this as evidence of chicanery. But the county’s Republican-led board of elections rejected that request. (All county boards of election across the state have a 2-1 majority in favor of the party of the sitting governor.)



McCrory and the state GOP have also brought up various cases in which they argue other ballots were improperly cast—for example, by felons who should not have been allowed to vote, by people who died after casting absentee ballots but before Election Day, or by people who voted more than once. In total, McCrory’s team mounted challenges to votes in more than half of the state’s counties. But it’s not clear that McCrory would build a lead even if every one of those challenges were granted. During a long state Board of Elections meeting today, Democratic and Republican lawyers sparred over how best to proceed on those challenges.



Meanwhile, the state Board of Elections added some 1,500 provisional votes to the final tally as the result of a lawsuit from liberal groups, which complained that the state DMV had failed to pass voter-registration data to the board in a timely fashion.



A Lawsuit Over Same-Day Registration: Separately, on Monday evening the Civitas Institute filed a lawsuit seeking a restraining order against counting ballots cast through same-day registration this year. The Civitas Institute is largely funded by Art Pope, a conservative businessman who spent heavily to help produce Republican victories and served as McCrory’s first budget director. When voters register and vote on the same day in North Carolina, they are required to show proof of address, which can be either a photo ID or some other document. The government also sends a letter to the address they provide to ensure it is valid.



Civitas’s suit argues that same-day votes should not be counted until this process can be completed, because the lack of verification lends itself to fraud. It is true that same-day registrations fail the verification process at a higher rate than standard registrations, but the number is still very small—less than 2.5 percent. Advocates for same-day registration argue that even that figure is misleading, since many of the rejected registrations are rejected simply because the voters have moved. The suit covers more than 90,000 votes, and Civitas argues that, based on past statistics, some 3,000 invalid votes could be counted. That is still less than Cooper’s current lead, so discarding those ballots wouldn’t on its own change the outcome, even if every one of them had been cast for Cooper. (Same-day voters broke roughly equally among registered Democrats, Republicans, and unaffiliated voters, so the challenge might not help McCrory much anyway.) In any case, it is notable because it seeks to exclude voters from the tally after the election is over and the votes have been cast.



Recount: On Tuesday afternoon, the McCrory campaign formally requested a recount of the race. A candidate’s request for a recount must be granted if the margin is less than 10,000 votes, and while it’s possible that the final canvass will push Cooper’s lead above that mark, it has not yet done so, meaning the recount will likely move forward.



“With serious concerns of potential voter fraud emerging across the state, it is becoming more apparent that a thorough recount is one way the people of North Carolina can have confidence in the results, process and system,” McCrory wrote.



Democrats scoffed at McCrory’s request, arguing that a recount won’t save him. A spokesman for Cooper called it a “last-ditch effort to delay and deny the results of the election,” adding, “We are confident that a recount will do nothing to change the fact that Roy Cooper has won this election.”



It is true that the margin, while well within the statutory threshold for a recount request, seems hard to overcome. There have been several notable recounts in recent memory nationwide, including the 2000 Florida presidential vote, the 2004 Washington state gubernatorial contest, and the 2008 U.S. Senate race in Minnesota. But those races’ pre-recount margins were in the hundreds or even tens of votes, not thousands. For McCrory to succeed, a large county would probably have to find a cache of thousands of votes that were uncounted or improperly counted, which is unlikely but not impossible.



A Legislative Intervention: Finally, there’s been a great deal of national attention given to the possibility that the General Assembly could intervene and hand the election to McCrory. State law says that in a contested election, the legislature should either determine who won the most votes or, if unable to do that, call a new election. The law further says that decision is not reviewable by state courts. Slate wrote that McCrory’s “real goal appears to be to delegitimize the results to such an extent that the state legislature—which holds a Republican supermajority—can step in and select him as the winner.”



In an interview with The News and Observer, state House Speaker Tim Moore, a Republican, didn’t rule out the assembly taking up the election, but he didn’t sound enthused about it either: “The media has certainly covered the constitutional provision that gives the General Assembly the authority to weigh in on that, but given that the elections are not finalized at this point, I think further comment would be premature.”



In an excellent and useful column, Peter St. Onge of The Charlotte Observer explains some of the backstory of the law, which was created after a particularly unusual case in which thousands of votes were lost to a malfunctioning machine. Crucially, as St. Onge points out, a General Assembly decision might not be reviewable by state courts, but it certainly would be reviewable by federal courts. And as the election-law expert Rick Hasen of the University of California, Irvine writes, attempting to crown McCrory for a second term in the face of results showing a Cooper victory could violate constitutional due-process and equal-protection guarantees: “A brazen power grab without a plausible basis for overturning the results of a democratically conducted election? I expect the federal courts would take a very close look at such a thing.”



The concept of an effective coup has some strong historical resonances in North Carolina, where, in Wilmington in 1898, white supremacists toppled the local government in the only successful armed coup in U.S. history. Still, for all the uncertainty of the gubernatorial race, the Old North State has a long way to go—including the final canvass, the likely recount, and the resolution of the Civitas suit—before the General Assembly comes into the picture.


Published on November 23, 2016 01:50
