Atlantic Monthly Contributors' Blog

February 15, 2016

Black Deutschland: A Melocomic Novel of Experience

In 1977, when he was 24 years old, Darryl Pinckney published his first essay for The New York Review of Books. Reviewing a volume about the black bourgeoisie’s social practices, Pinckney initiated his life’s work: interrogating and illustrating the history of African American life, which he described as “complicated, fragmented, disturbing to contemplate.”



The countless superb critical essays Pinckney has written since encompass the length and breadth of African American letters. He’s also added to that body of literary works with his own novels, High Cotton (1992) and his difficult, vibrant new book, Black Deutschland, both of which feature narrators striving for self-recognition. His fiction benefits from his literary-historical knowledge: His protagonists come from Midwestern branches of the black bourgeoisie, and they tell their stories in the first-person memorial mode that’s central to the African American literary tradition, from Harriet Jacobs’s Incidents in the Life of a Slave Girl to Margo Jefferson’s Negroland.





Jed Goodfinch, the narrator/protagonist of Black Deutschland, is a middle-class Chicagoan attempting an Isherwood-inspired life in West Berlin during the late 1980s. Like so many of his expatriate forebears, he’s intent on both self-discovery and escape. But Pinckney improvises and revises the form he’s adopted, avoiding the temptation to lead Jed to easy resolution. In a novel about escaping the confines of home, family, racial narratives, and self-loathing, he argues that accepting those constraints is vital for the narrative of the autobiographical self to emerge.



While Pinckney’s prose and formal approach in Black Deutschland point to literary ancestors like W.E.B. Du Bois, James Baldwin, Elizabeth Hardwick, and Christopher Isherwood, the avuncular influence of the Harlem Renaissance writer Claude McKay is perhaps most strongly felt. Jed, for example, is like a late 20th-century version of McKay’s character, Ray, the self-doubting, ambivalent artist-intellectual who shows up in Home to Harlem (1928) and Banjo (1929).



Like McKay’s Banjo, Black Deutschland is a novel without a plot. And like Ray, Jed is a directionless, restless protagonist: He’s a coke-snorting, weed-smoking, Alcoholics Anonymous devotee fresh out of rehab at the novel’s start; he’s an African American homosexual who toggles between gay and straight sexual liaisons, and black and white lovers; and he’s a storyteller whose sense of narrative time sometimes skitters away from his control, causing sequences in Chicago and Berlin to overlap without explicit transitions. At times, this tic makes it difficult to keep Jed’s story in historical order. In other instances, it allows the two cities to speak to each other, illuminating elements of Jed’s experience that even he can’t see.



Linking the book’s fragments is like following Jed’s mind as it wanders in search of the roots of his ambivalence and alienation.

Because Pinckney doesn’t construct any direct confrontations about history or blackness, or assert any strict definitions of gay life or African American identity, the novel doesn’t feel explicitly or especially political. Instead, Black Deutschland feels more like a melocomic novel of experience. At the beginning of the book, Jed is in his early 30s and has just arrived in West Berlin for the second time to live with his cousin, Cello, and her German family. (Having gone deep in his cups of white wine and lines of cocaine on his first go-round, Jed was forced back to Chicago for sobering up and growing up.) His return comes with a post as a research and writing assistant for an architect, the extravagantly named N. I. Rosen-Montag. After hours, he haunts ChiChi, a bar populated by straight, black American veterans, where he buys rounds and reenters the camaraderie of a male cohort, and the black American life he can’t seem to make work back home.



Outside of ChiChi, Jed is wobbly, unsure. He fumbles his job and his loves away. He’s attracted to Manfred, a young, straight German architect, but his desire is unrequited. Later, in the midst of an affair with Duallo, a Franco-Cameroonian from Paris, Jed can’t tell if he’s “really in love” or if he’s just “relieved to have someone, to have joined the living.”



Pinckney uses references, quotations, flashbacks, and scenes from histories as buffers between Berlin and Chicago. He also uses them to open chapters or serve as interstitial points between sections. Sometimes this material feels disconnected from Jed’s story, but it helps to read these sequences as if they were unedited marginalia within a notebook draft of Jed’s memoirs. Pinckney also plays with time: While Jed recalls his late 1980s life from a future remove, his anecdotes and his working notes conjure, variously, Frederick Douglass and the 1893 Chicago World’s Fair, W.E.B. Du Bois as a 25-year-old doctoral student in 1890s Berlin, Rosa Luxemburg and the 1918 German Revolution, and the 1968 Democratic National Convention in Chicago, among other things. Linking these fragments is like following Jed’s mind as it wanders in search of the roots of his ambivalence and alienation.



While some asides loop back into play within a few pages, others don’t make sense until late in the work. The political upheaval of the late 1980s, for example, is largely marginalized in Jed’s narrative. Rather than allow Jed to really wrestle with the Reagan Years, the end of the Cold War, the rise of AIDS, or the specter of international terrorism, Pinckney broadcasts key political events on televisions that sit, float, and hang like stage backdrops for the scenes of Jed’s life: Pan Am Flight 103, Harold Washington’s funeral, Ayatollah Khomeini’s fatwa on Salman Rushdie, Huey Newton’s murder, Tiananmen Square, Rock Hudson’s ruined body. It’s only in the novel’s final chapter, when Jed finds himself among the revelers bringing down the Berlin Wall, that his story interlaces with the larger political moment.



Pinckney’s diction can occasionally be clunky. When Jed checks on Duallo during a party, the action is meant “to say to whomsoever to him cometh that the nectar in question was mine.” And sometimes his similes fall flat: A curved apartment building resembles “a diesel engine about to run him over.” But in other sections his writing is acutely sharp and smart. In the Chicago scenes, where Jed juxtaposes scenes from the civil-rights movement with scenes of his family mourning Mayor Harold Washington, Pinckney writes with great humor and tenderness. His writing limns a quality Jed’s uncle describes in one scene as “the negrificity of these proceedings.”



Pinckney knows that the black autobiographical self begins as a reflection of home.

Through his remembrances, Jed recognizes a version of himself in Cello: They’re both striving, intelligent, prone to addiction ... and ultimately friendless and lonely. In fleeing the U.S., they both want to escape the psychological, emotional, and social traps they perceive in black American life, and they mirror for each other their failure to find the imagined, impossible Berlin that would liberate them from themselves.



But Jed’s notes and fragments, his memories of family life in Chicago, of the civil-rights movement, of the Sorrow Songs and “what Frederick Douglass knew of them,” and how they took Du Bois “home to Negro-hating America,” create a chain that awakens him to the reality of America’s racial matrix. Jed recognizes in the end that to disengage from that linkage is to sit in seclusion. “I am one of the black American leftovers who sit by themselves,” he thinks. “I just wanted to be left alone. I was. I have been, my slowed footsteps a perfunctory but familiar chorus.”  



Pinckney knows that the black autobiographical self begins as a reflection of home. Even though literature empowers writers to create intriguing, beautiful, and endless cities to inhabit, families show up in imagined worlds to point to the other realities they belong to. Jed’s narrative isn’t intended to symbolize the black American story, but his route to self-recognition necessarily engages both his family history and the broader African American experience. Relinquishing those domestic and ethnic bonds reduces them to dust, making Jed simply the lexicographer, as he puts it, of his own “desire and ruin.”


Published on February 15, 2016 04:00

February 14, 2016

The Walking Dead: An Eye for an Eye

Every week for the sixth season of AMC’s post-apocalyptic drama The Walking Dead, Lenika Cruz and David Sims will discuss the latest threat—human, zombie, or otherwise—to the show’s increasingly hardened band of survivors.


Lenika Cruz: David, if I had to sum the episode up in four words, it’d probably be: Boom, chomp, chop, bang.



“Boom” being the heavy-metal, face-melting explosion that opened the episode and wiped out a dozen new characters mere minutes after they were introduced. “Chomp” being, really, two sets of zombie death-chomps: Sam (good riddance to him and his haircut) and Jessie (sad-face emoji), both of whom got eaten by the Alexandrian walker horde. “Chop” being the horrific amputation of Jessie’s hand via a few messy axe swings by Rick, in an attempt to free Carl from her dying grasp. “Bang,” of course, refers to another big moment comic fans have been waiting for: the gunshot that left a gaping hole in Carl’s face where his eye used to be. Crazily, three of these things took place within a roughly 90-second span.





Which, when I write it all down, sounds pretty cool! As far as midseason premieres go, this episode could have done a lot worse: It had plenty of action, lots of critical plot shifts, the demise of important characters (R.I.P. Yellow-Teeth Wolf), suspense, and character development. And yet I couldn’t help but feel largely unimpressed by an episode that I think was intended to leave my jaw on the floor for an hour. I wasn’t a fan of how the show executed that climactic scene with Jessie, Sam, and Carl—it was poorly paced, and I found Rick’s flashbacks to be both tacky and unnecessary. If the show needs to remind viewers of how much he liked her, it probably didn’t do a great job of setting up their relationship in the first place. (On the bright side, Rick-Michonne shippers, rejoice!)



I suppose beginning a half-season with this much energy is a wise call, but I can’t help but feel that “No Way Out” should have been the midseason finale. (The actual one left us both unimpressed—I called it a “false ending,” you called it a “dud in every sense of the word.”) Part of my dislike for this episode comes from the fact that it reminded me how badly everyone’s storylines ended up last November: Morgan and Carol having that ridiculous fight over the Wolf, who then took Denise hostage; Eugene being a giant baby; Glenn and Enid watching a stranded Maggie from afar; a perennially trembling Sam leading the group to almost-certain doom. I’m relieved, though, that most of the dangling plot threads are now tied up, because that opens the possibility for new subplots moving forward.



While writing these roundtables, I often want to directly copy-paste lines of criticism that you or I have offered in past articles in reference to new episodes. I think this maybe says less about my own laziness than it does about the fact that the show, six seasons in, simply continues to make the exact same kinds of mistakes over and over again: the inappropriately timed one-on-ones; the overly earnest soliloquies about The Way Things Are Now; the blatant disregard for the internal logic of the world itself when it’s convenient to the story at hand (all the years spent pretending that cloaking oneself in zombie guts is not a highly effective trick).



I couldn’t help but feel unimpressed by an episode that was meant to leave me with my jaw on the floor.

On a related note, last week I went to a Walking Dead event organized by the Smithsonian Associates that featured the director Greg Nicotero, the showrunner Scott Gimple, the prop master John Sanders, and the very British and delightful Andrew Lincoln (who plays the very Southern and not-delightful Rick Grimes). Even though I spend a lot of time watching and thinking and writing about the show, I felt a little out-of-place in the crowd of diehard fans (“What if they sniff out my ambivalence?!”). But during the question-and-answer portion, I was surprised to find that the most critical, even provocative, questions came from two self-professed obsessives (one said she had previously met Nicotero on a Walker Stalker Cruise for zombie lovers). The lesson being that getting mad at the show for not being better (or for not being as great as it has proven it can be) is its own kind of affection—a weird, tough-love fandom.



So what of the future for the rest of season six? I learned a little bit from the Smithsonian Associates panel. Despite all the excitement about Negan, it doesn’t seem like he’ll be a major presence the rest of this season. Nicotero and Gimple said the season would be among the darkest so far, but also would attempt to weave in more comedy (what form that’ll take, who knows). Part of me expected Alexandria to fall, in line with the show’s habit of using tragedy to periodically force the characters back into nomadism. But Rick’s final heart-to-heart with a comatose Carl makes me think they’re not leaving these dorks behind anytime soon, especially now that they’ve all finally learned how to survive and defend themselves. If a bunch of pathetic suburbanites can rise up and make Rick feel hopeful for the first time since before the zombie apocalypse, I guess anything can happen.




David Sims: Honestly, at this point, I’ll take some comedy. Probably the best two moments of this episode were the bleakly funny ones—the deadly explosion that punctuated that tense opening sequence, and then Abraham and Sasha gleefully mowing down zombies with machine guns near the end. Both times, the show was teasing us with something it had done many times before: the deaths of some major characters. Both times, it seemed delighted with the fact that it had fooled us again—Abraham, Sasha, and Glenn will all live to fight another day, but don’t you dare think they’re remotely safe, viewer. It felt a little cheap, but the show was poking at its own self-seriousness, which is something it should do far more often.



Especially since, oh boy, the rest of this episode. Lenika, I agree with you that this probably should have served as the mid-season finale—there were at least a couple episodes last year that were just marking time, and it would have been far less confusing to just wrap up the fall of Alexandria last year, rather than plunging us back into the middle of this chaos. After that opening with Negan’s gang (who are obviously being positioned as the next big baddies), this entire episode was concerned with the logistics of the Alexandria survivors trying to escape, regroup, and purge their town of the zombie horde. It felt like a classic Walking Dead finale: Having spent weeks scattering its ensemble on various missions, they all come back together and help each other out in a bunch of big hero moments.



The majority of this episode was just an emotionless bloodbath.

So I liked the arrival of Abraham and Sasha, I liked Father Gabriel stepping up and being a hero, and I liked Glenn reuniting with Maggie. I even liked Carol finally getting her chance to take out that crazy Wolf, even if he seemed to be making some emotional progress in his time with the friendly doctor Denise. The zombie bonfire that closed the episode wasn’t too far off from what our colleagues Jeffrey Goldberg and John Gould suggested as a solution for the original pit of zombies way back at the beginning of this season. This show is invested in Alexandria as its base of operations, rather than having the gang wander on to some new territory, and there’s something admirable about its characters deciding to break the mold and stay put.



But to do that, The Walking Dead needed to clear Alexandria of all its boring characters, so the majority of this episode was just an emotionless bloodbath. I appreciate that the whole “slather yourself with zombie gore to move among the horde” trick cannot be used as a get-out-of-jail-free card by the writers over and over again, so it made sense that some of the characters trying to do that here had to bite the dust. But the show was so obviously killing off its most dramatically inert characters—the annoying kids—and then added Jessie to the body count because anyone in love with Rick obviously has to die, simply to add to his psychological burden. At this point, it might feel more audacious if this show stopped killing off characters. I don’t know that there are any more surprises to be wrung from gory death scenes.



So where does this leave us, in the middle of season six? Alexandria’s walls have fallen, but the zombies have been burned away; whatever gentle citizen-driven government existed there has also been purged after attacks on all sides. The Wolves may or may not be coming back; Negan’s gang is definitely lurking around the corner. There’s a lot of rebuilding to be done, and poor Carl’s missing an eye now. Season six started with such a strong sense of purpose: There was a pit of zombies, and something had to be done about it. Now, all that purpose has been drained out of the show and replaced with a lingering sense of unease. The Walking Dead is often at its worst when it’s introspective and slow. If what you say is true, Lenika—that Negan isn’t showing up for a little while longer—I worry we have a depressing run ahead of us.


Published on February 14, 2016 19:00

Revisiting The Power and the Glory During Lent

In the world of Graham Greene’s 1940 novel, The Power and the Glory, it’s a bad time to be a Catholic. The book’s hero is an unnamed priest on the run from Mexican authorities after a state governor has ordered the military to dismantle all vestiges of the religion. Churches are burned. Relics, medals, and crosses are banned. The price for disobedience is death. While many clerics give up their beliefs and accept their government pensions, the unnamed priest travels in secret, celebrating Mass and hearing confessions under the cover of night. Yet he’s also a gluttonous, stubborn, and angry man drowning in vices, and the religious ambition of his earlier years has been replaced with a constant desire to drink, hence Greene’s term for him: the “whiskey priest.” Tired of risking his life, the priest even prays to be caught.





A violent, raw novel about suffering, strained faith, and ultimate redemption, The Power and the Glory received literary acclaim—but not without catching the attention of Catholic censors, who called the book “sad” and denounced its “immoral” protagonist. Despite—or even because of—this vexed history, Greene’s novel is the perfect book to read during the season of Lent, which began Wednesday. Stereotypically a time for modern Christians to abstain from Facebook or chocolate or alcohol, Lent is the most dramatic time of the liturgical year—40 days of prayer, fasting, and cleaning one’s spiritual house, in hopes that honesty might lead to penance and good deeds. One vision of Lent emphasizes transcendence over struggle: the American Catholic writer Thomas Merton called it “not a season of punishment so much as one of healing.” But Greene’s dark novel and its deeply flawed protagonist offer a richer way to think of faith and self-reflection—one that average Christians might find more accessible and realistic than romantic narratives about belief.



In life and in fiction, Greene was more interested in sinners than saints, and the whiskey priest is no saint—at least not for most of the story. If anything, he’s closer to the modern conception of the antihero. His pride swells his sense of importance. In one village, where the faithful fear retribution from the military officers, the priest hesitates to leave: “Wasn’t it his duty to stay, even if they despised him, even if they were murdered for his sake, even if they were corrupted by his example?” Later, he’s selfish, crude, and heretical in one stroke: He eats a sugar cube that he discovered by a dead child’s mouth, rationalizing, “If God chose to give back life, couldn’t He give food as well?” In these and countless other examples, Greene shows how easily dogma can disappear in the face of desperation.



But beneath the darkness of the priest’s actions is faith, which he bears witness to in two pivotal scenes. Arrested for possessing outlawed alcohol, he’s thrown into a small jail cell with a “pious woman,” who later notices a couple having sex in the corner. “The brutes, the animals!” she exclaims. And yet the priest counsels the woman to not think that their action is ugly, “Because suddenly we discover that our sins have so much beauty.” In lines that reflect the lived truth of Lenten struggle, the priest explains, “Saints talk about the beauty of suffering. Well, we are not saints, you and I. Suffering to us is just ugly. Stench and crowding and pain. That is beautiful in that corner—to them.” Greene’s aversion to sentimentality makes for palpable theology. He finds God in dirt and in blood—in the Christian struggle to make faith matter in life.



Ironically, the pious woman is herself the antithesis of Lenten reflection. The priest asks her to pray for him, but she responds, “The sooner you are dead the better.” He thinks about how he can’t see her face in the dark, and uses that blindness as a metaphor for judgment and misunderstanding: “When you saw the lines at the corners of the eyes, the shape of the mouth, how the hair grew, it was impossible to hate. Hate was just a failure of imagination.”



Nevertheless, after publication not everyone was convinced of The Power and the Glory’s spiritual value, least of all some members of the Vatican, which had been alerted to its questionable content. Greene’s literary celebrity at the time caused some high-level Catholic officials to fear how influential his novel’s depiction of Catholicism would be. One of the consultants appointed to assess the novel concluded that “literature of this kind does harm to the cause of the true religion.” As Peter Godman noted in a comprehensive 2001 piece on the novel for The Atlantic, “The moral and theological criteria of The Power and the Glory are ambiguous—so ambiguous that self-appointed censors have sniffed an odor of heresy in the book.”



Greene’s aversion to sentimentality makes for palpable theology. He finds God in dirt and blood—in the struggle to make faith matter in life.

This ambiguity—one of the very qualities that makes The Power and the Glory such a fascinating work—stems from Greene’s own complicated relationship to religion. A self-described “Catholic agnostic,” he had believed in “nothing supernatural” until his future wife pointed out his misunderstanding of the Virgin Mary in one of his film reviews. “I was interested that anyone took these subtle distinctions of an unbelievable theology seriously,” he said. After their engagement, he concluded “that if I were to marry a Catholic I ought at least to learn the nature and limits of the beliefs she held ... Besides, I thought, it would kill the time.” Greene “fought and fought hard” against belief on the “ground of a dogmatic atheism,” comparing his struggle to a “fight for personal survival.” In 1926, there was no grand epiphany, but a quiet shift: Greene took the baptismal name of St. Thomas the doubter.



So why does a 76-year-old novel about a sinful, alcoholic priest get my vote for the perfect book to read for Lent? Whereas a more optimistic believer might have written a devotional novel, Greene’s novel feels informed by the messy reality of lived belief. (He wrote it after traveling to Tabasco, Mexico, and learning about the brutal anti-Catholic laws imposed by its governor, Tomás Garrido Canabal.) Not despite but because of his sins, the whiskey priest is the prototypical Lenten character. Just as Lenten resolutions—no chocolate! no TV!—grow strained with each passing day of the season, the priest expects some degree of failure from himself. But this doesn’t weaken him spiritually.



Dramatizing the Lenten struggle between doubt and faith, complacency and reflection, Greene’s novel examines what happens when an unfit believer is made responsible for the wellbeing of an entire community. Because the government has destroyed the physical artifacts of Catholicism, the priest must turn inward and confront his own doubts, and as the only remaining face of the church, he’s forced to air his private demons. But rather than reveling in these sins, the priest is crushed by their significance and seeks to replace greed with grace. “It was too easy to die for what was good or beautiful,” the priest reflects, before the novel’s tragic end. The world “needed a God to die for the half-hearted and the corrupt.” It’s the novel’s empathy for “the half-hearted and the corrupt”—and its recognition that even those people are worthy of salvation—that makes the story an unlikely but ideal one to revisit in the weeks before Easter.


Published on February 14, 2016 05:00

The Remarkable Life of Antonin Scalia

Antonin Scalia, the judicial firebrand who stood as the intellectual leader of the U.S. Supreme Court’s conservative wing during his three-decade tenure as a justice, died Saturday at a ranch in western Texas. He was 79 years old.



“He was an extraordinary individual and jurist, admired and treasured by his colleagues. His passing is a great loss to the Court and the country he so loyally served,” Chief Justice John Roberts said in a statement on behalf of the Court.



President Obama, who will have the opportunity to nominate Scalia’s successor, offered his sympathies to the justice’s family on Saturday night. “He will no doubt be remembered as one of the most consequential judges to serve on the Supreme Court,” he said.



Scalia articulated a straightforward role for jurists during his 29-year career on the Court. The Constitution should be read as the Founders wrote it, he often argued, and laws should be interpreted as they are written. He rejected the idea of an evolving “living Constitution” embraced by some of his colleagues. “I just say, ‘Let’s cut it out. Go back to the good, old dead Constitution,’” he told NPR in 2008.



Only he and Clarence Thomas championed originalism on the Supreme Court for most of his tenure, limiting the doctrine’s impact. Its adherents also initially found little room in the academy. “You could fire a grapefruit out of a cannon over the best law schools in the country—and that includes Chicago—and not hit an originalist,” he told a group of University of Chicago law students in 2003. But Scalia’s enthusiasm helped the school of legal thought enter the mainstream, with law schools such as Harvard eventually hiring professors who favor it.



“I mean, that’s amazing to me. [Fellow justice and former Harvard Law dean] Elena Kagan did that, and the reason she did it is that you want to have on your faculty representatives of all responsible points of view,” he said in a 2013 interview with Jennifer Senior. “What it means is that at least originalism is now regarded as a respectable approach to constitutional interpretation. And it really wasn’t 20 years ago, it was not even worth talking about in serious academic circles.” He rightly claimed a large share of the credit for himself.



The most colorful justice in living memory, Scalia relished his role on the Court. His aggressive questioning hastened the transformation of oral arguments into a lively exercise between the justices and the advocates. He considered each of his colleagues to be his friends, especially Justice Ruth Bader Ginsburg, whose famously warm friendship with him was recently portrayed in opera. His majority opinions aimed for an accessible, easy-to-read tone, with occasional interjections of wit. Scalia often remarked that he wrote his opinions so that first-year law students would want to read them. And his fiery dissents made him a household name, as loved by conservatives as he was loathed by liberals.



Antonin Scalia was born in Trenton, New Jersey, on March 11, 1936, to an Italian immigrant father and a mother whose parents also immigrated from Italy. Raised in a staunchly Catholic family, Scalia attended Georgetown University and then Harvard Law School. From there, he entered private practice and then taught law at the University of Virginia before serving in the Ford administration’s Justice Department. After Scalia spent a spell back in private practice and teaching during the Carter years, Ronald Reagan appointed him to the D.C. Circuit Court of Appeals in 1982.



Reading an Antonin Scalia opinion with which you agreed was like uncorking champagne.

Scalia joined the Supreme Court at the tail end of a major ideological transition in the late-20th century, as the justices from the heady liberal era of the Warren Court retired and gave way to an increasingly conservative bench under Chief Justice Warren Burger in the 1970s and 1980s. When Burger retired in 1986, President Ronald Reagan elevated Justice William Rehnquist to the top spot on the Court. To fill Rehnquist’s old seat, Reagan plucked Scalia from his posting on the D.C. Circuit, a common stepping stone to the Court. The U.S. Senate breezily confirmed him 98-0, in an era before raucous confirmation hearings and partisan votes for the justices.



On the Court, he soon carved out a reputation for vivid, pugilistic writing. Reading an Antonin Scalia opinion with which you agreed was like uncorking champagne. In the 1988 case Morrison v. Olson, a 7-1 majority upheld the constitutionality of the Independent Counsel Act, which established an independent prosecutor outside of the Justice Department to investigate government officials.



Writing alone in one of his finest dissents, Scalia inveighed against the Court’s attack on separation of powers. Only the executive branch could conduct criminal investigations, he argued. Congress cannot lawfully grant that power to a government official who isn’t subordinate to the president. “Frequently an issue of this sort will come before the Court clad, so to speak, in sheep’s clothing,” he wrote. “But this wolf comes as a wolf.”



Scalia’s vision of a government limited to the Founders’ original understanding of the Constitution endeared him to conservative legal thinkers. Republican politicians often touted him as a model justice on the campaign trail. Perhaps his most notable majority opinion came in 2008 when Scalia penned District of Columbia v. Heller, a landmark ruling that recognized an individual right to bear arms in the Second Amendment. Scalia also wrote United States v. Jones, which forbade warrantless GPS tracking of criminal suspects’ vehicles, and concurred in Gonzales v. Raich, a ruling upholding the federal government’s power to ban marijuana in states where it is medically legal.



Alongside other justices in the majority, Scalia voted to strike down limits on corporate and union expenditures in Citizens United, to remove caps on individual campaign donations in McCutcheon v. FEC, in favor of the Partial-Birth Abortion Ban Act in Gonzales v. Carhart, to protect flag desecration under the First Amendment in Texas v. Johnson, and to limit the federal government’s use of the Commerce Clause in United States v. Lopez, among many others.



For liberals and their allies, Scalia was an implacable adversary. Their ire was well-earned. He joined the majority to gut the Voting Rights Act of 1965 in Shelby County v. Holder in 2013; at oral arguments, he referred to the historic law as a “racial entitlement.” He argued that no right to abortion could be found in the Constitution’s text and consistently voted to limit the practice. Had the Court overturned Roe v. Wade during his tenure, Scalia might have even written the opinion himself. Years later, when asked about Bush v. Gore, he said that critics of the contentious decision should “get over it.”



When the Court began to examine LGBT rights in the 1990s and 2000s, Scalia stood firmly opposed to their judicial recognition. He dissented from a series of rulings written by Justice Anthony Kennedy, a fellow Reagan appointee, that protected gay and lesbian Americans from discrimination. In a 2003 dissent from Lawrence v. Texas, which struck down sodomy laws nationwide, Scalia complained that the Court had “largely signed on to the so-called homosexual agenda.”



But he lost the fight in United States v. Windsor, which struck down the Defense of Marriage Act in 2013, where he predicted the ruling’s logic would eventually lead to marriage equality. “The real rationale of today’s opinion, whatever disappearing trail of its legalistic argle-bargle one chooses to follow, is that DOMA is motivated by ‘bare … desire to harm’ couples in same-sex marriages,” he wrote. “How easy it is, indeed how inevitable, to reach the same conclusion with regard to state laws denying same-sex couples marital status.”



Ironically, many lower courts subsequently did just that, often by citing Scalia’s own interpretation of the ruling. When the Court struck down bans on same-sex marriage nationwide last June in Obergefell v. Hodges, Scalia called it a “judicial putsch” and “a threat to American democracy.” Scalia frequently insisted he had no personal animus towards gay and lesbian Americans. Instead, he argued the rulings short-circuited the democratic process in favor of raw judicial power.



His longest-running argument with his colleagues was over one of the Court’s deepest fissures: the death penalty. Most justices believe that the Eighth Amendment limits how states can wield the ultimate punishment. Scalia argued instead that the amendment should be read according to its original understanding, when capital punishment was the norm in American criminal justice. Accordingly, he fiercely opposed most modern restrictions on its scope favored by moderates like Justices Kennedy and John Paul Stevens, including bans on juvenile death sentences and on executions of the mentally disabled.



True to his judicial philosophy, he stood by this interpretation even as other justices bent to the system’s injustices. When Justice Harry Blackmun penned an emotional 7,000-word dissent in 1994 to announce he would “no longer tinker with the machinery of death,” Scalia attached a brief opinion denouncing it. Abolitionists often drew his sternest wrath. The Eighth Amendment could not prohibit capital punishment, Scalia often told them, because other parts of the Constitution contemplate its existence. When Justice Stephen Breyer suggested it might be time to revisit the death penalty’s constitutionality last June, Scalia responded in his usual fashion.



“Welcome to Groundhog Day,” he quipped, before decrying that Breyer “rejects not only the death penalty, he rejects the Enlightenment.”



With the right justices alongside him at the right time, Scalia could have led a conservative constitutional revolution equal to the Warren Court’s liberalism in the 1950s and 1960s. But he never quite got there. Rehnquist was too hesitant. Justices Kennedy, Sandra Day O’Connor, and David Souter were too moderate. The Court’s liberals, from William Brennan to Sonia Sotomayor, were too numerous. And Chief Justice John Roberts and Justice Samuel Alito were too late.



But if this bothered him, he never let on. “You know, for all I know, 50 years from now I may be the Justice Sutherland of the late-twentieth and early-21st century, who’s regarded as: ‘He was on the losing side of everything, an old fogey, the old view,’” he said in a 2013 interview. “And I don’t care.”


Published on February 14, 2016 04:42

February 13, 2016

Obama's Views on Antonin Scalia—and the Justice's Successor

President Obama called Justice Antonin Scalia, who died suddenly on Saturday at the age of 79, a “brilliant legal mind with an energetic style, incisive wit, and colorful opinions,” and said he intends to fulfill his constitutional responsibility and nominate a successor in due time.



“He influenced a generation of judges, lawyers, and students, and profoundly shaped the legal landscape,” Obama said of Scalia. “He will no doubt be remembered as one of the most consequential judges and thinkers to serve on the Supreme Court.”



And, the president added: “Obviously, today is the time to remember Justice Scalia’s legacy. I plan to fulfill my constitutional responsibilities to nominate a successor in due time. There will be plenty of time for me to do so and for the Senate to fulfill its responsibility to give that person a fair hearing and timely vote.”



The president’s remarks cap the partisan debate that erupted almost as soon as news of the conservative justice’s death broke. Republicans fear that an Obama nominee to replace Scalia, who was nominated by President Reagan, would tilt the balance of the Supreme Court, which has four reliably conservative justices, four liberals, and one swing vote. Democrats hope an Obama appointment achieves precisely that.



Mitch McConnell, the Senate majority leader, said the vacancy on the Supreme Court created by Scalia’s death should be filled by the next president. Harry Reid, the Senate minority leader, urged Obama to “send the Senate a nominee right away.”



Presidential candidates from both parties weighed in as well.



Senators Ted Cruz and Marco Rubio, both Republican candidates for the presidency, called for the next president to choose Scalia’s replacement on the Supreme Court. Hillary Clinton, the Democratic candidate, said those who call to keep the seat vacant until after the next president is sworn in “dishonor our Constitution.”



Obama’s remarks Saturday set up a prolonged fight with Republicans that, as my colleague Russell Berman reported, will play out both on Capitol Hill and on the campaign trail.


Published on February 13, 2016 18:04


A Death That Reshapes U.S. Politics

The sudden death of Antonin Scalia, an associate justice of the United States Supreme Court, on Saturday morning will shake up American politics like few events in recent memory, reshaping the 2016 presidential campaign and potentially leaving the Supreme Court deadlocked for more than a year.



In the short term, President Obama will have to decide whom to nominate to replace the voluble conservative jurist, and the Republican-led Senate will have to decide whether to even consider the president’s pick in the heat of the election campaign. Majority Leader Mitch McConnell immediately signaled that an Obama nominee would not get a vote this year. “The American people should have a voice in the selection of their next Supreme Court justice,” the Kentucky Republican said in a statement. “Therefore, this vacancy should not be filled until we have a new president.” CNN reported Saturday evening that Obama intends to nominate a new Supreme Court justice, setting up a potential confrontation with Republicans that would play out both on Capitol Hill and on the campaign trail.



The news of Scalia’s death broke just hours before Republican candidates were to debate in South Carolina, and Senator Ted Cruz swiftly called for blocking any nominee Obama sends to the Senate.




Justice Scalia was an American hero. We owe it to him, & the Nation, for the Senate to ensure that the next President names his replacement.


— Ted Cruz (@tedcruz) February 13, 2016


In a statement, Senator Marco Rubio praised Scalia and joined Cruz in calling for “the next president” to choose his replacement. Glowing testimonials also came in from former President George W. Bush and his brother, Jeb. House Speaker Paul Ryan hailed Scalia’s Catholicism and said he “did more to advance originalism and judicial restraint than anyone in our lifetime.” Senator Orrin Hatch, a Republican, said in a statement that Scalia “led a much-needed revolution in the law” and then told Fox News that his death creates “probably the most important judicial vacancy in history.”



A spokesman for Senator Mike Lee, a Republican member of the Judiciary Committee, tweeted that the chances are “less than zero” that the Senate would confirm a new justice nominated by Obama, who has already appointed two of the court’s nine members. Even before Scalia’s death, the Senate’s process for confirming lower-court nominees had ground to a near halt.



Yet because of the close ideological divide on the Supreme Court, the vacancy caused by Scalia’s death could freeze action on several important cases, including major upcoming rulings on abortion rights, Obama’s Clean Power Plan to combat climate change, and the legality of his executive actions on immigration—all of which were expected to be decided by June. Whether or not Obama nominates a replacement, the makeup of the Supreme Court will now be a top issue both in the presidential race and the parallel campaign for control of the Senate this fall. With a large number of Republican senators in swing states up for reelection, a nomination battle could reshape the fall elections.



The Senate hasn’t rejected a Supreme Court pick since Ronald Reagan nominated Judge Robert Bork in 1987, and a nominee hasn’t been filibustered since Lyndon Johnson nominated Abe Fortas, then an associate justice, to be chief justice in 1968.



Democrats immediately called on Obama to submit a nominee to replace Scalia.




The President can and should send the Senate a nominee right away. The Senate has a responsibility to fill vacancies as soon as possible.


— Senator Harry Reid (@SenatorReid) February 13, 2016




Would be unprecedented in recent history for SCOTUS to go year with vacancy. And shameful abdication of our constitutional responsibility.


— Senator Harry Reid (@SenatorReid) February 13, 2016



If Obama does try to test the Senate, whom might he nominate? Speculation quickly turned to Judge Sri Srinivasan, 48, who was confirmed by the Senate in a 97-0 vote in 2013 to serve on the U.S. Court of Appeals for the D.C. Circuit. That court is considered the second most prestigious in the country and has served as a stepping stone for numerous justices on the high court, including Scalia and Chief Justice John Roberts. Cruz and Rubio both voted in favor of his nomination. Another possibility is Judge Merrick Garland, 63, who also serves on the D.C. Circuit. Garland was an appointee of Bill Clinton and was considered a safer choice before Obama nominated Elena Kagan to replace the retiring Justice John Paul Stevens in 2010.


Published on February 13, 2016 15:58

The Republicans Rumble in South Carolina

The Republican presidential field gathering in Greenville, South Carolina, on Saturday night is down to six, and it’s shrinking fast.



The New Hampshire primary culled another two candidates—Chris Christie and Carly Fiorina—from the race, and chances are that one or more of the hopefuls on stage Saturday night won’t make it to the next debate in Houston on February 26. But who’s next to go? Barring a big surprise, the winners of Iowa and New Hampshire, Ted Cruz and Donald Trump, should be safe until Super Tuesday at the beginning of March. Ben Carson is struggling to stay afloat as it is. The three candidates in the middle—Jeb Bush, Marco Rubio, and John Kasich—may have the most to lose. (Obligatory Jim Gilmore disclaimer: The former Virginia governor finally suspended his campaign on Friday afternoon after he was not invited to participate in the debate.)



The spotlight in Saturday’s 9 p.m. debate on CBS will undoubtedly shine brightest on Rubio. By now, everyone knows what happened to him in the last debate a week ago: He robotically repeated the same talking point about President Obama over and over again, became an instant butt of jokes, and sank his chances for a strong second-place finish in New Hampshire. Rubio dropped to fifth and vowed that such a debate debacle “will never happen again.” The best thing the Florida senator has going for him is that his nemesis, Christie, will not be in South Carolina after suspending his campaign on Wednesday. Expect a self-deprecating joke or two from Rubio, who turned to humor a couple of years ago when he was mocked for taking big gulps of water during his response to the State of the Union address in 2013.



At this point, Rubio’s chief rival is probably his fellow Floridian, Jeb Bush, who might not have made it out of New Hampshire had Rubio not stumbled. It’s conceivable that only one of them will make it past the South Carolina primary on February 20, although the more bunched-up the candidates are, the more likely it is that they’ll both press on until March. Both Rubio and Bush have taken aim at Trump this week, and Jeb is bringing in big brother George to campaign for him in the Palmetto State. The occasionally over-candid former Florida governor also admitted to a radio host that a big part of his debate strategy was, “Don’t let Trump bully me.” Sounds about right.



Kasich could be the wild card in the debate. He put all of his chips on New Hampshire and “won” by finishing second. Yet now he really has to start all over in the rest of the country. South Carolina does not profile as the strongest state for him, and the limited polling that’s been done has him in the single digits. He could benefit from the low expectations, however, as he tries to hang in the race until Republicans vote in the big Midwestern states (like Michigan and his home state of Ohio) later in the spring.



As for the two front-runners, their on-again, off-again bromance is decidedly off as the campaign heads to South Carolina. Cruz has been hammering Trump over his support for eminent domain, airing a pair of ads attacking him on the issue. One is a traditional negative ad accusing him of “a pattern of sleaze,” while another, more light-hearted spot ends with a group of children destroying a toy house while shouting “eminent domain!” Trump has responded by reviving his questions about Cruz’s eligibility to be president, even threatening a lawsuit.




If @TedCruz doesn’t clean up his act, stop cheating, & doing negative ads, I have standing to sue him for not being a natural born citizen.


— Donald J. Trump (@realDonaldTrump) February 12, 2016


If that’s any indication, Saturday’s debate should be as lively as usual.


Published on February 13, 2016 09:00

Beyoncé and Misty Copeland as Degas: The Week in Pop-Culture Writing

Beyoncé in ‘Formation’: Entertainer, Activist, Both?

Jon Caramanica, Wesley Morris, and Jenna Wortham | The New York Times

“Like Nina Simone and peak Madonna before her (Beyoncé lands somewhere between the two as a polemicist), this is a woman who understands her own power, how to harness and magnetize us to it. I mean, I’m supposed to be out at dinner right now. Instead, I’m hunched over a computer contemplating the Beyoncé politic.”





Anti-Everything: The Culture of Resistance Behind Rihanna’s Latest Album

Erin MacLeod | NPR

“The album is called Anti. It’s anti-establishment, anti-expectations, but it’s also anti-colonial. Is Anti also a wide-ranging commentary on relationships? Sure it is. That’s part of what makes it a consistent, coherent representation of the postcolonial. It doesn’t have to be (or want to be) one thing. Rihanna is a one-woman argument for the importance of cultural studies.”



Justin Bieber Would Like to Reintroduce Himself

Caity Weaver | GQ

“Almost as soon as it broke, the OG Mally story took on a mythic quality. The primate, a pet owned by noblewomen in Renaissance art, and by Michael Jackson, became a symbol of Bieber’s excess. His loss of it was indicative of irresponsibility. His failure to reclaim it marked Bieber as uncaring: the father no monkey deserved.”



Macklemore, Hillary, and Why White Privilege Is Everyone’s Burden

Rembert Browne | Vulture

“It’s rarely graceful, but every time Macklemore does or says anything involving race or his whiteness and gets criticized for it, he goes away and comes back a little wiser. Is he there yet with ‘White Privilege II’? No. Does he need to be put on a pedestal for making this song? No. But is this song a net positive? Yes.”



Misty Copeland and Degas: Art of Dance

Stephen Mooallem | Harper’s Bazaar

“The story of her rise from living in a single room in a welfare motel with her mother and five siblings to the uppermost reaches of the dance world has become a sort of 21st-century parable: the unlikely ballerina, as Copeland referred to herself in the subtitle of her 2014 memoir, Life in Motion, who may be on her way to becoming the quintessential ballerina of her time.”



Head Over Heels

Doree Shafrir | BuzzFeed

“Madden has a well-honed instinct for what the Steve Madden girl wears and what she is willing to spend her money on, which has been the case since he started his company in 1990 with $1,100 in the bank, selling samples of a clog he had designed called the ‘Marilyn’ out of the trunk of his car. Since then, he has managed to sell more women’s shoes in this country than almost any other brand.”



It’s in America’s DNA to be ‘Divisive’

Wesley Morris | The New York Times Magazine

“With no supporting evidence, you are free to speculate that Obama’s race is at the root of his divisiveness. But you can’t just say ‘because he’s black,’ because that’s divisive, too. The bad math emits a radioactive glow. We keep dividing until there’s nothing left to say: That’s how it feels on both sides of the chasm.”



Charlotte Rampling, Oscars Lightning Rod, Talks Loss and Survival

Jada Yuan | New York

“Rampling talked about loss with the clarity of someone deeply familiar with digging herself out of sadness. (She’s also been vocal about her battle with depression.) ‘With new loss that I’m feeling, with Jean-Noël having left,’ she said, ‘I’ve got a more stable inner structure. Construite. The word actually is a sort of translation from French. It’s like a house. I’m a quite solidly built house now.’”


Published on February 13, 2016 05:00

Against ‘Humanism’

Meryl Streep, at the Berlin film festival this week, was asked why—given #OscarsSoWhite, given that it’s 2016, given that come on—she had convened an all-white panel to judge this year’s festival entrants. Invoking the rhetoric of an American president who had visited Berlin in the course of the last century, Streep dismissed objections to her panel’s monochromism. “The thing that I notice is that there is a core of humanity that travels right through every culture,” she said. “And after all, we’re all from Africa originally, you know. We’re all Berliners; we’re all Africans, really.”






This wasn’t terribly surprising. When Streep was asked, last year, in the course of promoting her extremely feminist film Suffragette, whether she is herself a feminist, the actor replied that, no, she isn’t. Instead: “I am a humanist,” she said. “I am for nice, easy balance.”



In a culture navigating its way through the fraught fields of race and gender and class and power and privilege, Streep has gone out of her way, in her capacity as an artist and as a proximate public intellectual, to reject the categories that might seek to divide us. She prefers to see the world from a loftier view.



And she isn’t the only celebrity to use her status as a “humanist” to explain why she is not an “ist” of another ilk. Sarah Jessica Parker recently met the business end of the indignation Internet when she explained, in an interview with Cosmopolitan, why she is not a feminist. (“As [the playwright] Wendy Wasserstein would say, I’m a humanist.”) Shailene Woodley and Marion Cotillard have espoused similar logic—I like men! I believe in everyone!—in explaining why they, too, reject the “feminist” label. Humanism, that warm and welcoming alternative to “isms” of a more divisive strain, has lately served celebrities not just as a branding play (as when Stephen Colbert’s Late Show house band, Stay Human, kicked off the show’s first episode with a song titled “Humanism”), but also as a single-word dismissal of the social movements that are trying to bring about a more equitable world. Not always, but often. “Humanism,” used as an anti-ism, is a lexical version of all those people who claim, as if they are unique in the sentiment: “I think all lives matter.”



If transcendence is your aim—if you happen to prefer the soaring over the searing in your rhetoric and in your life—then “humanism” is an ideal term. It is soft and smooth and inviting and historically inflected and, above all, conveniently unfalsifiable. Who doesn’t believe in the value and the potential of collective humanity? Who wouldn’t be excited by all that might be achieved by, as Sarah Jessica Parker put it, “a humanist movement”? Humanism is the stuff of the Taj Mahal and Leonardo da Vinci and “one giant leap for mankind.” It is also, today, the stuff of cultural utopianism. Who wouldn’t love a world in which the seams of our great human tapestry are rendered effectively invisible?




In that sense, “humanism” makes for a self-contained tautology. But it also makes, as a piece of rhetoric, for a sentiment that is extremely glib: It is concern trolling, essentially, in the guise of inclusivity. Used as an alternative to feminism or any other civil-rights movement—used, broadly, as a justification for convening an all-white film-festival jury in the year 2016—it suggests that those movements are somehow petty or point-missing. That they ignore the beautiful human forest for its trees. That they insist on strife and manufacture drama and, all in all, have no chill. I am for nice, easy balance.



In all that, the deployment of “humanism” effectively forestalls conversation about gender or race or power or privilege or any of the other things that, especially right now, desperately need talking about. What do you say to someone who refuses to acknowledge divisions? To someone who seems to see social movements that fight systemic injustices as awkwardly thirsty? To someone who ignores the ongoing nature of the civil-rights movement, and the battles women have fought for equality? Streep’s recent film, Suffragette, features a character willingly martyring herself so that her fellow women might one day win the vote. “Humanism” treats that sacrifice as, effectively, a little bit awkward.



Which is all to say: To confess that one sees oneself, all social strife aside, as a “humanist” is not to confess a partisanship with our better angels. It is to willfully ignore history.



It is also to ignore, by the way, the history of the concept of “humanism” itself. “Humanism,” on the surface, suggests the Renaissance, and the flowering of human potential, and the ending of the Dark Ages, and education, and art. It whiffs of both Enlightenment and enlightenment. Humanism, certainly, embodied all that as a historical movement. But that was centuries ago. Today, most commonly, the term functions as an abbreviation of “secular humanism,” or the espousal of cultural values that have been disentangled from belief in the supernatural. It suggests the primacy of social norms over religious ones. “Humanism” suggests, essentially, “atheism that isn’t jerky about it.”




Streep and her cohort, in treating “humanism” as an alternative to other movements, are ignoring that. Perhaps they are even confusing the word with a similar—but also very different—one: humanitarianism. As The Humanist, a site dedicated to this particular incarnation of “humanism,” explained of SJP’s dabbling with the term,




The definition of humanism is often confused with humanitarianism. While both promote human welfare and equality, and even the protection of our earth, humanism includes another aspect that many outside of the secular community tend to overlook: Humanists do not look to a higher power or authority to guide their morality. So, could it be that Parker truly is a secular humanist? Or, perhaps, has she fallen into the common trap of adding an “ism” to the end of any topic she may care about, while disregarding the actual definition of humanism?



I would guess the latter.




Perhaps she has. Regardless, there are many ironies here. One of them is that humanism, in all its incarnations, has historically involved a rejection of regressive thinking in favor of something more “enlightened,” more forward-thinking, more optimistic about what humans can achieve when they strive for something together. The celebrities’ brand of humanism, on the surface, promises to do the same. “Why classify people?” Charlotte Rampling asked in her now-infamous questioning of the validity of #OscarsSoWhite.



But in a time of legitimate division and strife—in a time that equates progress with the recognition of social divisions rather than the rejection of them—it’s Rampling’s question that’s regressive. It’s humanism that is, counter to all logic, on the wrong side of history. That’s the real tautology here: We classify people because, well, we classify people. It might not be the world we want, but it is the world we have. Loftiness is lovely, but humans—from our African origins to the present day—were made, in the end, to walk on the ground.


Published on February 13, 2016 04:00
