Helen H. Moore's Blog
August 21, 2016
White privilege as economic reality: It would take African-Americans 228 years to reach the same level of wealth as whites
Race is how economic class is lived in America. Consequently, from the nation’s founding to the present, race and class (as well as gender) are a social and political scaffolding on which opportunities and privileges are affixed in the United States.
Racism and classism are so intertwined that it would take hundreds of years for black Americans to have the same levels of wealth as whites. Writing at The Nation, Joshua Holland explains:
If current economic trends continue, the average black household will need 228 years to accumulate as much wealth as their white counterparts hold today. For the average Latino family, it will take 84 years. Absent significant policy interventions, or a seismic change in the American economy, people of color will never close the gap. Those are the key findings of a new study of the racial wealth gap released this week by the Institute for Policy Studies (IPS) and the Corporation for Enterprise Development (CFED) … To put that in perspective, the wealthiest Americans — members of the Forbes 400 list — saw their net worths increase by 736 percent during that period, on average.
These outcomes indicate how institutional and systemic white supremacy creates disparate and unequal economic outcomes for people of color. Such outcomes are also a reminder that racism doesn’t just cause spiritual, psychological and physical harm to its victims, but is part of a broader economic and material system of intergroup power relationships. As sociologist Joe Feagin demonstrates in his books “Racist America,” “Systemic Racism,” “The Many Costs of Racism” and “White Party, White Government,” the country’s history of land theft (from First Nations peoples), labor theft (from African-Americans), and discrimination in hiring and promotion in the labor market (against nonwhites in general), is a type of subsidy that has transferred trillions of dollars to white Americans at the expense of people of color.
The extreme wealth gap that exists between black and white households has an impact across all income levels. As Melvin Oliver and Thomas Shapiro detailed in their landmark book, “Black Wealth/White Wealth: A New Perspective on Racial Inequality,” a white person born into a poor or working-class family has a higher chance of moving up the socioeconomic ladder intergenerationally than a black person born into an upper-class or rich family has of remaining there across the same generations. This dynamic occurs because while a “rich” or “upper-class” black family may have more income in the short term, they still do not have the wealth or other assets that a “working-class” or “poor” white family has accrued across generations. Whites who are in the highest income brackets also have substantially more wealth than blacks and Latinos in the same cohort.
The black/white wealth gap is so extreme in the U.S. that whites who do not have a high school diploma actually have more wealth than blacks who have graduated from college.
Here, the tentacles of the near and far past — slavery and Jim and Jane Crow — combine with continuing racial discrimination in the banking, housing and labor markets of the present to maintain the black/white wealth gap.
While the 228-year gap between black and white wealth may seem insurmountable, there are policy initiatives that could begin to close it. As Demos’ Sean McElwee explained at Salon in 2014, a “baby bond” program could create a wealth-building opportunity for all Americans:
A baby bond is an endowment given to Americans at birth and maintained by the federal government until they are 18. The bond functions in a similar way to Social Security and can be used to pay for college, buy a house or start a business. Hillary Clinton, in fact, briefly floated the possibility of a baby bond during her 2008 campaign, although the modest $5,000 sum she proposed is certainly smaller than ideal … Dr. Darrick Hamilton and William Darity Jr., leading proponents of a baby bond, propose a progressive bond that caps at $50,000 for the lowest wealth quartile. Such a bond could close the racial wealth gap in three generations. Their proposal would be given to three-quarters of Americans (based on wealth eligibility). They estimate that such a program would cost $60 billion a year, about one-tenth of the 2014 defense budget.
The baby bond need not increase the deficit. A recent CBO report finds that right now, tax credits primarily benefit the wealthiest citizens, at a cost of nearly $1 trillion a year. This money could easily fund an extensive baby bond program that would, over time, eliminate the racial wealth gap.
And as I have previously suggested here at Salon as well, the proper enforcement of anti-discrimination and civil rights laws in housing, finance, banking and the labor market, as well as the equal-protection clause of the Constitution, are also necessary components in any effort to address the racial wealth gap. Public schools should also have fair and equal funding regardless of the racial or class composition of the community. Because people of color have access to less wealth, they are more likely to accrue debt in order to complete a college or professional degree. To help prevent the Faustian bargain that is choosing between upward mobility and a lifetime of debt, low-interest student loan and grant programs from the federal government should be expanded.
Reparations for the still-living victims of American apartheid in the form of Jim and Jane Crow, as well as the descendants of black human property in the United States, should also be pursued. Many of the corporations that directly and indirectly benefited from anti-black racism as American public policy still exist; these corporate entities profited from those practices and should be forced to give some percentage of that wealth to their victims.
The racial wealth gap is a critical matter of public policy because it impacts almost every social indicator and life outcome. In the U.S., quality of health, likelihood of marriage, educational attainment, age of death, rates of incarceration and overall quality of life are directly impacted by an individual’s and household’s level of wealth. Correcting the racial wealth gap — and addressing the systemic and institutional factors that created and sustain it — would also present an amazing opportunity to increase human social capital and economic productivity. For example, a 2013 report from the Altarum Institute and the W.K. Kellogg Foundation concluded that “race, class, residential segregation and income levels all work together to hamper access to opportunity,” with the cost of racism estimated at $2 trillion a year.
On a macro level, money is political speech. The American political system, especially in the era of Citizens United, is a de facto plutocracy where the policy demands of the wealthy are privileged over those of the people. This plutocracy is also a racial one: White men are grossly overrepresented among that group. As the U.S. continues to become more racially diverse, the racial wealth gap will further undermine faith in democracy by making elites even less responsive to the needs of the public. When economic power is concentrated among the few, and this occurs along lines of race, a multiracial democracy will inevitably face a crisis of legitimacy. Correcting the racial wealth gap is one way of avoiding such a crisis.
It will take a great act of political will to remedy the racial wealth gap in America. Both Bernie Sanders and Hillary Clinton have proposed policies that would attempt to correct America’s extremely high levels of wealth and income inequality. Barack Obama may be the United States’ first black president, but he has also been averse to policy initiatives that would substantively address the specific needs of African-Americans. The Republican Party and movement conservatives have no interest in remedying the racial wealth gap and are, instead, interested in stoking the fires of white racial resentment and “economic anxiety” for electoral gains. One cannot overlook how many white Americans view racial inequality as a zero-sum game, in which expanding opportunities for people of color is seen as a threat to whites’ power, wealth and influence. Consequently, the racial wealth gap is a material manifestation of unearned white advantages that its owners and beneficiaries are highly unlikely to surrender.
Austerity, neoliberal economics and policies of “colorblind racism” have combined to create a scenario where finding a solution to America’s racial wealth gap has become extremely difficult, if not impossible.
August 20, 2016
My weekend in a (sort of) cult: I signed up to learn meditation and almost became an infinite being
I suffered a bout of acute anxiety after witnessing a car accident. I had been standing outside with a friend when it happened — we were hanging out on a street corner in our Brooklyn neighborhood, chatting, our toddlers running around us in circles — when two cars collided and came close to hitting us.
We were both upset at the time, of course. Everyone around us was. Our kids were screaming; people around us were screaming. The drivers of both cars were upsettingly silent.
But well after that terrible event was over, after the ambulances came and went and we returned to our homes and hugged and our kids were pacified with snacks, I continued to relive what had occurred. My friend did not. I know this because I kept asking her how she was doing. She seemed fine. She didn’t have flashbacks. She didn’t, say, constantly obsess over death and read the dictionary all day because it was the most neutral, non-death-related book in her possession. Her heart rate didn’t hover around 150. Her startle reflex wasn’t through the roof. She was a little show-offy about it, to be honest. Showering, leaving the house, like some kind of superhuman.
I spent a couple of weeks suffering until my husband convinced me to call my psychiatrist. I didn’t want to. I felt stupid about it. I couldn’t even do trauma right, I thought. Get it from something impressive, like a war, or being on fire.
Even my psychiatrist didn’t believe that my intense reaction was from this particular incident. She wanted to think it was triggered from something earlier. “Wasn’t this the same corner you were on when 9/11 occurred?”
“No? I mean I lived near this corner. I wasn’t outside, though.”
“Yes, but if you were to go outside, did you see the smoke from there?”
“I…sure. I mean. Sure.”
“Well, you see. We have collectively been traumatized by 9/11. This was a recurrence of a much older wound.”
“This was a really bad car accident, though.”
“I’m sure it was. Now I’m going to tell you something personal about myself,” my psychiatrist told me, “but it’s for professional reasons. You’ll see why in a minute.” She paused. “After 9/11, I felt moderately agitated and had an accelerated heart rate.”
She looked at me.
I realized that was the whole story, and I said, “Oh, okay, thanks.”
At this point in my life I had not yet met a psychiatrist who was not, in some way, weird as shit.
She continued. “I attended a meditation course being offered at the time by a foundation, and I found it extremely helpful. I think you should take this course. It’s going to help your anxiety and be a lot healthier for you than taking Xanax every day.”
“I definitely don’t have a problem taking Xanax every day,” I told her. “Xanax is fine.”
“This course will be better,” she said. “I just want to warn you, however, that the place is a little cultish.”
“Do you mean it’s a cult?” I asked. Now she had me interested.
“Not exactly. Sort of. It might be. But it’s an innocuous cult, if it is one. Just go in and learn the technique and ignore the cult parts.”
I was intrigued. I’d never come this close to a cult before. Would they try to get me to join up? Could I take the course with some ironic distance, and maybe compose a witty essay about it afterward?
As it turned out: no, and no. I could not. I fell all the way in, and if they’d even tried a little to compel me, I would have become a devotee in a hot second.
***
I always knew I’d be an easy get for any cult. My first inclination is to agree with anything a person is saying to me. I can shut this off once I get to know someone and it turns out they’re my enemy or loathsome in some way, but always, when I first meet a person I want to be seen as agreeable and pleasant.
As a freshman in college I was befriended by a cool sophomore named Cambria — a name that should belong to a fashion designer or an era, not a regular college kid — who told me about this great moneymaking venture she was getting into. “It’s called a pyramid scheme,” she said. “And here’s why it can’t lose.”
“Say no more,” I said. “And take 100 dollars from me. Sold.”
All I had to do, she told me after I gave her my money, was con a few more people into joining up, which of course I never did. The moment she exited my dorm room I came to my senses, realized what I had done, and never mentioned the $100 to Cambria ever again. It was my own fault, really. I half-expected Cambria to refund me my $100, but she never did. She’s an OB/GYN now and I assume she makes a nice living. Probably none of it from her pyramid scheme, but you never know. Meanwhile I’m a writer, and frankly that loss of $100 still stings.
I’m no different from most of us. I want easy money, and failing that, easy answers. I want a plan that will solve everything complicated in my life. I don’t have time to figure everything out, and even if I had time, I don’t have the confidence that I’m going to figure it out on my own. In other words, I want a cult. A nice one, though. None of that suicide business. So when my psychiatrist told me about this place that would whisk away all my worries, I called immediately. Did they want me to also renounce all earthly possessions and swaddle myself in rough linens? I would have considered it!
It turned out they did not ask me to do those things, but in order to get this meditation technique from them I first had to last through a three-day workshop.
The first day was about eight hours long, led by a young woman wearing white linens, like she was Jesus, and no shoes (also like Jesus), gazing at us serenely with her long natural lashes and perfect skin, her light brown hair pulled back into a bun. I was with about ten other people ranging in age from their 20s to their 50s — professionals, men and women of all races. We looked like someone had chosen us to be on the cover of a continuing ed catalog.
We didn’t learn any techniques on the first day. We were promised the secrets that the guru had discovered through years of intense prayer, but it was a complicated technique that we had to earn through a day of exercises that would “open us up.” Pictures of the guru were everywhere. He was heavily bearded and beaming.
For our first exercise we had to mill around as if at a cocktail party, but instead of making small talk we had to look each person in the eye and say “I belong to you.” I considered jumping out the window. I would rather plummet three stories to the pavement below than say “I belong to you” to a stranger. Everyone else, fortunately, looked equally embarrassed. We all chuckled and muttered “Ibelongtoyouhahahaha,” shrugging amiably like, this is what we gotta do, right? For the patented guru technique?
And then the instructor approached me and looked me in the eye with her clear blue eyes and said, unapologetically, in a clear voice, “I belong to you,” and I believed it. It dazzled me. My mouth burbled out some sounds back at her, but all I could think was: I want that. That thing she has? Want it. It might have been her perfect posture or her complexion, but it seemed that beams of goodness were shooting right out of her face. I wanted in.
The exercises continued all day: We demanded “who are you?” to each other until one or the other person cried; we danced with our eyes closed while our teacher and her equally beautiful assistants banged on their tambourines and exhorted us to let go, really let go; we lay down on our backs while the teacher instructed us to remember being younger and younger and then remember before we were born and remember being infinite beings. I became an infinite being with the rest of the class. I was all in.
By the time I got home at the end of the day, though, I was mostly embarrassed. It felt like I had gotten too drunk and made an ass of myself. I hadn’t made out with this class of strangers, at least, but I had definitely told them I loved them and maybe some of us hugged for a beat longer than is socially acceptable. I told my husband about the events of the day with all the ironic distance I could muster. “You’re not going back, right?” he asked.
“Well,” I said, “I’ve already paid.”
The next day I walked in determined to keep myself sane and maybe take mental notes for the humorous essay I would write. Within minutes I was back under the spell of the guru, or more specifically, his beautiful acolytes. Would I tell the class how I was a wretch and unloved? I sure would! Would I cry in the arms of a Colombian stranger because I had never known the love and acceptance that he had known? It’s honestly a little hard to remember!
While I began to doubt that a meditation technique even existed, I didn’t exactly care anymore. I felt better than I had in weeks. This might have been a load of bullshit, but it was bullshit that had relieved me of my racing heart and death obsession.
That’s when the instructor and her assistants (all of them beautiful and serene) told us we were ready; we were going to learn the much-vaunted technique from the guru himself! The teachers were so excited that I thought he was going to make an entrance, perhaps descend via a pulley system from the ceiling.
We all sat cross-legged on the ground. It was time.
What we proceeded to learn was not a meditation technique at all but really just a complicated way to hyperventilate for 45 minutes. The first part consisted of vigorous inhales and exhales and felt like something a Russian bad guy would do before engaging in fisticuffs or jumping into the Baltic Sea.
And then the guru was in the room. Or rather, his voice, whistling out from the speakers around us.
He began to lead us through his sacred breathing thingy. He intoned the words “So” and “Ham” while we were to inhale on “so” and exhale on “ham.” It went like that for forever. Sometimes it was slow, sometimes it was fast. So….ham. So ham. So ham! SO HAM! I thought it would make a good tagline for ham. I wanted ham.
Mostly I got dizzy. At one point I felt my right hand curling into a fist with my index finger outstretched and then my hand began to climb into the air. It was like I had a giant foam finger in my mind that told me that we were all #1. After we were finished and all talking about our experiences, I mentioned this foam-finger phenomenon.
“It sounds like you were pointing toward heaven,” the teacher said.
(I learned later that your muscles can contract involuntarily when you hyperventilate. Science can be so disappointing.)
We went through the technique a few times, just to get it down. I felt like I had exited my body, but I was close, floating a few inches above it. I was disappointed that all we had done was, basically, pant, but at the same time, I felt amazing.
I left that class determined to breathe like this every day. I tried, really I did. But I had a toddler, and I was so tired. Ninety-nine percent of the time I’d fall asleep. The timing of the technique was complicated, and soon enough I forgot it. We had been invited to retake the class anytime for free, but there was no way I was going to spend another three days rebirthing and telling people I belonged to them.
As much as I want to make fun of the class, though, it worked. My panic attacks ended. (For a while, anyway.) I stopped obsessing about the crash and the screaming and the blood. I could play with my kid again and not cry about the inevitability of loss. Maybe the class was silly, but damned if it didn’t cure me. At that moment in my life, it was exactly the cult I had needed.
‘See something, say something’ culture is dangerous: How it spawns Islamophobia and keeps America insecure
This article originally appeared on AlterNet.
Early on the morning of June 17, 2016, the Massachusetts Bay Transportation Authority (MBTA) announced to its Twitter followers some “minor delays” on the Orange Line. The delays resulted from a “police action” involving heavily armed transit cops at the Wellington station in Medford, Mass. The cops had been responding to reports of suspicious activity. Two men were questioned and then let go. “Some people riding our system noticed two people that appeared to be Middle Eastern,” MBTA general manager Frank DePaola told a local newspaper, “and in their opinion, they were acting suspicious.” The suspicious activity in question was prayer. DePaola called the incident a “general misunderstanding.”
Two weeks later, the Avon Police Department in Ohio received a 911 call from the sister of a hotel clerk alerting them to a man “in full head dress with multiple disposable phones pledging his allegiance to ISIS.” A second call by the clerk’s father requested that police be sent to the hotel. Minutes later, the suspicious man found himself surrounded by officers with guns drawn. He was handcuffed and searched. After speaking with the clerk, police concluded there had been a “miscommunication” and the man had not in fact made any comments about ISIS. City officials later apologized to the man, hoping the person who made the false accusations “can maybe learn from those.”
If reports of suspicious activity are educational opportunities, then law enforcement agencies are exceptionally slow learners. For years various state and federal agencies have actively encouraged people to report suspicious behavior as part of efforts to prevent terrorist attacks. In its 2010 National Security Strategy, the White House called for establishing “a nationwide framework for reporting suspicious activity.” The Department of Homeland Security (DHS) launched its “See Something, Say Something” program shortly afterwards, joining the FBI’s eGuardian and the Director of National Intelligence’s (DNI) Information Sharing Environment Suspicious Activity Reporting (ISE-SAR) programs. All three programs are partnered with the Department of Justice’s (DOJ) umbrella organization, the Nationwide Suspicious Activity Reporting Initiative (NSI).
The programs ask people to report suspicious behavior in train stations, airports, national monuments, and even Walmart stores. Predictably, they have led to racial and religious profiling, facilitated by the rampant Islamophobia in the country. Muslims are frequently reported for innocuous activities. Two Middle Eastern-looking men praying in Boston and a man speaking Arabic in Avon are enough to warrant suspicion for some. A recent shutdown of JFK airport’s Terminal 8 in New York City, after travelers clapping for the Olympics were wrongly reported as an active shooter, provided a prime example of how “see something, say something” culture only increases dysfunction and danger for Americans.
Targeting Muslims for simply being present
Summaries of suspicious activity reports (SARs) obtained by the American Civil Liberties Union (ACLU) of California from the Central California Intelligence Center and the Joint Regional Intelligence Center in 2013 confirm that these reports frequently identify Muslims and Arabs engaged in quotidian activities. Much of the time Muslims are not even engaged in any activities; they are simply present.
According to one summary, a vigilant citizen reported that “there was a substantial increase in the presence of female Muslims fully dressed in veils/burkas” at a mall. Another reported a “suspicious gathering” of people “of what appear to be Muslim Faith or Middle Eastern descent.” One report identified “a Middle Eastern male adult physician who is very unfriendly.”
Sometimes people filing the reports can be confused by various shades of brown, not quite sure what ethnicity they should report. A report maintained by the Los Angeles Regional Intelligence Center, for example, notes that an informant saw “two young women, possible middle eastern or hispanic, taking pictures and video of the 110 freeway structure.” The same informant also saw “a middle eastern male sitting in a BMW talking on his cell phone near where the two women were taking the pictures.”
In addition to Muslims making an appearance in neighborhoods and malls, many reports target people for the simple act of taking pictures, a constitutionally-protected activity which was cast as suspicious by the FBI itself when it first announced its eGuardian program in 2008. SAR summaries provide some insight into what this looks like in practice: “I was called out to the above address regarding a male who was taking photographs of the [redacted],” reads one report. “The male stated, he is an artist and enjoys photographing building in industrial areas … [and] stated he is a professor at San Diego State private college, and takes the photos for his art class.” This, evidently, was a cause for concern for at least one individual.
As is apparent, these reports are not based on any reasonable suspicion of criminal or potentially criminal activities and depend almost entirely on the biases of ordinary people. The Functional Standard for SARs hardly helps matters. It requires an activity to merely be “reasonably indicative” of criminal activity to qualify as suspicious, a standard that is much lower than the “reasonable suspicion” criteria common in criminal justice procedures. Consequently, as DHS’ Privacy Impact Assessment for ISE-SAR states, “there is a risk that more information about individuals who have no relationship to terrorism may be recorded.” Given the current, pervasive Islamophobia in the United States, it is no surprise that many of the individuals reported for acting suspiciously are innocent Muslims.
Also concerning are the privacy and civil liberties implications of collecting, maintaining, and sharing a vast trove of data regarding individuals not suspected of any crimes. The FBI’s eGuardian program initially retained the data for 30 years until concerns raised by Fusion Centers prompted it to limit its retention to five years. However, as the ACLU notes, “even after Suspicious Activity Reports are deleted from eGuardian, the FBI retains the reports for at least an additional 30 years in another location.”
Gathering personal data for mass surveillance
Gregory T. Nojeim, the Director of the Freedom, Security, and Technology Project at the Center for Democracy and Technology, previously raised concerns about how the retained data might be used. As he testified before the House Committee on Homeland Security in 2009, “there seems to us a high risk that this information will be misinterpreted and used to the detriment of innocent persons.” This is particularly troubling given that intelligence agencies continue to “classify legitimate political activity as ‘terrorism’” and “spy on peaceful activists.” Similarly, the ACLU notes that “overbroad reporting authority gives law enforcement officers justification to harass practically anyone they choose, to collect personal information and to pass such information along to the intelligence community.”
There also remain many unanswered questions about the effectiveness of such programs. There are concerns that SARs flood intelligence agencies with useless information that these agencies must then analyze, wasting valuable time and resources. As a Congressional Research Service (CRS) report puts it, the “goal of ‘connecting the dots’ becomes more difficult when there is an increasingly large volume of ‘dots’ to sift through and analyze.” There is an increased risk of “‘pipe clogging’ as huge amounts of information are … gathered without apparent focus.” Information collected through SARs has in fact increased dramatically. According to a report by the Brennan Center for Justice published in 2013, the number of ISE-SARs increased “almost tenfold” from January 2010 to October 2012, from about 3,000 to nearly 28,000.
Many agencies do not even bother to measure the effectiveness of these programs. As an ACLU report on the FBI’s abuse of authority pointed out, “neither the ISE Program Manager nor the FBI track whether SAR programs deter terrorist activities or assist in the detection, arrests, or conviction of terrorists, and they have not developed performance measures to determine whether these programs have a positive impact on homeland security.”
Years of asking the public to report any suspicious behaviors based on loose criteria has ensured that the See Something, Say Something philosophy has become ingrained in public consciousness. Ahmed Mohamed, the famous “clock kid” who was arrested last year because a teacher believed his homemade clock resembled a bomb, did not simply fall victim to the bigotry of school officials. His arrest and interrogation were enabled by such a culture of suspicion, aided by the current hysteria about terrorism and Muslims. Tepid denunciations of Islamophobia aside, the Obama administration helped entrench this culture, which continues to cast suspicion on innocent Muslims.
Reading Amy Schumer: The powerful missed message in “The Girl With the Lower Back Tattoo”
Cover detail of Amy Schumer's "The Girl With the Lower Back Tattoo" (Credit: Gallery Books)
Amy Schumer isn’t who you think she is. And perhaps she isn’t who she thinks she is, either. The comedian’s debut book, “The Girl with the Lower Back Tattoo,” published this week by Gallery Books, is being widely received as autobiography, memoir, confession. But it’s not really any of these things.
It’s a performance.
While Amy Schumer is ostensibly telling us the truth about her life, she’s also telling us a story. And once life becomes a story, it is no longer real life but art. Schumer is a comedian, an oral storyteller bent on making people laugh. And so her life is transmuted into words on the page, into a collection of episodic stories for all of us on dating and sex and self-love and empowerment in the workplace.
Schumer is adamant that her book is not autobiography or memoir, and that it “has NO SELF-HELP INFO OR ADVICE FOR YOU,” as she declares in the prefatory “A Note to My Readers.” In the same “Note” she also says, “what I can help with is showing you my mistakes and my pain and my laughter.” This may seem perfunctory given the tenor of the book, yet herein lies the message: When Amy Schumer talks about herself she talks about us, too. The function of her storytelling is to illuminate the systemic, macro and micro versions of oppression that every woman deals with on a daily basis. She acts this out—both in this book and also on her award-winning Comedy Central show, “Inside Amy Schumer”—in order to purge our shame through performing the grotesque. This collective catharsis—along with entertainment—is one of the functions of storytelling.
I’m a real woman who digests her meals and breaks out and has sweet little pockets of cellulite on her upper thighs that she’s not apologizing for. Because guess what? We all have that shit. We’re all human beings.
But “The Girl with the Lower Back Tattoo” is no confession—Schumer does not want absolution. Not even from her shitty tramp stamp tattoo, which got infected and today still looks “like a ‘Mad Max’ war boy’s head scar.”
So now, fifteen years later, I’m thirty-five, and any time I’m in a bathing suit people immediately know in their hearts that I’m trash. Any time I take my clothes off for the first time in front of a man and he sees it, he also knows in his heart that I’m trash and that I make poor, poor decisions . . . But I promise you from the bottom of my heart I don’t care. I wear my mistakes like badges of honor, and I celebrate them.
That Schumer’s book title parodies the popular but highbrow Swedish crime novel “The Girl with the Dragon Tattoo” is not just some random literary allusion. Schumer invokes the girl who gets raped and fights back. It’s a clever elision from high art to low art, but it’s part of the book’s message. How does a girl become a victim—if not of a crime syndicate, of culture? Why do girls get tramp stamp tattoos? Why are women so at the mercy of fashion? Why do we contemplate getting French manicures when guys just want to “JAM [THEIR] WEINER IN YOUR POOPER”?
Beyond the many powerful and empowering takeaways of “The Girl with the Lower Back Tattoo”—from loving the hustle to self-love—perhaps the most overlooked is that of a woman’s right to not only make mistakes, but to make art out of them.
But not only should women not really make mistakes; they should not really be making art either. And if they do make art, they’ll be judged for it, as themselves—whoever that is. Schumer knows women who are artists are not allowed to be separate creatures from their art. As I have written elsewhere, there is a sexist double standard that prohibits women from identifying as artists, and instead minimizes and marginalizes their creativity as autobiography, autobiographical musings and diary doodles. As such, women’s creativity is sexualized, infantilized and blatantly dismissed. Our artwork is ostracized through the categorization of it as “women’s art.” Schumer, in turn, is not regarded as a comic but a “female comic,” as if the adjectival appellation were both imperative to understanding her comedy and, more insidiously, suggestive of its worth. She writes in her chapter “An Exciting Time for Women in Hollywood”:
After putting in so much effort to make a good movie, it felt pretty demeaning when they called it a “female comedy.” This meaningless label painted me into a corner and forced me to speak for all females, because I am the actual FEMALE who wrote the FEMALE comedy and then starred as the lead FEMALE in that FEMALE comedy. They don’t ask Seth Rogen to be ALL MEN! They don’t make “men’s comedies.” They don’t ask Ben Stiller, “Hey, Ben, what was your message for all male-kind when you pretended to have diarrhea and chased that ferret in “Along Came Polly”?
Schumer is a comedian. Male or female, “a comedian’s endgame,” I contended in an earlier piece on Schumer’s feminist brand of comedy, is “to create an art of transgression. A comedian is by definition not a spokesperson. A comedian is not a politician. A comedian has no essential social good or moral design in mind when creating and performing her art.” If anything, the aim of comedy is to transgress—to broach unsavory, contentious social issues that indirectly may lead to a kind of emotional or psychological catharsis. This indirect aim is reflected in the awkward syntax of Schumer’s noted objective of this book: “what I can help with is showing you my mistakes and my pain and my laughter.”
Schumer’s vehicle for comedy is storytelling, and through playing the role of storyteller she performs comedy through her body to reveal the comedy of all bodies, and women’s bodies in particular—pancake boobies and chicken-noodle-soup-smelling vaginas and all.
Funny, then, how people are reading this girl with a lower back tattoo as an honest conveyor of the innermost soul of Amy Schumer—whoever that is. How do we trust the “deepest confessions” of such an unreliable narrator? She has asserted the book is not an autobiography or memoir, so why would we grant the narrator any kind of reliability? The trickster, the cunning artist will in one breath tell you she’s “a pathological liar” and then in the next say, “Just kidding,” which is cleverly what Schumer does in the chapter aptly titled “Things You Don’t Know About Me.”
True to her comedic form, Schumer pulls a fast one while she reveals all: the key is to understand that a story is just a story. Stories, just like all forms of narrative, are interpretations of things. There is no truth, only interpretation.
What Schumer imparts to the reader is that the greatest power of the storyteller, of the comic, is that of controlling the narrative. She credits this ability as both the start of her career as a comedian and as the origin of her womanhood in her chapter about her bat mitzvah. During her chant of the Torah portion, her voice cracked. But instead of cracking from the audible chuckles in the audience, she laughed. “I laughed hard,” she recalls. “I was laughing at myself. We were all laughing together—a real laugh that went on for a while. I’m pretty sure that’s why I officially became a woman that day. Not because of the dumb ancient ceremony where children are gifted bonds they can’t cash until they’re twenty-five. . . No, I became a woman because I turned a solemn, quiet room into a place filled with unexpected laughter. I became a woman because I did, for the first time, what I was supposed to be doing for the rest of my life.” For the comedian, controlling the narrative is as often measured in language as it is in laughs.
The book’s comedic moments are balanced by more serious chapters, like the story of how her “first time” equalled being sexually assaulted by her boyfriend who “just helped himself to [her] virginity”: “Many girls remain silent about their experiences. And that is their choice. I’m opening up about my ‘first time’ because I don’t want it to happen to your daughter or sister or friend someday. I want to use my voice to tell people to make sure they have consent before they have sex with someone.”
In “The Worst Night of My Life” Schumer documents her long-term relationship with a physically and psychologically abusive boyfriend. If she was unable to control the narrative in the past, she’s doing so now. She writes in the concluding paragraph of “The Worst Night of My Life”:
I’m telling this story because I’m a strong-ass woman, not someone most people picture when they think “abused woman.” But it can happen to anyone. When you’re in love with a man who hurts you, it’s a special kind of hell, yet one that so many women have experienced. You’re not alone if it’s happening to you, and you’re not exempt if it hasn’t happened to you yet. I found my way out and will never be back there again. I got out. Get out.
The message Schumer offers us is fairly commonplace after 40 years of feminism and over a decade of identity politics, in spite of the fact that it still needs to be said:
Love yourself! . . . Your power comes from who you are and what you do! . . . I know my worth. I embrace my power. I say if I’m beautiful. I say if I’m strong. You will not determine my story. I will. I will speak and share and fuck and love, and I will never apologize for it. I am amazing for you, not because of you. I am not who I sleep with. I am not my weight. I am not my mother. I am myself. And I am all of you.
It’s commonplace, that is, except for the last two lines, which juxtapose “I am myself” with “And I am all of you.” A sly nod to Sandra Bernhard’s “Without You I’m Nothing,” “I am all of you” says that she is what we make her to be: “female comic,” “feminist icon,” or “girl with the lower back tattoo.” The point is that Schumer unapologetically occupies space—on the page, on the stage and in the world—and in doing so creates space for all women. Space for women to create art. Space for them to just be.
Are you ready to join the paint party? How a bar-based crafts event became a very big business
A Paint Nite Event.
A half dozen or so years ago, Sean McGrail attended a birthday party with his pal Dan Hermann where a group of adults were enjoying a group painting exercise. At first the two Boston-area friends were surprised that the participants were eagerly sharing each other’s artwork via social media. And that’s when it hit them: This was viral marketing 101. If you give people a fun, shareable activity, they will become advocates for it, spreading the word (and in this case image) by way of their Facebook, Instagram and Tumblr accounts.
If only there was a way to make money on that, right?
Hermann and McGrail figured out one way to do it, and in 2012 they founded Paint Nite, a painting party business that grew into a company earning $55 million in 2015. From its Boston-area headquarters, Paint Nite manages 54 employees and has hosted events in 115 cities, most but not all of them in the U.S. — not bad for a startup seeded with just $6,800, earning it the No. 2 spot on Inc. magazine’s annual list of the fastest-growing private companies in America.
Paint Nite’s popularity corresponds with the rise of the so-called maker movement, in which Americans embrace their inner artisans through projects like DIY cheese kits, cosplay costumes or hippie duvet covers sold on Etsy.
What makes Paint Nite a vibrant business as opposed to a humble hobby is the company’s acumen. They hire local artists and supply the materials for them to host painting-and-drinking parties at local bars and restaurants. Tickets go for $45, with revenue shared by the local artists and the company. The venues benefit by hosting events that attract happy, preoccupied drinking crowds on otherwise slow nights. Paint Nite has become a big hit among women, who make up most of its clientele.
Recently the company expanded its offerings to include Plant Nite, where instead of painting, the partygoers build small terrariums.
Salon spoke to McGrail, Paint Nite’s president and co-founder, about his brushstroke of genius.
What was the moment when you realized, “hey, there’s a business to be had offering bars activities for their customers beyond Tuesday trivia nights”?
Dan and I attended that birthday party where friends were drinking and painting, and there was a very silly, childlike joy to the event that intrigued us. We wanted to avoid the overhead associated with leasing, designing and constructing a storefront studio, so we thought adopting the trivia-in-a-bar model would be a low-cost way to get started, particularly in high-rent districts in major cities. But the day we knew we had a “real business” was when we had a room of 50 people who were complete strangers and who were all there to paint with us in a bar. It’s one thing to convince family and friends to buy your product; it’s another when strangers see value in it.
What does Paint Nite offer that a bar or restaurant can’t do on its own?
Most bars and restaurants are focused on producing great food and drink and managing a staff that provides great service to patrons. Promotional events would be a distraction for them to market, staff and provide supplies for. Events are a headache that’s not in their wheelhouse of day-to-day business. It’s smarter for them to outsource that headache and promotion to Paint Nite.
So are your customers exclusively bar and restaurant owners, or can individuals come to you for help in arranging a venue?
Even though a majority of our public events take place at restaurants, we also work with multiple venue partners for special private events that take place in sports stadiums like Fenway Park and, on the smaller side, in private homes. In short, we’ll work anywhere alcohol is served!
You also have Plant Nite. Instead of painting, people come together to build their own mini tabletop gardens. Clearly the idea here is that you give people a good time out and something to bring home to remember it.
Yes, we see ourselves as providing creative entertainment where people are engaged participants and they go home with a memento of a fun evening with friends.
It’s kind of mind-boggling to me that in four years you’ve grown this into a business with $55 million in annual revenue. How did you grow so fast?
There’s a large portion of the population that is “craft curious.” They don’t want to commit to a college-level art class, and they don’t want to schlep around town shopping for art supplies they might only use once. Add cocktails and music to the mix, and they know that it’s a judgment-free zone where fun takes priority over the art, and it becomes accessible to a much larger audience than a typical art class.
The trial-by-Netflix age: “Making a Murderer,” “Serial” and the mixed blessings of media justice
Adnan Syed, Brendan Dassey (Credit: Reuters/Carlos Barria/AP/Morry Gash)
If you like culture that “makes a difference,” these are good times. An enormously engaging podcast has not only given listeners a greater understanding of the justice system; it’s brought so much attention to a flawed murder case that it’s provoked a retrial. And a grimly entertaining documentary series about a murder case in Wisconsin has led to the overturned conviction of a confused teenager who appears to have been bullied into his confession.
The success and chatter around the “This American Life” podcast “Serial” — that first season, about the murder of a high school girl outside Baltimore, became the most popular and celebrated in the form’s brief history — and the Netflix series “Making a Murderer” make it pretty certain that more programs like these will come down the pike. These are not true-crime, tabloid TV shows or angry radio rants, but well-researched, intelligently presented examples of their respective genres. They’re the latest addition to a tradition of cinematic documentaries — including Errol Morris’s “The Thin Blue Line” and Amy Berg’s “West of Memphis” — that look at dodgy cases and that sometimes lead to convictions being overturned. They’re both also highly binge-able examples of serialized nonfiction, enjoying longer periods of buzz than a stand-alone documentary film might.
But while these are not exactly cases of “trial by media,” they are examples of popular entertainment altering decisions by the justice system, which is supposed to be impervious to these kinds of pressures.
Some involved with the cases, including victims and their families, have expressed frustration with the media exposure. But some critics have questioned the larger effect of this trend, too. In a thoughtful New Yorker piece, for example, Kathryn Schulz traces the history of efforts to overturn unjust convictions, but cautions that “we still have not thought seriously about what it means when a private investigative project — bound by no rules of procedure, answerable to nothing but ratings, shaped only by the ethics and aptitude of its makers — comes to serve as our court of last resort.”
The effort to take another look at convicted criminals — especially those who may have not had the education or resources to adequately defend themselves — goes back at least as far as the late 1940s, when the detective writer (and former lawyer) Erle Stanley Gardner wrote a column for Argosy magazine called “The Court of Last Resort.” (It briefly became a television show in the ‘50s.)
But television historian Tim Brooks says that shows about catching criminals have been both more prevalent and more popular than those that seek to exonerate the wrongly convicted. “TV picked up on [the questionable conviction trend],” he writes via email, “only after it emerged in real life.”
And indeed, the last few decades have seen a social-justice movement to assist the wrongly convicted; the most famous example is probably The Innocence Project, a nonprofit founded in 1992 which often uses DNA evidence.
Public interest in these shows is part of a larger trend within politics, the legal system and popular culture, says Ingrid Eagly, a professor at the UCLA school of law.
“I think it’s running on a parallel track to things like Black Lives Matter, cop-watching and The Innocence Project,” she says. “There’s been a growing consensus that the system has gotten too criminalized. You see state legislatures decriminalizing some things, President Obama issuing clemencies. And a sense that it correlates with race and poverty.” (Steven Avery and nephew Brendan Dassey, the defendants in “Murderer,” are poor, barely educated whites; Adnan Syed, the convicted killer who is the subject of “Serial,” is a Muslim of Pakistani descent. A new book, by Syed’s family friend Rabia Chaudry, an attorney who connected Syed to the team at “Serial,” frames his difficulties with the legal system as growing in part out of his religion and ethnicity.)
Eagly, who shows her students the documentary “The Staircase” — which led to a new trial for a writer accused and convicted of murdering his wife — thinks these programs perform a public service. “The public, until recently, hasn’t been as involved in criminal courts as they could be,” she says. “Criminal trials are supposed to be public, and the community — friends and family — are supposed to be involved.” But in the real world, the vast majority of cases — close to 97 percent of federal trials, and almost as many state trials, she says — end with a guilty plea before they ever go to trial. In some cases, these pleas come from innocent people pressured into confession as a way of avoiding lengthy sentences.
Part of what’s surprising, given public sentiment that celebrates “Serial” and “Murderer” for their ability to advocate for the wrongly convicted, is that the creators of these programs don’t see themselves as crusaders.
Julie Snyder, executive producer of “Serial,” says she and host Sarah Koenig were attracted to Syed’s case not out of an effort to right a wrong, but to take a close look at a complicated process. “It wasn’t completely clear what had happened,” Snyder says. “This was a case where we had a lot of questions, where there were a lot of layers.” It didn’t seem to her to be as cut and dried as the usual wrongful-conviction case. “It became clear that this was a weird case — there was no obvious bias or prosecutorial misconduct. We’re drawn to stories with ambiguity, because it exists in real life.”
In fact, despite the fact that Syed’s murder conviction has been vacated — made legally void — Koenig confessed her confusion with the case throughout “Serial,” up until the very end of the season. She never entirely takes sides.
The documentarians behind “Murderer” — currently chasing developments in the Avery and Dassey cases for a second season and not available for comment — have tried to maintain a similar sense of nuance and integrity. One episode of “Murderer” shows a Dateline NBC producer telling the camera, “Right now murder is hot … And we’re trying to beat out the other networks to get that perfect murder story.” The duo who made “Murderer” are trying — we’re supposed to come out of this thinking — for something more subtle.
“We do not consider this advocacy journalism in the least,” show creator Moira Demos said at the Television Critics Association meeting in January. “We are not taking sides. If anything, this is a social justice documentary. As we said before, we chose Steven Avery because we thought his experiences offered a window into the system. We don’t have a stake in his character, in his innocence or guilt. That was … not the question that we were raising.”
Her partner, Laura Ricciardi, beat back criticism about their goals similarly. “This is a documentary,” she said. “We’re documentary filmmakers, we’re not prosecutors, we’re not defense attorneys. We did not set out to convict or exonerate anyone. We set out to examine the criminal justice system and how it’s functioning today.”
Still, “Making a Murderer” is seen by some close to the case, according to The New Yorker’s Schulz, as “less like investigative journalism than like highbrow vigilante justice.”
The real danger, says Hollywood Reporter television critic Daniel Fienberg, is that a show or podcast can become so persuasive that audiences think they’re getting the entire story — the objective truth — behind a murder and the ensuing trials. “It’s wrong to think of any of these shows as being unmediated texts,” he says.
“In a better world we’d see investigative reporting doing this. We don’t live in a world like that any more, but we do live in a world where people do this as documentary makers. And documentary has a different set of criteria than journalism. These involve storytellers telling stories — so it will probably lean to the most sensational telling. And the more that we have success with these, we’ll get a lot of bad ones.”
In a better world, too, documentaries in various genres could look at the systemic flaws in the justice system instead of isolated cases here and there. But the human craving for narrative and characters may make it difficult to attract an audience for more sweeping coverage.
Snyder, of “Serial,” points out that these shows are not acting as judge and jury, but rather urging the legal system to do its job when it’s falling down on it. “I feel gratified because we’re causing the system to take another look,” she says — because of serious legal flaws, not mob justice. “I don’t think these cases are getting vacated convictions because of a high Netflix rating.”
Ryan Lochte, Donald Trump and the steep decline of American democracy
Donald Trump; Ryan Lochte (Credit: Reuters/Chris Keane/Mike Segar/Photo montage by Salon)
One of the more embarrassing aspects of the Olympics is the quadrennial effort by journalists to mine cultural meaning from an amorphous and heterogeneous spectacle that, by its very nature, can be used to make any point or support any argument. To paraphrase a wonderful poem by the neglected British humorist Edward Lear, the Olympics contain all the morals that ever there were, and set an example as well. Want to argue that the Olympics are a jingoistic war-substitute? A litmus test for feminist progress? An overproduced schmaltz-fest? A hidden tale of backroom injections and falsified test results? Come on in, the water’s fine.
David Brooks of the New York Times, perhaps our most esteemed practitioner of pseudo-profound cultural inquiry, offered a sterling example this week with a column wondering why so many young Americans were so pessimistic about our democracy. After all, American athletes are doing so well at the Olympics! To be fair, Brooks moves on from that brain-boggling non sequitur to a somewhat sequential argument that America is good at other things too, and the purported mood of national pessimism is therefore unmerited. (I’m pretty sure he has heard of the Soviet Union, and is aware that athletic achievement has no obvious correlation with democracy or capitalism.)
Some of Brooks’ tossed-off examples are debatable at best, as when he argues that the notoriously erratic Food and Drug Administration “is the benchmark for medical standards.” But that’s not the point. Brooks never mentions Donald Trump, the unquestioned avatar of American pessimism, although I would say the entire column was written in the long, dark Trumpian shadow. He mentions swimmer Ryan Lochte only in passing, as an example of the “amazingly American stories and personality types” found at the Olympics.
Trump and Lochte are amazingly American, no doubt. Taken together, they effectively provide the answer to Brooks’ incoherent question. The disgraced American swimmer and the disgraced American candidate found themselves in similar predicaments this week. It’s a marriage made in hell, or at least in purgatory: These two clowns epitomize the disordered state of the American psyche, circa 2016, almost too perfectly.
Lochte and Trump are a pair of arrogant, ignorant jerks who believe that neither conduct nor character actually matters, and who feel entitled to rescue themselves from any sticky situation through the strategic application of lies and money. They’re the white men who give white men a bad name (which is, of course, deeply unfair). They’d almost be comical, if one of them weren’t endangering the future of the republic and if they weren’t working so hard to reinforce the entire world’s negative stereotypes about Americans. (Which are, of course, deeply unfair … no, I’m sorry, I can’t say it with a straight face.)
At this point it has become a truism to describe Trump as a candidate molded by reality TV, but that doesn’t make it untrue. Lochte has also played himself on television, with considerably less success. Beyond that, both men are fundamentally creatures of the digital age, shaped by an Internet culture of pixel-thin celebrity, instantaneous “hot takes” and moronic or hateful behavior with virtually no consequences.
Lochte was compelled to hire a “crisis P.R. expert” named Matthew Hiltzik, who has previously worked for Alec Baldwin and Justin Bieber, after allegations surfaced that Lochte had vandalized a gas-station restroom in Rio de Janeiro and then invented a story about being robbed by police officers. The man now known to Twitter as “Swim Shady” — a reference to his resemblance to the rapper Eminem — subsequently issued an apology on his Instagram account, without making it entirely clear what he did to be sorry about. He should have been “more careful and candid in how [he] described the events of that early morning,” Lochte wrote. (Or, more likely, his well-compensated mouthpiece wrote.) Doesn’t that imply that some degree of care and candidness was involved in his apparently fraudulent account of a nonexistent crime? Regrettably, it was less than an optimal amount.
I feel confident that at some point in his life Donald Trump has smashed up a bathroom, but that wasn’t his issue this week. The Republican presidential nominee — that phrase still sounds hilariously unlikely, long after the fact — was compelled to hire an entire new campaign team. (Is this the third reboot, or the fourth?) It includes a so-called campaign CEO named Stephen Bannon, who is also the head of Breitbart News, a media organization that specializes in lightly sanitized versions of extreme right-wing conspiracy theories. The Southern Poverty Law Center has described Breitbart as “the media arm of the Alt-Right,” meaning the loose network of racist, paranoid anti-government fanatics found on the farthest fringes of conservative politics. Former Breitbart reporter Jordan Schachtel, who quit the site in March, has called it “an unaffiliated media Super PAC for the Trump campaign.”
Trump’s latest campaign shuffle happened because — do I really need to list the reasons? Who can keep track? His poll numbers are spiraling toward the drain, numerous prominent Republicans have either deserted him or resorted to pretending he doesn’t exist, and every time he opens his mouth he alienates someone new. Although Trump’s feud with Khizr and Ghazala Khan, the Muslim immigrant parents of a U.S. Army officer killed in Iraq, has faded from the headlines, that may have marked his political point of no return. It was his cognate to Mitt Romney’s “47 percent” recording four years ago, except juiced up with extra hate and testosterone.
As for alleged misdeeds, the list is long, but those have never seemed to damage Trump during his presidential campaign. This is the candidate who boasted, early in the year, that he could murder someone in broad daylight without affecting his poll numbers; he serves as an inspiration and role model to the Ryan Lochtes of the world. Interestingly, Paul Manafort, the veteran Washington operative who just resigned as Trump’s campaign chair on Friday, did not possess the same immunity. In the wake of Trump’s ambiguous or admiring comments about Vladimir Putin, and his suggestion that Russian hackers might release Hillary Clinton’s State Department emails, Manafort’s apparent financial ties to the ousted Ukrainian president widely perceived as a Putin stooge became toxic.
I’m not suggesting that anyone should expend a moment’s sympathy on Manafort, whose political slime trail extends several decades into the past and who, at the very least, seems to have repeatedly mischaracterized the nature of his extensive consulting and lobbying work on behalf of the former Ukrainian regime. But from here, it looks as if Manafort was fired from the Trump campaign not because he took millions of dollars from a sketchy foreign autocrat but because he got caught, and became an embarrassment. He was unable to skate past the alleged wrongdoing, in Trump-Lochte fashion, and so diverted attention from the Great Pumpkin-like glory of his boss.
Trump also apologized for something this week during a speech in North Carolina, which felt weird to everyone watching and can only have been deeply uncomfortable for his supporters. The entire point of Donald Trump (to the extent there is one) is that he’s a guy with no filter of civility or decency or “political correctness,” who will say whatever damn thing comes into his head and shrug it off afterwards. His repeated charge that Barack Obama was the founder of ISIS was first presented as factual, then as sarcastic and finally as “sarcastic, but not that sarcastic,” a rare instance of idiocy colliding with Solomonic ingenuity.
Trump’s reference point for his apology was, if possible, even vaguer than in Lochte’s case, not to mention deflected into deep space by the use of the second person. “Sometimes in the heat of debate and speaking on a multitude of issues,” he said, “you don’t choose the right words or you say the wrong thing. I have done that. And, believe it or not, I regret it.” That “believe it or not” is what’s called a “tell” in poker. Or maybe it’s just an Internet formulation. Many of us have clicked on headlines assuring us that we won’t believe what outrageous secret was about to be revealed, only to discover that it either wasn’t a secret or we didn’t believe it.
For Donald Trump or Ryan Lochte to believe in something, or to express genuine regret, would require some conception of the world outside their enormous egos, and also some conception of a moral code that ought not to be transgressed. Their instrumental and cynical understanding of politics and celebrity and sport and everything else — the worldview behind the fake apology that never addresses the misdeed, or seeks to remedy the harm — is certainly not new, and not yet ubiquitous. How far has it spread, and how much damage has it done?
I’m not especially troubled by Brooks’ statistic suggesting that one-quarter of younger Americans say they believe democracy is either a “fairly bad” or “very bad” political system. You can argue that those people are wrong, but on empirical grounds that’s not an inherently irrational belief. When one of our major political parties nominates someone who transparently doesn’t believe in democracy, or at any rate has no idea how it works — and when at least 40 percent of the public plans to vote for him — we might have a problem.
Will we discover a twin Earth? 20 big questions about the future of humanity
(Credit: Kyle Hilton, Scientific American)
This article was originally published by Scientific American.
1. Does humanity have a future beyond Earth?
“I think it’s a dangerous delusion to envisage mass emigration from Earth. There’s nowhere else in the solar system that’s as comfortable as even the top of Everest or the South Pole. We must address the world’s problems here. Nevertheless, I’d guess that by the next century, there will be groups of privately funded adventurers living on Mars and thereafter perhaps elsewhere in the solar system. We should surely wish these pioneer settlers good luck in using all the cyborg techniques and biotech to adapt to alien environments. Within a few centuries they will have become a new species: the post-human era will have begun. Travel beyond the solar system is an enterprise for post-humans — organic or inorganic.”
—Martin Rees, British cosmologist and astrophysicist
2. When and where do you think we will find extraterrestrial life?
“If there is abundant microbial life on Mars, I suspect that we will find it within 20 years — if it is enough like our form of life. If an alien life-form differs much from what we have here on Earth, it is going to be difficult to detect. It’s also possible that any surviving Martian microbes are rare and located in places that are difficult for a robotic lander to reach. Jupiter’s moon Europa and Saturn’s moon Titan are more compelling places. Europa is a water world where more complex forms of life may have evolved. And Titan is probably the most interesting place in the solar system to look for life. It is rich in organic molecules but very cold and has no liquid water; if life exists on Titan, it will be very different from life on Earth.”
—Carol E. Cleland, philosophy professor and co-investigator in the Center for Astrobiology at the University of Colorado Boulder
3. Will we ever understand the nature of consciousness?
“Some philosophers, mystics and other confabulatores nocturni pontificate about the impossibility of ever understanding the true nature of consciousness, of subjectivity. Yet there is little rationale for buying into such defeatist talk and every reason to look forward to the day, not that far off, when science will come to a naturalized, quantitative and predictive understanding of consciousness and its place in the universe.”
—Christof Koch, president and CSO at the Allen Institute for Brain Science; member of the Scientific American Board of Advisers
4. Will the entire world one day have adequate health care?
“The global community has made tremendous progress toward health equity over the past 25 years, but these advances have not reached the world’s most remote communities. Deep in the rain forest, where people are cut off from transportation and cellular networks, mortality is the highest, access to health care is the most limited and quality of care is the worst. The World Health Organization estimates that one billion people go their entire lives without seeing a health worker because of distance. Health workers recruited directly from the communities they serve can bridge the gap. They can even fight epidemics such as Ebola and maintain access to primary care when health facilities are forced to shut their doors. My organization, Last Mile Health, now deploys more than 300 health workers in 300 communities across nine districts in partnership with the government of Liberia. But we can’t do this work alone. If the global community is serious about ensuring access to health care for all, it must invest in health workers who can reach the most remote communities.”
—Raj Panjabi, co-founder and chief executive at Last Mile Health and instructor at Harvard Medical School
5. Will brain science change criminal law?
“In all likelihood, the brain is a causal machine, in the sense that it goes from state to state as a function of antecedent conditions. The implications of this for criminal law are absolutely nil. For one thing, all mammals and birds have circuitry for self-control, which is modified through reinforcement learning (being rewarded for making good choices), especially in a social context. Criminal law is also about public safety and welfare. Even if we could identify circuitry unique to serial child rapists, for example, they could not just be allowed to go free, because they would be apt to repeat. Were we to conclude, regarding, say, Boston priest John Geoghan, who molested some 130 children, ‘It’s not his fault he has that brain, so let him go home,’ the result would undoubtedly be vigilante justice. And when rough justice takes the place of a criminal justice system rooted in years of making fair-minded law, things get very ugly very quickly.”
—Patricia Churchland, professor of philosophy and neuroscience at the University of California, San Diego
6. What is the chance Homo sapiens will survive for the next 500 years?
“I would say that the odds are good for our survival. Even the big threats — nuclear warfare or an ecological catastrophe, perhaps following from climate change — aren’t existential in the sense that they would wipe us out entirely. And the current bugaboo, in which our electronic progeny exceed us and decide they can live without us, can be avoided by unplugging them.”
—Carlton Caves, Distinguished Professor in physics and astronomy at the University of New Mexico
7. Are we any closer to preventing nuclear holocaust?
“Since 9/11 the United States has had a major policy focus on reducing the danger of nuclear terrorism by increasing the security of highly enriched uranium and plutonium and removing them from as many locations as possible. A nuclear terrorist event could kill 100,000 people. Twenty-five years after the end of the cold war, however, the larger danger of a nuclear holocaust involving thousands of nuclear explosions and tens to hundreds of millions of immediate deaths still persists in the U.S.-Russia nuclear confrontation.
Remembering Pearl Harbor, the United States has postured its nuclear forces for the possibility of a bolt-out-of-the-blue first strike in which the Soviet Union would try to destroy all the U.S. forces that were targetable. We don’t expect such an attack today, but each side still keeps intercontinental and submarine-launched ballistic missiles carrying about 1,000 warheads in a launch-on-warning posture. Because the flight time of a ballistic missile is only 15 to 30 minutes, decisions that could result in hundreds of millions of deaths would have to be made within minutes. This creates a significant possibility of an accidental nuclear war or even hackers causing launches.
The United States does not need this posture to maintain deterrence, because it has about 800 warheads on untargetable submarines at sea at any time. If there is a nuclear war, however, U.S. Strategic Command and Russia’s Strategic Missile Forces want to be able to use their vulnerable land-based missiles before they can be destroyed. So the cold war may be over, but the Doomsday Machine that came out of the confrontation with the Soviets is still with us — and on a hair trigger.”
—Frank von Hippel, emeritus professor at the Woodrow Wilson School of Public and International Affairs at Princeton University and co-founder of Princeton’s Program on Science and Global Security
8. Will sex become obsolescent?
“No, but having sex to conceive babies is likely to become at least much less common. In 20 to 40 years we’ll be able to derive eggs and sperm from stem cells, probably the parents’ skin cells. This will allow easy preimplantation genetic diagnosis on a large number of embryos — or easy genome modification for those who want edited embryos instead of just selected ones.”
—Henry Greely, director of the Center for Law and the Biosciences at Stanford University
9. Could we one day replace all of the tissues in the human body through engineering?
“In 1995 Joseph Vacanti and I wrote for this magazine about advances in artificial pancreas technology, plastic-based tissues such as artificial skin and electronics that might permit blind people to see [see ‘Artificial Organs,’ by Robert Langer and Joseph P. Vacanti; Scientific American, September 1995]. All of these are coming to pass, either as real products or in clinical trials. Over the next few centuries it is quite possible that nearly every tissue in the body may be able to be replaced by such approaches. Creating or regenerating tissues such as those found in the brain, which is extremely complex and poorly understood, will take an enormous amount of research. The hope is, however, that research in this area will happen quickly enough to help with brain diseases such as Parkinson’s and Alzheimer’s.”
—Robert Langer, David H. Koch Institute Professor at the Massachusetts Institute of Technology
10. Can we avoid a “sixth extinction”?
“It can be slowed, then halted, if we take quick action. The greatest cause of species extinction is loss of habitat. That is why I’ve stressed an assembled global reserve occupying half the land and half the sea, as necessary, and in my book ‘Half-Earth,’ I show how it can be done. With this initiative (and the development of a far better species-level ecosystem science than the one we have now), it will also be necessary to discover and characterize the 10 million or so species estimated to remain; we’ve only found and named two million to date. Overall, an extension of environmental science to include the living world should be, and I believe will be, a major initiative of science during the remainder of this century.”
—Edward O. Wilson, University Research Professor emeritus at Harvard University
11. Can we feed the planet without destroying it?
“Yes. Here’s what we need to do: reduce crop waste, consumer waste and meat consumption; integrate appropriate seed technologies and management practices; engage consumers about the challenges farmers face in both the developed and the developing world; increase public funding for agricultural research and development; and focus on advancing the socioeconomic and environmental aspects of farming that characterize sustainable agriculture.”
—Pamela Ronald, professor emerita in the Genome Center and the department of plant pathology at the University of California, Davis
12. Will we ever colonize outer space?
“That depends on the definition of ‘colonize.’ If landing robots qualifies, then we’ve already done it. If it means sending microbes from Earth and having them persist and maybe grow, then, unfortunately, it’s not unlikely that we’ve done that as well — possibly on Mars with the Phoenix spacecraft and almost certainly inside the Curiosity rover, which carries a heat source and was not fully baked the way Viking had been.
If it means having humans live elsewhere for a longer period of time, but not reproduce, then that’s something that might happen within the next 50 years or so. (Even some limited degree of reproduction might be feasible, recognizing that primates will be primates.) But if the idea is to construct a self-sustaining environment where humans can persist indefinitely with only modest help from Earth — the working definition of a ‘colony,’ according to the various European colonies outside of Europe — then I’d say this is very far in the future, if it’s possible at all. We currently have a very inadequate understanding of how to build closed ecosystems that are robust to perturbation by introduced organisms or nonbiological events (Biosphere 2, for example), and I suspect that the contained ecosystem problem will turn out to be much more challenging than the vast majority of space colonization advocates realize. There are a wide range of technical problems to solve, another being air handling. We haven’t bothered to colonize areas underwater on Earth yet. It’s far more challenging to colonize a place where there’s hardly any atmosphere at all.”
—Catharine A. Conley, NASA planetary protection officer
13. Will we discover a twin Earth?
“My money’s on yes. We’ve found that planets around other stars are far more abundant and diverse than scientists imagined just a couple of decades ago. And we’ve also found that the crucial ingredient for life on this planet — water — is common in space. I’d say nature seems to have stacked the deck in favor of a wide range of planets, including Earth-like planets. We just have to look for them.”
—Aki Roberge, research astrophysicist focusing on exoplanets at NASA Goddard Space Flight Center
14. Will there ever be a cure for Alzheimer’s?
“I am not sure if there will be a cure, per se, but I am very hopeful that there will be a successful disease-modifying therapy for Alzheimer’s disease within the next decade. We have now started prevention trials that are testing biological interventions even before people show clinical symptoms of the disease. And we don’t have to cure Alzheimer’s — we just need to delay dementia by five to 10 years. Estimates show that a five-year delay in the terrible and expensive dementia stage of the disease would reduce Medicare dementia costs by nearly 50 percent. Most important, that would mean that many older people could die while out ballroom dancing rather than in nursing homes.”
—Reisa Sperling, professor of neurology at Harvard Medical School and director of the Center for Alzheimer Research and Treatment
15. Will we use wearable technologies to detect our emotions?
“Emotions involve biochemical and electrical signals that reach every organ in our bodies — allowing, for example, stress to impact our physical and mental health. Wearable technologies let us quantify the patterns in these signals over long periods of time. In the coming decade wearables will enable the equivalent of personalized weather forecasts for our health: 80 percent increased probability in health and happiness for you next week, based on your recent stress/sleep/social-emotional activities. Unlike with weather, however, smart wearables can also identify patterns we might choose to change to reduce unwanted ‘storm’ events: Increase sleep to greater than or equal to nine hours per night and maintain current low-moderate stress, for a 60 percent reduced likelihood of seizure in the next four days. Over the next 20 years, wearables, and analytics derived from them, can dramatically reduce psychiatric and neurological disease.”
—Rosalind Picard, founder and director of the Affective Computing research group at the M.I.T. Media Lab
16. Will we ever figure out what dark matter is?
“Whether we can determine what dark matter is depends on what it turns out to be. Some forms of dark matter allow detection through small interactions with ordinary matter that have so far evaded detection. Others might be detectable through their influence on structures such as galaxies. I’m hopeful we will learn more through experiments or observations. But it’s not guaranteed.”
—Lisa Randall, Frank B. Baird, Jr., professor of science in theoretical physics and cosmology at Harvard University
17. Will we get control of intractable brain diseases like schizophrenia or autism?
“Diseases like autism and schizophrenia remain elusive because neuroscience hasn’t found a structural problem to fix. Some interpret this to mean future answers lie purely in biochemistry, not neural circuits. Others argue the key is for the neuroscientist to start to think in terms of overall brain architecture — not specific neural failures. Still, when thinking about the future, I am reminded of the Nobelist Charles Townes’s remark that the wonderful thing about a new idea is that you don’t know about it.”
—Michael Gazzaniga, director of the SAGE Center for the Study of the Mind at the University of California, Santa Barbara
18. Will technology eliminate the need for animal testing in drug development?
“If human organs on chips can be shown to be robust and consistently recapitulate complex human organ physiology and disease phenotypes in unrelated laboratories around the world, as suggested by early proof-of-concept studies, then we will see them progressively replace one animal model at a time. That will eventually lead to significant reductions in use of animal testing. Importantly, these devices also will open up new approaches to drug development not possible with animal models today, such as personalized medicines and development of therapeutics for specific genetic subpopulations using chips created using cells from particular patients.”
—Donald E. Ingber, founding director, Wyss Institute for Biologically Inspired Engineering at Harvard University
19. Will gender equality be achieved in the sciences?
“Gender equality can be achieved, but we can’t just sit back and wait for it to happen. We need to ‘fix the numbers’ by recruiting more women into science and technology. We need to fix the institutions by implementing dual-career hiring, family-friendly policies, and new visions of what it means to be a leader. And, most importantly, we need to fix the knowledge by harnessing the creative power of gender analysis for discovery and innovation.”
—Londa Schiebinger, John L. Hinds Professor of History of Science at Stanford University
20. Do you think we will one day be able to predict natural disasters such as earthquakes with warning times of days or hours?
“Some natural disasters are easier to see coming than others. Hurricanes approach over days, volcanoes often build up to an eruption over days to hours, tornadoes strike within a few minutes. Earthquakes are perhaps the greatest challenge. What we know about the physics of earthquakes suggests that we will not be able to predict earthquakes days in advance. But what we can do is predict the damaging ground shaking just before it arrives and provide seconds to minutes of warning. Not enough time to get out of town, but enough time to get to a safe location.”
—Richard M. Allen, director, Berkeley Seismological Laboratory, University of California, Berkeley
Enhancing voter turnout: Voting laws negatively affect the most vulnerable Americans
Voting booths in St. Joris Weert, Belgium, June 13, 2010. (Credit: AP/Geert Vanden Wijngaert)
This piece originally appeared on BillMoyers.com.
American democracy has been on a roll. Over the past few weeks, courts across the country have affirmed that we are better when everyone participates and each voter’s voice is heard.
Voter-ID laws, designed to suppress the vote, have been taking a beating. In Texas, North Carolina and North Dakota, courts struck down or modified voter-ID laws that make it harder for low-income, minority and student voters to cast their constitutionally protected ballots. A similar victory in Wisconsin is in limbo after a federal appeals court stayed the lower-court ruling pending appeal. With these decisions, the courts — including one of the most conservative appellate circuits in the land — announced that vote suppression tactics limiting Americans’ right to vote won’t be tolerated. It’s a win for voters and a win for democracy.
But pernicious laws aren’t the only things standing in the way of a robust and active democracy. Anyone who takes a minute to reflect on our election season so far would do well to ask, what more needs to be done? Turnout for presidential elections barely pushes 60 percent on a good day — see President Obama’s first election in 2008 — and turnout for primaries pales in comparison. This year, 28.5 percent of eligible citizens voted in presidential primaries. While that figure seems shockingly low, it’s on the higher end of the spectrum; participation in the 2008 primaries reached a record level of 30.4 percent.
And, believe it or not, primary season isn’t over. This August, when many families are hitting the road or the beach for a last summer fling before school starts, 12 states are holding primary elections to pick nominees for fall House and Senate contests. On Tuesday alone, voters are going to the polls to nominate congressional candidates in South Dakota, Wyoming and Alaska. The outcomes of these races are important — just ask Barack Obama and his predecessor, George W. Bush, what a difference a Congress can make. Many of the states where voters are being asked to go to the polls to choose members of Congress already held presidential primaries earlier this year.
In a country where participation in primaries is consistently so low that it landed the United States in 138th place out of 169 democracies in an international evaluation of turnout, one has to wonder whether voter confusion is one of the factors involved. Consider:
In New York’s 1st Congressional District, 45,636 Democratic voters turned out to cast ballots in the April presidential primary. For June’s competitive congressional Democratic primary, the figure was 12,641. The drop-off was similarly abysmal in the state’s 22nd Congressional District among Republicans: 67,505 voted in the presidential primary. But in a hotly contested June GOP congressional primary, just 23,250 Republican voters turned out. (And New Yorkers might be surprised to learn that they have yet another primary ballot to cast, on Sept. 13, for state and local candidates).
In Missouri’s 1st Congressional District, which includes the city of St. Louis and much of St. Louis County, 147,597 Democrats voted in the March presidential primary. For last week’s three-way Democratic primary for the U.S. House seat, the number of Democrats voting was 89,182. In Missouri’s heavily Republican 4th Congressional District, 130,638 Republicans cast ballots in the presidential primary, compared to 101,888 for the district’s three-way GOP congressional primary.
Much needs to be done on voter registration reform; states with same-day registration have significantly higher turnout rates in both general and primary elections, and a new reform — automatic voter registration — has the potential to raise those numbers further.
Full restoration of the protections of the Voting Rights Act, earlier gutted by the Supreme Court’s decision in Shelby Co. v. Holder, would prevent additional suppressive laws from hitting state books; the Voting Rights Advancement Act before Congress would ensure no voter’s ballot could be blocked by discrimination. These fixes aside, there’s more we should do to address how the primary system itself ensures all eligible Americans come together to select each party’s nominee.
In light of this year’s patterns during the primaries, three problems requiring solutions stand out:
1. Everyone should participate, but caucuses discourage that.
If you want people to stay home, throw a caucus. When voting at a caucus, typically you have to spend a few hours checking in, listening to speeches and then aligning with one candidate over another. Given this chunk of time, parents with child care responsibilities and workers pulling down late shifts often miss out on the chance to cast their votes. Moreover, not every state’s caucus ensures a secret ballot; some require that individuals physically gather around a candidate while being counted. Indeed, the rules can feel a bit “inside baseball.” As a result, caucuses typically attract political insiders over everyday Americans, leading to polarizing elections instead of down-the-middle selections.
During this past election season, caucus states had some of the lowest turnout compared with primary states. Aside from Iowa, which has traditionally held the nation’s first caucus in a presidential election year, and Idaho, whose unaffiliated voters can register with a party on Election Day, the remaining nine states with caucuses saw turnout under 15 percent — with at least half the states in the single digits. For their Democratic Party caucus, North Dakota’s voters turned out at an abysmal rate of 0.7 percent. With a system like this, only a privileged few decide for the many whose name should top the ticket.
2. Voting should be accessible and secure, but early registration deadlines are an obstacle.
In some states — Minnesota, for example — an eligible citizen can both register and vote on the date of the primary, thereby permitting those who aren’t as politically involved to still choose a nominee. Most states, though, don’t permit that option, and most impose deadlines by which a registered voter must change his or her affiliation in order to vote a different ticket in the primary. New York state, taking that rule to the extreme, requires an individual to make such a change 193 days before its April primary, so it’s no surprise that turnout in New York hovered at around 20 percent. Few people pay attention to the election that far out, and fewer still have chosen a candidate by the deadline. This is especially troubling for the growing number of Americans who identify as independent, aligning themselves with a candidate rather than a party.
3. Our current primary schedule consistently benefits some voters to the detriment of others.
Not surprisingly, New Hampshire, the second state to vote (right after Iowa’s caucuses) and home of the first primary of the season, boasted the highest voter turnout at 53 percent — a figure well above most other states, which hover between 20 and 30 percent turnout. Because they hold the first primary and caucus respectively, New Hampshire and Iowa get the most attention from the media and candidates alike, and it’s reflected in the turnout stats. The old saying may be, “as New Hampshire goes, so goes the nation,” but a state with under 1.5 million residents, 94 percent of whom are white, does not represent this country as a whole. New Hampshire’s golden spot during the presidential elections gives it an unfair advantage in candidate selection over the rest of the states — and the rest of Americans.
When it comes to selecting elected officials, Americans pay the most attention during presidential election years. States know and capitalize on this; most keep moving primary and caucus dates up in a race to get to the top of the pack — garnering media and candidate attention, not to mention campaign dollars. A more appropriate spacing, perhaps even a rotating schedule among the states, would also make it possible to schedule down-ballot elections for the same date — ensuring that more citizens vote for the offices that impact them most.
Our current primary system gets berated each election season, with good reason. It doesn’t work well. If we want to truly raise our turnout rates, and ensure that all eligible Americans are participating, we’ve got to come up with solutions. Otherwise, come the presidential election, 60 percent turnout will still look pretty good.
The rise of irreligion is the GOP’s real demographic crisis
Ted Cruz (Credit: Reuters/Jonathan Ernst)
In the past several years, many trees have been felled and pixels electrocuted in the service of discussion about the impact of Hispanics on the American electorate. No one knows for sure which way they’ll vote in the future but everyone is interested in discussing it. Curiously, though, an even larger political shift is taking place yet receiving almost no attention whatsoever from political reporters — the emergence of post-Christian America.
Judging solely from the rhetoric and actions of the candidates who sought the Republican Party’s presidential nomination this year, you would be hard-pressed to tell much difference between 2016 and 1996, the year that the Christian Coalition was ruling the roost in GOP politics. Sure there was a lot more talk about the Middle East than before, but when it comes to public displays of religiosity, many of the would-be presidents have spent the majority of their candidacies effectively auditioning for slots on the Trinity Broadcast Network.
Even Donald Trump, the thrice-married casino magnate turned television host, went about reincarnating himself as a devout Christian, despite his evident lack of familiarity with the doctrines and practices of the faith.
Former Arkansas Governor Mike Huckabee and former Pennsylvania Senator Rick Santorum, both of whom won Iowa in past years, dropped out after failing dismally in the Hawkeye State’s caucuses. Louisiana Governor Bobby Jindal quit months before even a single vote had been cast. Texas Senator Ted Cruz, despite being significantly better financed and supported by more conservative leaders than previous Christian nationalist candidates, was barely able to win any primary states at all; his main strength was in caucus states where popular appeal wasn’t as important.
But Cruz’s difficulties were no different from those faced by previous Christian Right presidential candidates; none has ever even gotten close to the nomination. Cruz’s failure to win the nomination even though GOP voters knew an orange buffoon would get it instead is a perfect window into trends that will set the pace of American politics for decades to come: Americans are moving away from Christianity, including the people most likely to vote Republican.
While the process of secularization has been slower-moving in the U.S. compared to Europe, it is now proceeding rapidly. A 2014 study by Pew Research found that 23 percent of Americans say they’re “unaffiliated” with any religious tradition, up from 20 percent just three years earlier. The Public Religion Research Institute confirmed the statistic as well with a 2014 poll based on 50,000 interviews indicating that 23 percent of respondents were unaffiliated.
The trend away from faith is only bound to increase with time. According to Pew, about 36 percent of adults under the age of 50 have opted out of religion. At present, claiming no faith is the fastest growing “religion” in the United States. Between 2007 and 2012, the share of people claiming “nothing in particular” increased by 2.3 percentage points, those saying they were agnostics by 1.2 points and those claiming to be atheists by 0.8 points. No actual religious group has experienced anywhere near such growth during this time period.
Looked at over the longer term, the trend is even more discernible. In 1972, just 5.1 percent of Americans said they had no religious affiliation, according to the University of Chicago’s General Social Survey. In 2014, that number was 20.7 percent, a more than fourfold increase.
To put that growth in perspective, consider that Hispanics were 4.5 percent of the U.S. population in 1970 (according to the Census Bureau) and 16.9 percent by 2012 (according to GSS). Despite receiving almost no attention whatsoever, people with no religion are both more numerous and increasing their numbers at a faster pace than people of Hispanic descent. (Unfortunately GSS did not measure Hispanic origin until 2000 so the comparison isn’t completely perfect.)
While those statistics on the growth of religiously unaffiliated Americans ought to be impressive enough to warrant serious discussion, the reality is that public polling almost certainly underestimates the numbers of the faithless because many religious Americans have strongly negative opinions of those who are atheists or agnostics. This negativity makes non-believers less willing to publicly admit to their opinions.
A 2014 study by the Public Religion Research Institute found that people of all races and religious creeds (or lack thereof) were more likely to claim they attended church services in a telephone survey than they were during a self-administered web survey where their opinions would not be solicited by a person in conversation.
According to the research, religiously unaffiliated people were 18 percent more likely to say they attended church services on the phone than they were online. Americans in general were 13 percent more likely to give the religiously correct answer in a phone survey.
The Religiously Unaffiliated Are More Than Unchurched
Among the few conservatives who have actually responded to this momentous demographic development, the typical response has been to claim that this large group of non-believers is simply “unchurched.” Over time, the argument goes, these people will return to the sanctuary and back into the Grand Old Party. The argument might be a comfortable one to conservatives of faith, but it is not supported by the facts.
When asked to identify their specific beliefs about the nature of God in the 2008 American Religious Identification Survey (ARIS), 7 percent of respondents with no religious affiliation advocated an atheist perspective, 35 percent were agnostics, and 24 percent were deists. Just 27 percent of respondents said they “definitely” believed in a personal God. In a private online survey conducted by the Public Religion Research Institute, just 19 percent of the religiously unaffiliated agreed with the statement that “God is a person.” Forty-three percent of respondents said they did not believe in God, while 35 percent said that they believed God is an impersonal force.
While some of those who are unaffiliated do profess a belief in God, a huge majority of those with no religion appear utterly uninterested in joining up with any particular faith tradition. A full 88 percent told Pew they were “not interested.” That is likely because Americans of no faith have strong, negative viewpoints about religious organizations, overwhelmingly characterizing them as “too concerned with money and power, too focused on rules and too involved in politics.” Nearly half of these individuals describe themselves as neither spiritual nor religious.
While religiously unaffiliated people in days gone by might have been “unchurched,” this is no longer the case.
Fewer Christians Means Fewer Republicans
The implications of Americans' exodus from cultural Christianity are significant for the political right because the religiously unaffiliated appear to have a real preference for Democrats. In fact, aside from party identification, a person's religious perspective is generally the most accurate predictor of how he or she will vote.
It is this changing aspect of the electorate that will have more of an impact on the conservative movement’s future than any other demographic shift. Already, it has decimated Republican vote totals in many western states such as California, Montana, New Mexico and Colorado. True, California and New Mexico have substantial Hispanic populations but Montana does not and neither does Northern California, the furthest left region of the Golden State. The fact of the matter is that many white voters are abandoning faith and as they do, they are leaving the Republican Party as well. Many younger white voters are never even joining up with religion—and the Republican Party by extension. This demographic trend is creating what might be called the “Godless Gap,” a voting disparity that is particularly harmful to Republicans since Democrats have been much better at getting votes among Christians than the GOP has among the irreligious.
While secular people have favored Democrats for as long as the data goes back, the situation has actually become even worse in recent years for the GOP. Republicans have long trailed Democrats among non-religious Americans (hereafter called "Nones") but since the late 1990s, they have even been behind independents, according to the General Social Survey (GSS). Research conducted by ARIS also confirmed this overall trend, even though it did not ask people to indicate a party toward which they leaned.
In the 1990 ARIS study, 42 percent of respondents who claimed no religion said they were “independents,” 27 percent said they were Democrats, and 21 percent said they were Republicans. According to the 2008 poll (the most recent), 42 percent of people with no religious affiliation said they were “independents,” 34 percent said they were Democrats, and just 13 percent said they were Republicans.
A 2012 survey by Pew Research confirmed this trend as well. Asked about their voting preference during the previous presidential election cycle, people of no faith said they had voted in about the same proportion for Barack Obama as white evangelical Protestants did for John McCain. Pushed to identify their own partisanship, a full 63 percent said they favored Democrats. Just 26 percent said they leaned toward Republicans.
If partisanship and religious identification were truly independent of each other, this shift would not be nearly so pronounced: as the ranks of the non-religious grow, they ought to look more and more like the general population, as they do in non-political respects such as income, divorce rates and (to a lesser degree) racial composition. Yet that is not what happens when we examine their political preferences.
The likely reason why Republicans have declined in popularity among the non-religious is the GOP's long habit of identifying itself as a Christian party. The later attempt to add a "Judeo-" prefix has done little to stop the bleeding.
As increasing numbers of whites and Asians have chosen non-Christian religions or no faith tradition at all, they are also leaving the Republican Party. Some are joining up with Democrats, but many are choosing "none of the above," just as they have with religion. Much of this movement parallels patterns long observed among Jewish voters, who were much more inclined toward Republicans before Christian nationalism became a force within the party.
Republicans Probably Lost Young Adults Due to Decline of Faith
As has already been noted, people claiming “no religion” in surveys are much more likely to be young. As mentioned above, just over 30 percent of adults between the ages of 18 and 29 are Nones. But generational attrition—the gradual replacing of older religious people with younger secular ones—is not the only reason why the ranks of the Nones have expanded. People under 65 have also become more secular in recent years. As noted by Pew:
Generation Xers and Baby Boomers also have become more religiously unaffiliated in recent years. In 2012, 21 percent of Gen Xers and 15 percent of Baby Boomers describe themselves as religiously unaffiliated, up slightly (but by statistically significant margins) from 18 percent and 12 percent, respectively, since 2007.
Their lack of interest in religion is having an effect on the voting patterns of younger Americans. After winning voters ages 18-29 in the 1972, 1984 and 1988 presidential elections (Reagan lost them by one point in 1980), the best the Republican Party has done among this age group is a 47-47 tie in 2000. Even that was a hollow achievement, however, because 5 percent of the young voted for left-wing Green Party candidate Ralph Nader.
In 2004, with Nader no longer a factor, young voters broke for Democrat John Kerry 54 percent to 45 percent. In 2008, Democrat Barack Obama won 66 percent of their votes to John McCain’s meager 32 percent. In 2012, Obama did slightly worse among this age group (which is almost a given since he did so well the first time). He still overwhelmingly won their votes 60 percent to 37 percent, however.
The past shows that young people are not naturally knee-jerk Democratic voters, but clearly Republicans have been losing younger voters lately. Religious differences are almost certainly a factor. According to a 2014 poll commissioned by the American Bible Society, just 35 percent of adults between the ages of 18 and 29 believe the Bible "contains everything a person needs to know to live a meaningful life." The millennial generation is also much more skeptical about the role of the Bible within society. Just 30 percent of that age group surveyed said they thought the Bible had "too little influence" on Americans. By contrast, 26 percent said the Bible had "too much influence" on society.
The “Godless Gap” and the 2012 Election
Beyond the national trends, the increase in secularization has also had an effect in the different regions of the country where Nones are concentrated. As noted by the 2008 ARIS study, 20 percent of people living in California, Oregon and Washington were non-religious, 19 percent of people in the Mountain West were Nones, and a full 22 percent of individuals living in New England had no religious faith.
It is no coincidence that as non-belief has increased in these regions, the Republican Party's fortunes there have declined accordingly. The 2012 election provided many examples of how Republicans are losing elections thanks to the Godless Gap. In seven key states (Pennsylvania, Florida, Virginia, Wisconsin, Michigan, Iowa and New Hampshire), Mitt Romney won the majority of the Christian vote but ended up losing overall because he was defeated so soundly among non-Christians.
2012 Presidential Vote by State and Religious Belief
Source: Exit poll conducted by Edison Media Research
State           Protestant (Obama/Romney)   Catholic (Obama/Romney)   Unaffiliated (Obama/Romney)
Iowa            46 / 53                     47 / 52                   75 / 22
Florida         42 / 58                     47 / 52                   72 / 26
Pennsylvania    49 / 51                     49 / 50                   74 / 25
Virginia        45 / 54                     45 / 55                   78 / 22
Wisconsin       45 / 54                     44 / 56                   73 / 25
Michigan        48 / 51                     44 / 55                   n/a
New Hampshire   42 / 56                     47 / 53                   73 / 26
Even though Iowa is famous for its religiosity, Nones there were indisputably the margin of victory for Obama. According to exit polls, Romney won the votes of the 62 percent of Iowans who called themselves Protestants (53-46) and of the 26 percent who were Catholics (52-47), but he overwhelmingly lost the None vote 75 to 22 percent. With its overwhelmingly white population, Iowa was Romney's to lose. And he did, by doing so poorly among white voters with no religious affiliation. In the end, the former Massachusetts governor lost the Hawkeye State by less than 100,000 votes.
Non-Christians also put Obama over the top in Pennsylvania, a state which Romney’s top advisers believed was “really in play” right up until Election Day. And they were right—so long as one only looked at the vote of the Christian faithful (77 percent of the electorate). Romney actually managed to win both the Protestant and the Catholic votes quite narrowly, 51-49 and 50-49 respectively, but his tremendous loss among the 12 percent of Pennsylvanians who were not religious overwhelmed his share of the Christian vote. Because he lost the None vote 74-25, Romney ended up losing the state 52 to 47 percent.
The same thing happened in Florida as well, another state that Romney was counting on winning. He cleaned up among the 51 percent of Protestant voters (58-42), won the 23 percent Catholic vote (52-47) but ended up losing the 15 percent None vote 72 percent to 26 percent. He also sank among the non-Christian religious as well. In the end, Romney lost the state by just 74,309 votes. Had he done just a little better among non-Christians, Romney would have been able to put the Sunshine State into his column.
Virginia was another state that was Romney’s to win had he done better among non-Christians. According to exit polls, Romney captured small majorities among Protestants (54 percent to 45 percent) and Catholics (55 to 45) but was clobbered among non-Christian believers (78-22) and among those with no religion (76-22).
The None vote also cost Romney the state of Wisconsin. As with the other states examined above, Romney won the Catholic vote (56 percent to 44 percent) as well as the Protestant vote (53 to 45 percent) but lost so overwhelmingly among non-believers (73 to 25 percent) that he ended up losing the Badger State 53 to 46 percent.
The same thing happened in Romney’s native Michigan, where he won among Protestants 51 to 48 percent and among Catholics (55 to 44 percent) but lost so overwhelmingly among non-believers that he ended up losing the state 54 percent to 45 percent.
The former Massachusetts governor also lost New Hampshire despite winning the votes of both Catholics (54 percent to 46 percent) and Protestants (57 percent to 42 percent). Because he lost the None vote so badly (71-28), Romney ended up losing the state’s electoral votes by less than 40,000 votes.
Based on the data above, it is safe to say that the Godless Gap cost Mitt Romney the election.
While many of the Nones who voted against him are hard-core Democrats who never would have considered voting GOP, it is not unreasonable to think that Romney could have done better among non-Christians, especially given the decline in Republican partisanship among Nones mentioned above. Had Romney managed to improve his performance among people who don’t believe the Bible is true, he could have won as many as 304 electoral votes.
GOP’s Choice: Christian Nationalism or Political Reality?
That so many non-Christians would choose not to vote for Republicans and conservatives should come as no surprise. Many Christian conservatives, even at the very highest echelons of power and influence, seem utterly unaware that their repeated use of Christian symbolism, and their insistence on promoting religious liberty only for Christians, can come across as offensive or exclusionary to people who do not share their beliefs.
As conservative author Dinesh D’Souza, a Christian immigrant from India, has described it: “Whenever a Gujarati or Sikh businessman comes to a Republican event, it begins with an appeal to Jesus Christ. While the Democrats are really good at making the outsider feel at home, the Republicans make little or no effort.” That’s also true of people who do not believe in any faith.
Even if non-Christians do not take offense at being excluded, at the very least such public displays of Christian belief at ostensibly secular events certainly do not encourage them to participate or to become enthusiastic. National Review columnist Jonah Goldberg (who is Jewish by ancestry although he is non-practicing) described the phenomenon well in a 2012 column:
“I’ve attended dozens of conservative events where, as the speaker, I was, in effect, the guest of honor, and yet the opening invocation made no account of the fact that the guest of honor wasn’t a Christian. I’ve never taken offense, but I can imagine how it might seem to someone who felt like he was even less a part of the club.”
Bad as things are now for Republicans with regard to secular voters, they seem to be getting worse. A 2012 study by the Pew Research Center found that the Democratic share of the None vote has increased significantly since 2000, when it stood at 61 percent. In 2004 it rose to 67 percent. In 2008, an incredible 75 percent of the religiously unaffiliated voted for Barack Obama. In 2012, not quite as many, 70 percent, did so again. All told, the None vote has shifted a full nine points toward Democrats since 2000.
Unless action is taken (and this must include conceding that most Americans support same-sex marriage), the prospects for the conservative movement will only dim as the non-Christian portion of the country continues to grow and the Godless Gap widens.
Following Mitt Romney’s 2012 loss, there has been a lot of discussion about how conservatives can better reach out to non-whites. The Right will probably need to have a similar discussion about doing the same for non-Christians, especially since many non-whites are also non-Christian.
Regardless of what happens to GOP candidates in November, Christian conservatives face a choice. They can embrace identity politics and become a small group of frustrated Christian nationalists who grow ever more resentful toward their fellow Americans, or they can embrace reality and render unto Caesar the things which are Caesar’s.
Matthew Sheffield is a journalist currently working on a book about the future of the Republican Party. You may follow him on Twitter: @mattsheffield. This article is reprinted by permission from Praxis.