Andrew Sullivan's Blog, page 165
September 3, 2014
Putting Women In A Light Box
Alex Heimbach suggests that books such as Women Photographers: From Julia Margaret Cameron to Cindy Sherman perpetuate a “gender ghetto”:
The book collects the work of 55 practitioners, from pioneers of the form to contemporary photojournalists. [Author Boris] Friedwald also includes short bios of each artist as part of his goal to present “the variety and diversity of women who took – and take – photographs. Their life stories, their way of looking at things, and their pictures.” Sounds admirable enough. Yet it’s impossible to imagine an equivalent book titled Men Photographers: From Eugène Atget to Jeff Wall. Male photographers, like male painters, male writers, and male politicians, are the default. The implication, intentional or not, is that no matter how talented, female photographers are women first and artists second.
Ideally, endeavors like Friedwald’s serve to illuminate lesser-known artists, who may have been discounted because of their gender (or race or sexual orientation or class). But more often such exercises become a form of de facto segregation, whether it’s a BuzzFeed quiz on how many of the “Greatest Books by Women” you’ve read or a Wikipedia editor isolating female novelists in their own category. These projects are often undertaken in a spirit of celebration, but their thoughtlessness generally renders them pointless at best and misogynistic at worst.
(Photo: Julia Jackson by Julia Margaret Cameron, 1867, via Wikimedia Commons)



Writing Tip: Don’t Write
Overcoming writer’s block took years for Bill Hayes, who advises that “not writing can be good for one’s writing; indeed, it can make one a better writer” (NYT):
Then I woke one day, and a line came to me. It didn’t slip away this time but stayed put. I followed it, like a path. It led to another, then another. Soon, pieces started lining up in my head, like cabs idling curbside, ready to go where I wanted to take them. But it wasn’t so much that pages started getting written that made me realize that my not-writing period had come to an end. Instead, my perspective had shifted.
Writing is not measured in page counts, I now believe, any more than a writer is defined by publication credits. To be a writer is to make a commitment to the long haul, as one does (especially as one gets older) to keeping fit and healthy for as long a run as possible. For me, this means staying active physically and creatively, switching it up, remaining curious and interested in learning new skills (upon finishing this piece, for instance, I’m going on my final open-water dive to become a certified scuba diver), and of course giving myself ample periods of rest, days or even weeks off. I know that the writer in me, like the lifelong fitness devotee, will be better off.



September 2, 2014
Stop Saying “Officer-Involved Shootings”
Let’s talk about “officer-involved shootings.” That is the formal term, used by seemingly all American local news broadcasts, for when a cop shoots someone. Instead of saying “‘Cops’ crew member killed by police officer,” the headline is, “‘Cops’ crew-member killed after officer-involved shooting.” (It just sort of happened, after that shooting.) There is also “police involved shooting,” a term I first noticed being used by the local New York evening news team last May.
These terms are terrible and journalists should not use them. They are cop-speak. Local news reporters love nothing more than adopting cop-speak, because local news is built on manufacturing fear of crime and veneration of police officers, but both of these terms fail the crucial test of actually being coherent explanations of what happened. Of course police would invent an obfuscatory euphemism for when they shoot people – they would be fools not to try to come up with a nice way of saying “we killed someone” – but the press’ job is supposed to be to translate those euphemisms into plain English.
“Officer-involved shooting” absolves the person who actually pulled the trigger of responsibility, turning the shooting into an apparently inevitable act. The officer was just involved! As Natasha Lennard at Vice News puts it:
The phrase “police-involved shooting” is a careful construction, which, like the criminal justice system more broadly, tends to point blame away from cops. It is code for “the cops shot someone.”
To a reporter, “officer-involved shooting” should sound as grating to the ear as “bear-involved large mammal attack.”
The two terms, now ubiquitous, appear to be very successful modern coinages. Neither phrase seems to have been in usage at all before the 1970s. Usage of “officer involved shooting” soared during the 1980s and 1990s, with “police involved shooting” not catching on until the 2000s.
Where did the term come from? The LAPD has, for years, produced an annual “Officer Involved Shooting” report (NYT) and has had an “officer involved shooting unit” since 1987 or earlier. I wouldn’t be surprised if the phrase made its way into the press’ lexicon via former LAPD chief (and racist paramilitary policing pioneer) Daryl Gates, a man who rarely shied from television cameras. (If anyone knows the actual origin of the phrase, please let us know: dish@andrewsullivan.com)
The International Association of Chiefs of Police, by the way, publishes “Officer-Involved Shooting guidelines” (pdf). The guidelines aren’t about how not to shoot someone, but more about what to do once you have shot someone. The entire document is sort of incredible in its careful consideration of the emotional and mental state of the officer, and its complete silence on the status of the person the officer actually shot. For example:
Following a shooting incident, officers often feel vulnerable if unarmed. If an officer’s firearm has been taken as evidence or simply pursuant to departmental policy, a replacement weapon should be immediately provided as a sign of support, confidence, and trust unless there is an articulable basis for deviating from this procedure. Officers should be kept informed of when their weapon is likely to be returned. Care should be taken to process and collect evidence from the officer as soon as practicable to provide an opportunity to change into civilian clothing.
It is vital that you give the officer his gun back as soon as possible, or else he might feel bad, about shooting someone.
I can’t say this definitively, because, as we’ve learned this month, there is no national database of police shootings, but American cops seem to shoot other people far more often than people shoot cops. The number of police killed by firearms peaked in the early 1970s, and has steadily declined since. It hasn’t cracked 100 officers in any year over the last decade. Meanwhile, around 400 people a year are killed in “justifiable police homicides,” according to the only official numbers available for police homicides. (And that report doesn’t even pretend to be a complete account of everyone killed by police officers.) “Police involved shooting” may not be quite as obfuscatory a phrase as it was designed to be, simply because the majority of American shootings “involving” cops seem to be shootings by cops.
(Photo: Montgomery County police officers qualifying at their indoor shooting range in Rockville, Maryland on August 23, 2007. For story on ammunition rationing due to the war in Iraq. By James M. Thresher/The Washington Post/Getty Images.)



The Death Rattle Of Islamism?
Graeme Wood isn’t the first writer to touch on the significance of Abu Bakr al-Baghdadi’s declaration of a “caliphate”, but his substantial exploration of the meaning of the term gets to why it’s so weird that Baghdadi has chosen it to describe his so-called Islamic State when other radical Islamist groups have steered clear of such declarations:
Mostly … caliphate declarations have been rare because they are outrageously out of sync with history. The word conjures the majesty of bygone eras and of states that straddle continents. For a wandering group of hunted men like Al Qaeda to declare a caliphate would have been Pythonesque in its deluded grandeur, as if a few dozen Neo-Nazis or Italian fascists declared themselves the Holy Roman Empire or dressed up like Augustus Caesar. “Anybody who actively wishes to reestablish a caliphate must be deeply committed to a backward-looking view of Islam,” says [University of Chicago historian Fred] Donner. “The caliphate hasn’t been a functioning institution for over a thousand years.”
And it isn’t now, either. The designation of the ISIS “caliphate” still smacks of delusional grandiosity more than anything else. There is no downplaying its brutality or denying that it would do great violence to the West if given the chance, but the Islamic State is no superpower: more than anything else, its sudden rise owes mainly to the fact that Syria and Iraq are fragile states, and its savagery has awakened the sleepwalking states of the Arab world to the threat of jihadism like never before. The enemies it is making on all sides, especially among other Muslims, would seem to suggest that ISIS may burn out nearly as quickly as it caught fire. Could the madness of ISIS be the final fever of a dying ideology?
What seems most promising to me in the backlash against ISIS is the extent to which that backlash relies on the genuine principles of Islam itself. We know that some of the fighters traveling from the West to fight alongside ISIS know next to nothing about the religion. We have evidence that jihadist movements like Boko Haram and the Taliban are widely despised in their spheres of influence. Here, Dean Obeidallah takes a look at how leaders of Muslim countries and communities are more or less unanimously condemning the false Islam of the jihadists:
The religious and government leaders in Muslim-dominated countries have swiftly and unequivocally denounced ISIS as being un-Islamic. For example, in Malaysia, a nation with 20 million Muslims, the prime minister denounced ISIS as “appalling” and going against the teachings of Islam (only about 50 have joined ISIS from there). In Indonesia, Muslim leaders not only publicly condemned ISIS, the government criminalized support for the group. And while some allege that certain Saudi individuals are financially supporting ISIS, the Saudi government officially declared ISIS a terrorist group back in March and is arresting suspected ISIS recruiters. This can be a helpful guide to other nations in deterring ISIS from recruiting. A joint strategy of working with Muslim leaders in denouncing ISIS and criminalizing any support appears to be working. And to that end, on Monday, British Muslim leaders issued a fatwa (religious edict) condemning ISIS and announcing Muslims were religiously prohibited from joining ISIS.
This all has me wondering if ISIS, the reductio ad absurdum of radical Islamism, doesn’t herald the downfall of that ideology altogether. Bear in mind that political Islam hasn’t always been exclusively reactionary: the first avowedly Islamic politics of the modern era, articulated before the Muslim Brotherhood’s founders were even born, was the Islamic Modernism of Muhammad Abduh, Rashid Rida, and Jamal al-Din al-Afghani. Here were pious Muslims arguing that Islam was fully compatible with rationalism and making arguments for universal literacy and women’s rights from the same Muslim revivalist standpoint from which Hassan al-Banna and Sayyid Qutb would later espouse a more conservative vision of Islamic politics in modernity.
The illiberal strain of Arab Islamism, its Iranian counterpart, and the more radical jihadist movements that grew out of these movements (or alongside them, depending on which historian you ask) have been the major representatives of political Islam in the late 20th and early 21st centuries. There’s no reason, however, to believe that this condition is permanent or that a less reactionary form of Islamic political thought, or even an Islamic liberalism after the model of the Modernists, could not take hold in the Muslim world given the right set of circumstances. Islamism, particularly in its more extreme varieties, has long articulated an Islamic state operating under a “pure” interpretation of Islamic law as a utopian vision. Now, here is an Islamic State, a “caliphate” no less, that claims to do just that, and the outcome is rather dystopian. Torture, gang rape, slave brides, beheadings, crucifixions, and child soldiers are not what most Muslims have in mind when they imagine the ideal Islamic society. I would wager that these horrors will turn more Muslims against radical Islamism than toward it.
This is all by way of saying, as a reminder, that “Caliph Ibrahim” (Baghdadi) represents Muslims about as thoroughly as Tony Alamo represents Christians. The fact that he has attracted enough funding and followers to run roughshod over northern Iraq and eastern Syria is nothing to brush off, but it’s not winning him any friends, and it doesn’t make his ideology any less ridiculous. It’s certainly not “Islam”, at least not as any Muslim I know practices it. That’s why I suspect it will fail, like most grandiose visions of world domination do. And by radicalizing the Islamic heartland against radicalism, as it were, perhaps ISIS will take the entire edifice of radical Islamism down with it.



The Game Of Life
Simon Parkin appreciates Spermania, a video game in which “players assume the role of a plucky sperm that must navigate the kinks and curves of an undulating fallopian tube,” as a “good joke that’s well told.” He describes how the game’s creators at the Ramallah-based PinchPoint, Inc. had to overcome the barrenness of the gaming industry in Palestine:
PinchPoint is, according to the company’s co-founder and C.E.O., Khaled Abu Al Kheir, the first venture-capital-backed Palestinian video-game studio. Despite recent efforts to grow the I.T. sector in the Palestinian territories with incubators, accelerators, and venture-capital firms, there are only a handful of video-game developers in the area. Partly, this is due to the unique challenges of establishing a startup in a turbulent region. “Local events here definitely affect our focus and stress us out,” Basel Nasr, one of the game’s developers, told me. “We have no airport or control over our land borders, so travel costs extra time and money. This makes it more challenging to plan overseas trips, as well as to connect with foreign video-game studios around the world in order to learn and share our experiences.” Likewise, the lack of a vibrant industry in the region makes expanding the studio a tremendous challenge. “There’s an almost non-existent talent pool in Palestine for video-game development,” Kheir said.
As for whether the game has proven controversial in Palestine:
Contrary to the team members’ expectations, most of their friends and families supported Spermania’s subject matter. “The theme itself might be a bit controversial,” [developer Basel] Nasr, who designed the game’s cartoonish aesthetic, said. “But the art style gives the game a light and humorous feel. Most people laugh about the idea, and we haven’t received any threats. My two sons, who are five and two, enjoy the game, although they don’t know what it’s really about.”



Face Of The Day
An Afghan girl looks through the door of her house in an old section of Kabul on September 2, 2014. Afghanistan’s economy has improved significantly since the fall of the Taliban regime in 2001, largely because of the infusion of international assistance. Despite significant improvement in the last decade, the country is still extremely poor and remains highly dependent on foreign aid. By Wakil Kohsar/AFP/Getty Images.



“The ‘Great Man Theory’ At Its Most Frightful”
Andrew Heisel read more than 600 Amazon customer reviews of Mein Kampf, and came away disturbed:
Again and again, reviewers praise Hitler as “one of the most powerful men in history,” or “the greatest mover in history.” He was a “man of strong principles, discipline and good organizational skills,” and overcame poverty “to create the worlds largest empire.” Try to set aside your negative feelings for a moment and appreciate the impact: “Greatness is not measured by good or evil. Greatness is. Fascist or not, Hitler was a great leader.” The praise is qualified, but the tribute paid to morality often feels trivial alongside the esteem. Hitler “did some bad things,” one of the above says. Although he “crossed that line and spiraled into madness” and “evil,” says another, he was “wonderful leader.” Few leaders, offers another, have “matched the depth of his dedication, evil though it was.” They see that he’s a “monster” just like many of the other reviewers; they just don’t think it’s worth dwelling on instead of the positive takeaways.
Some would suggest this discourse is the effect of relativism, and there’s some of that in there, but I think, more than that, it is the value-neutral language of enterprise, where what matters most is getting things done—having an impact, being a “mover.” It’s a language that reveres action, power, and profit as goods in themselves and overlooks the ethical failings of those with power. With mere achievement as your focus, you can whittle away the details until Hitler has an affinity with Jesus. It’s the “Great Man Theory” at its most frightful. If you accomplish so much, you become beyond judgment, become simply History.



The Economics Of Superhero Flicks
Erika Olson recommends Harvard Business School professor Anita Elberse’s Blockbusters: Hit-Making, Risk-Taking, and the Big Business of Entertainment:
Her statistics-driven approach shows that no matter what facet of the entertainment industry you’re talking about – and no matter how contrary to common sense it may seem – those who make the biggest financial investments in a select few products are actually taking the least risky path to success. Perhaps that’s why 40 (!!!) big-budget superhero movies will be hitting theaters between now and 2020. Or why 1998 was the last year that stand-alone (versus sequel/trilogy/universe) films made up the majority of an annual “top-ten highest-grossing movies” list. In 2011, the entire top 12 were franchise titles.
Now, as Scott Tobias of The Dissolve recently pointed out, it’s not like “blockbuster” always equates to “awful.” But for anyone who still enjoys – or wants to make – an indie or otherwise original film, Elberse’s findings are important to understand.



Kicking The Torture Habit
In an interview about her new book, Mainstreaming Torture, Rebecca Gordon unpacks the way she uses virtue ethics to show why we should resist the use of torture:
The torture that I am concerned with is institutionalized state torture – the kind of organized, intentional program carried on by governments. It’s not Jack Bauer saving Los Angeles on 24. It’s not some brave person preventing a ticking time-bomb from going off by torturing the one person who can stop it. We must stop thinking of torture as a series of isolated actions taken by heroic individuals in moments of extremity, and begin instead to understand it as a socially embedded practice. A study of past and present torture regimes suggests that institutionalized state torture has its own histories, its own traditions, its own rituals of initiation. It encourages, both in its individual practitioners and in the society that harbors it, a particular set of moral habits, call them virtues or vices as you prefer. …
I think that my approach to the ethical problem of institutionalized state torture is based on a more accurate representation of what torture is. If torture were simply a set of isolated actions, then consequentialist or deontological approaches might be adequate for judging each act. Torture is, in a sense, more than the sum of individual actions, each of which can be assessed de novo, weighed by an ethical calculus of costs and benefits, or through the mental testing of the effects of universalizing a maxim. Actions create habits. We become brave, as Aristotle says, by doing brave acts. And, in the case of allowing other people to be tortured as the price of an illusory guarantee of our own personal survival, we become cowards by doing cowardly ones.
I think that most of the time in real life, people act first and identify their reasons for acting later. If most of the time we act out of habit, shouldn’t those habits be good ones?



Where The Drivers Drive You Away
Brian Palmer determined the worst places to drive in the US:
No. 5: Baltimore. Baltimoreans just can’t keep from running into each other. They were outside the top 10 in fatalities, DWI deaths, and pedestrian strikes, but their rate of collision couldn’t keep them out of the top five overall.
No. 4: Tampa, Fla. Tampa doesn’t do any single thing terribly, but it is consistently poor: 18th worst in years between accidents, fifth in traffic fatalities, tied for 11th in DWI fatalities, and 10th in pedestrian strikes. If the city had managed to get outside the bottom half in any individual category, Tampa residents might have avoided this distinction.
No. 3: Hialeah. The drivers of Hialeah [Florida] get into a middling number of accidents, ranking 11th among the 39 candidates. But when they hit someone, they really mean it. The city finished third for fatalities. They also have a terrifying tendency to hit pedestrians.
No. 2: Philadelphia. Drivers in the City of Brotherly Love enjoy a good love tap behind the wheel. Second-place finishes in collisions and pedestrian strikes overwhelm their semi-respectable 16th-place ranking in DWI deaths.
No. 1: Miami. And it’s not even close. First in automotive fatalities, first in pedestrian strikes, first in the obscenity-laced tirades of their fellow drivers.
So basically avoid Florida.
(Photo of boat blocking traffic on I-95 in Miami via Flickr user That Hartford Guy)


