Andrew Sullivan's Blog
August 21, 2014
How Dangerous Is Police Work?
Daniel J. Bier goes over the statistics. He finds that "In 2013, out of 900,000 sworn officers, just 100 died from a job-related injury. That's about 11.1 per 100,000, or a rate of roughly 0.01%":
Policing doesn’t even make it into the top 10 most dangerous American professions. Logging has a fatality rate 11 times higher, at 127.8 per 100,000. Fishing: 117 per 100,000. Pilot/flight engineer: 53.4 per 100,000. It’s twice as dangerous to be a truck driver as a cop—at 22.1 per 100,000.
Another point to bear in mind is that not all officer fatalities are homicides. Out of the 100 deaths in 2013, 31 were shot, 11 were struck by a vehicle, 2 were stabbed, and 1 died in a “bomb-related incident.” Other causes of death were: aircraft accident (1), automobile accident (28), motorcycle accident (4), falling (6), drowning (2), electrocution (1), and job-related illness (13).
Even assuming that half these deaths were homicides, policing would have a murder rate of 5.55 per 100,000, comparable to the average murder rate of U.S. cities: 5.6 per 100,000. It’s more dangerous to live in Baltimore (35.01 murders per 100,000 residents) than to be a cop in 2014.
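The arithmetic behind those rates is easy to verify. Here's a quick back-of-the-envelope check, a sketch in Python; the officer and death counts are Bier's, and the small rounding quirks are noted in the comments:

```python
# Back-of-the-envelope check of the fatality rates quoted above.
# The counts come from the Bier piece; the rest is unit conversion.

officers = 900_000  # sworn officers, 2013
deaths = 100        # job-related deaths, 2013

rate_per_100k = deaths / officers * 100_000
rate_percent = deaths / officers * 100

print(f"{rate_per_100k:.1f} per 100,000")  # 11.1 per 100,000
print(f"{rate_percent:.3f}%")              # 0.011%, i.e. roughly 0.01%

# If half the deaths were homicides, policing's "murder rate" would be:
homicide_rate = (deaths / 2) / officers * 100_000
print(f"{homicide_rate:.2f} per 100,000")  # 5.56 (the quote truncates to 5.55)
```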



Every Sex Worker Is Somebody’s Daughter, Ctd
The sex-worker-as-daughter debate, which Elizabeth launched, continues. Two readers point out two different pieces missing from the conversation thus far. One writes:
I am amazed by the Every Sex Worker Might Be Somebody's Daughter thread's blind spot: not one person brought up the men who do sex work. Escorts and male performers in straight and gay pornography are all… somebody's son. Yet that doesn't seem to worry anyone much. The same double standard as always: sexually active women are sluts, sexually active men are studs.
The other sounds off:
The thread on this topic seems remarkably tone-deaf.
Should we evaluate all public policy issues through a “would you want your son/daughter to…” lens? Of course not. Is there lots of misguided, counter-productive, or irrelevant moralism and paternalism involved in some public policy? Sure. So some of the points made in the thread are well-taken, taken in isolation. But.
We also are all somebody’s child, or parent, or caretaker, or sibling, or spouse, etc. And these relationships tap into a specific part of our brain, and give us a specific set of perspectives on life. And sometimes it is positively healthy to ask ourselves to access that part of our thinking and feeling to a greater extent. At the least, speaking as if a whole realm of human awareness should be amputated from public concerns seems at best hugely unrealistic. Just think about the gay marriage issue. Homophobia was fine if you barely knew gay people even existed. Civil unions seemed OK if you knew that they existed, but didn’t know too much about their lives. But gay marriage became a moral imperative for many people because they knew and loved gay people personally, and saw them as, well…somebody’s daughter, or your own son or daughter. And that made a difference in how people saw the issue.
On another note, I think that people who want to jettison the "think about if she were somebody's daughter" approach are just pretty naive about men. This line isn't just a tool of patriarchal oppression. It's used to counteract male instincts (women have them also, but less strongly). And if you give men permission to stop asking the "what if she were…" questions, and give them free rein to assume that she might just as easily be a porn star, you might find that the results have a lot less to do with smashing the patriarchy than you first thought.



A Sudden Crisis
As the UN refugee agency launches its largest aid effort in more than a decade to help the hundreds of thousands of displaced people in northern Iraq, Swati Sharma remarks on how rapidly the humanitarian disaster has unfolded:
The rate at which the situation in Iraq has deteriorated is the largest reason why it is being called one of the worst humanitarian disasters in recent years. Let’s compare it with Syria.
While the climate there is extremely volatile, it has been deteriorating for more than three years. In comparison, conflicts in Iraq mostly started this year, and the worst of it commenced in June, when the Islamic State (then ISIS) took Mosul. Today, the number of displaced Iraqis is at 1.5 million — small in comparison to Syria’s 6.5 million — but almost 600,000 of them fled their homes in the past two months. Still, many were able to find homes and shelters in communities around northern Iraq.
In early August, Islamic State moved farther north. When the militant group took the northern region in and around the town of Sinjar, where many Yazidis live, more than 200,000 had to flee. Many were stranded on Mount Sinjar with dwindling resources, causing the Obama administration to launch airstrikes against the Islamic State.



Why Kidnap Journalists?
Jason Abbruzzese examines how journalists in conflict zones have become common targets for abduction:
The kidnapping of journalists is a relatively new issue. Reporters in conflict zones well understood the risks, but occupied a relatively sheltered position. "Pre-internet and pre-social media, pretty much all parties to wars and conflicts understood that they needed journalists to communicate their message, their view, to get the word out," [Dart Center for Journalism and Trauma director Bruce] Shapiro says. Another part of the problem: major media organizations have closed foreign bureaus and become reliant on freelancers as cheap alternatives. Without the backing of major media organizations, these freelancers tend to be at even more risk — especially if they and their families happen to live in the country where the conflict is taking place.
Jack Shafer stands back:
The killing of an innocent reporter violates what many of us would call an unwritten social contract stipulating that journalists deserve protection because they’re witnesses to history, not state actors. …
The old framework, in which reporters are generally tolerated, may be coming to an end, especially on the Syria, Iraq, and Libya battlegrounds. As the New Yorker‘s Jon Lee Anderson writes today, “Yesterday’s guerrillas have given way to terrorists, and now terrorists have given way to this new band [from the Islamic State], who are something like serial killers.” Serial killers tend to reject social contracts.
As we mourn Foley's death, we need also to acknowledge how routine the killing of reporters has become worldwide, and not just on the war front. According to statistics compiled by the Committee to Protect Journalists, at least 706 reporters have been murdered since 1992, and only 25 percent of them while covering a war. The remainder were assigned to other beats — crime, corruption, politics, human rights, and the like. Of the total dead, 94 percent weren't foreign correspondents; they were local reporters.
David Rohde, who was kidnapped by the Taliban in 2008, compares American and European approaches to negotiating with terrorists:
There are no easy answers in kidnapping cases. The United States cannot allow terrorist groups to control its foreign policy. One clear lesson that has emerged in recent years, however, is that security threats are more effectively countered by united American and European action. The divergent U.S. and European approach to abductions fails to deter captors or consistently safeguard victims.
Last month, a New York Times investigation found that al-Qaeda and its direct affiliates had received at least $125 million in revenue from kidnappings since 2008—primarily from European governments. In the last year alone, they received $66 million. “Kidnapping hostages is an easy spoil,” Nasser al-Wuhayshi, the leader of al-Qaeda in the Arabian Peninsula, wrote in a 2012 letter to the leader of an al-Qaeda affiliate in North Africa, “which I may describe as a profitable trade and a precious treasure.”
And James Traub probes the moral dilemma inherent in choosing whether or not to do so:
Should states pay ransom to kidnappers? If you are a friend or loved one of the victim, the answer is obviously yes. But even a more remote observer could cite the moral argument that the obligation to treat people as ends rather than means — what Kant calls the “categorical imperative” — forbids one to place the life of the abductee in a balance with abstract goods, like “sending a message” that kidnapping doesn’t pay. In any case, the consequences of capitulation are remote and hypothetical; the life is terribly real. …
The consequences of capitulating to terrorist kidnappers are ruinous. As a recent New York Times investigation revealed, “Kidnapping Europeans for ransom has become a global business for Al Qaeda, bankrolling its operations across the globe.” That’s why no European government will admit to making payments. The thought of Steven Sotloff jammed into a pit, awaiting death, when he might have been freed for nothing more than money, is unbearable. But the thought of rewarding the Islamic State for its savagery is also unbearable. A humane response to a monstrous act engenders more monstrousness.



Will Michael Brown’s Shooter Go Free?
Paul Cassell previews the trial of officer Darren Wilson:
[P]roving a crime in the Brown shooting will require close attention to the details, particularly details about the shooting officer’s state of mind. Even if the officer made a mistake in shooting, that will not be enough to support criminal charges so long as his mistake was reasonable — a determination in which the officer will receive some benefit of the doubt because of the split-second judgments that he had to make. And, of course, if it turns out that Michael Brown was in fact charging directly towards the officer (as recent reports have suggested), the officer’s actions will have been justified under state law and no charges should be filed. Trial lawyers know that one thing above all else decides criminal cases: the facts. And that is what we’re waiting for now.
Yishai Schwartz expects Wilson to get off because of Missouri law:
In other states, claims of self-defense need to be proven as more likely than not, or in legal speak, to a “preponderance of the evidence.” It’s still the state’s obligation to prove “beyond a reasonable doubt” that the defendant actually killed the victim. But once that’s established, the prosecution doesn’t also have to prove “beyond a reasonable doubt” that the killing wasn’t justified. That’s because justifications—like self-defense—require the accused to make an active case, called an “affirmative defense,” that the circumstances were exceptional. The logic here is simple: As a rule, homicide is a crime and justification is reserved for extraordinary cases. Once the state has proven that a defendant did in fact kill someone, it should be the accused’s obligation to prove his or her actions were justified.
Not in Missouri. Instead, as long as there is a modicum of evidence and reasonable plausibility in support of a self-defense claim, a court must accept the claim and acquit the accused. The prosecution must not only prove beyond a reasonable doubt that the defendant committed the crime, but also disprove a defendant’s claim of self-defense to the same high standard.



August 20, 2014
What Do-It-Yourself Funerals Can’t Offer
It's an interesting question, how we'll handle death and grief as religion's place in our lives declines. I don't mean that the old answers about what "happens" when we die will need to be reworked, exactly, because it seems clear that, no longer believing in an afterlife, most will just acknowledge that nothingness awaits us. There only will be the "sure extinction that we travel to," as Larkin put it. But that still leaves the issue of how to mourn the dead, in the very practical sense of what to do when a loved one dies. Emma Green looks at Candi Cann's recent book, Virtual Afterlives: Grieving the Dead in the Twenty-First Century, and at how this post-religious dilemma is being handled:
For most of human history, religious ceremony has helped people deal with death, providing explanations about souls and the afterlife along with rituals to help the living deal with their grief. Not all religions do death the same way. “There are certain denominations within Christianity and certain religions in general that do a better job of remembering the dead,” said Cann. “Like the Catholics: There’s a very set calendar for remembering, and it’s still tied down to the religious calendar.”
Tattooing yourself with a dead person’s remains is one new way of memorializing death in the absence of faith, she said. “As society becomes more secular, and people are more and more turning to that ‘spiritual but not religious category,’ they’re forming their own do-it-yourself ways of remembering the dead.”
Green goes on to describe other trendy options, from personalized caskets to “theme” funerals to arranging the deceased in scenes taken from their actual lives. I find all this fascinating, and, especially if a family isn’t religious, don’t begrudge them personalizing the funeral in whatever way they’d like. I do, however, wonder how this changes the grieving process, and would like to say a good word for the old-fashioned religious rituals.
Perhaps the most attractive feature of the do-it-yourself remembrance of the dead is how it allows for a celebration of the deceased's life, in all its idiosyncratic particularities. I certainly get that. But I also would argue that depersonalizing the grieving process, if that's the right phrase for it, offers solace of a different sort. To fall back on the patterns of religious liturgy, to feel that it's not up to you to conjure the right way to honor the dead, to turn to words and rituals handed down for centuries – all this can be powerfully comforting as well. It allows for a sense of participation in the ongoing human drama of life and death, of not being the first to experience the pain of loss. You aren't grieving from scratch. There's a relief in knowing your experience is not unique, a consolation in the solidarity of doing what so many others have done before you, and will do after you too are dead. Green cites a funeral director who describes ritual as "mindless," and not in a pejorative way, which is another way of saying that religious ritual allows you to get out of your own head in a way that can be a relief.
There's also the beauty of certain religious funeral rites that can't easily be replaced, beauty which provides its own salve to the grieving. A friend of mine once said that as you're dying, you want to be Roman Catholic, because the priest can be counted on to come and give you the sacraments, to be predictable and orderly as the end nears. But after you die, then you want to be an Anglican, such is the beauty of the Book of Common Prayer's Rite I funeral service, with its psalms and prayers in the language of Shakespeare and the King James Bible. I think he's right about that. It's how I want my funeral done – you can read it here.



The Hawk Gap
Last week, after observing that the prospective 2016 candidates are taking much more hawkish positions on foreign policy issues than public opinion would suggest, Beinart suggested that this might be one more deleterious effect of money on our political system:
For a century, Americans have responded to disillusioning wars by demanding a less interventionist foreign policy. It happened after World War I, after Korea, after Vietnam, and it's happening again in the wake of Afghanistan and Iraq. The difference between this moment and past ones is the role of money in politics. As on so many issues, politicians' need to raise vast sums from the super-rich makes them ultra-responsive to one, distinct sliver of the population and less responsive to everyone else. The way campaign finance warps the political debate over financial regulation is well known. What we're witnessing this year is a case study in the way it warps the foreign-policy debate as well.
Daniel Drezner’s not so sure about that, pointing out that foreign policy talk is about as cheap as it gets:
Beinart's thesis is that this gap has grown even more in recent years, but I'm not sure that's what's going on. The most important fact about American foreign policy and public opinion is that Americans just don't care all that much about the rest of the world. Sure, they'll express less interventionist preferences when asked, but most of the time they don't think about it. It's precisely this lack of interest that gives presidents and foreign policymakers such leeway in crafting foreign policy. … Statements about how one would do things better on the foreign policy front are among the best examples of cheap talk you'll find in Washington. Why? Because the world will look different in January 2017 than it does today. So of course these proto-candidates can say they'd do things differently. No one will hold them to these claims if they're elected, because the problems will have evolved.
Larison agrees with Drezner. In another post touching on this opinion gap, he takes down the notion of a “paradox” in the public’s attitude toward Obama’s foreign policy:
According to this story, Obama has given Americans the foreign policy they say they want, but they now disapprove of Obama’s foreign policy, so we’re supposed to believe that there is a “strange duality” at work. Instead of coming to the much more straightforward conclusion that Obama is not giving Americans the foreign policy they want (and that his foreign policy is still too activist and meddlesome), elite interventionists of different stripes engage in a lot of groundless speculation that the public actually wants the same things that the interventionists themselves want. It’s not obvious that most Americans “want a president to lead” in this case. The obsession with such “leadership” is primarily one shared by elites, and their idea of “leadership” requires a degree of U.S. activism overseas that the public hasn’t supported for years. The public-elite gap on foreign policy has rarely been wider than it is now because most Americans have no real interest in the “leadership” role for the U.S. or the president that foreign policy elites demand.



The Meaning Of #Ferguson
That's me in the cloud of tear gas tweeting in #Ferguson. http://t.co/9fMCdRGQYX
— Antonio French (@AntonioFrench) August 12, 2014
Over the weekend, David Carr marveled at how well Twitter has matured as a tool for journalism:
For people in the news business, Twitter was initially viewed as one more way to promote and distribute content. But as the world has become an ever more complicated place — a collision of Ebola, war in Iraq, crisis in Ukraine and more — Twitter has become an early warning service for news organizations, a way to see into stories even when they don’t have significant reporting assets on the ground. And in a situation hostile to traditional reporting, the crowdsourced, phone-enabled network of information that Twitter provides has proved invaluable. …
In and of itself, Twitter is not sufficient to see clearly into a big story; it’s a series of straws that offer narrow views of a much bigger picture. But as a kind of constantly changing kaleidoscope, it provides enough visibility to show that something significant is underway.
Along those lines, Amma Marfo focuses in on how important Twitter has become to the black community, particularly over the past week:
Twitter's lack of algorithms to control the display of content means that posts are elevated in popularity only by the people who favorite, Retweet, and share screen captures of impactful or informative messages. Such a structure allows the insight of the observant but relatively unknown amateur to stand alongside that of the high-profile and highly educated (another population that uses Twitter in high volume). This egalitarian information-sharing model is welcome for historically disenfranchised populations, and could be key to its popularity with other minority groups such as Hispanics. Its use among African-Americans continues to rise, as does the use of Twitter as a credible means to gauge public opinion and the newsworthiness of given topics.
But it's worth noting that the overall social media ecosystem is not always like this. Last week, Zeynep Tufekci pointed out the difference between following the Ferguson protests on Twitter, which shows you all the tweets from whoever you follow in real time, and on Facebook, which uses an algorithm to determine both what you see and when you get to see it. To illustrate the frenzy of Ferguson tweets last Wednesday night, she flagged a graph of the surge in tweet volume.
But when she checked her Facebook feed during that spike, there was nothing at all about the story, not until the next morning:
Overnight, “edgerank” — or whatever Facebook’s filtering algorithm is called now — seems to have bubbled [the Ferguson items] up, probably as people engaged them more. But I wonder: what if Ferguson had started to bubble, but there was no Twitter to catch on nationally? Would it ever make it through the algorithmic filtering on Facebook? Maybe, but with no transparency to the decisions, I cannot be sure.
Would Ferguson be buried in algorithmic censorship?
And as she goes on to note, Twitter already does use an algorithm to determine what topics trend nationally, which may have partially delayed the onset of Ferguson’s social media attention last week. Also, it looks like Twitter is now messing even further with what their users see, as Jay Yarow explains:
Until now, your timeline was filled only with tweets from the people you follow, or retweets from those same people. In other words, you got only the content for which you opted in. [The new policy] opens up the possibility for Twitter to start putting tweets from people you don’t follow in your feed. … By doing this, Twitter makes its timeline more like Facebook’s News Feed, which populates based on algorithms that measure likes and interests.
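To make the contrast concrete, here's a toy sketch of the two feed models under discussion: a reverse-chronological timeline that shows everything, versus an engagement-weighted ranking of the kind Facebook uses. The scoring formula, names, and numbers below are invented for illustration; this is not Facebook's actual algorithm or Twitter's code:

```python
# Illustrative contrast between a reverse-chronological timeline and an
# engagement-ranked feed. The scoring function is a made-up toy formula,
# not Facebook's real ranking ("EdgeRank" or its successors).
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    age_hours: float  # how long ago it was posted
    engagement: int   # likes/shares/comments so far

posts = [
    Post("reporter", "Tear gas on the crowd in #Ferguson", 0.1, 3),
    Post("friend", "Ice bucket challenge video!", 5.0, 250),
    Post("neighbor", "New recipe on the blog", 12.0, 40),
]

# Twitter-style (circa 2014): newest first, every post shown.
chronological = sorted(posts, key=lambda p: p.age_hours)

# Facebook-style (toy): engagement boosts a post, age decays it.
def toy_score(p: Post) -> float:
    return p.engagement / (1.0 + p.age_hours)

ranked = sorted(posts, key=toy_score, reverse=True)

print([p.author for p in chronological])  # ['reporter', 'friend', 'neighbor']
print([p.author for p in ranked])         # ['friend', 'neighbor', 'reporter']
```

Notice that the breaking-news post tops the chronological feed but sinks to the bottom of the engagement-ranked one until interactions accumulate — which is the buried-until-morning delay Tufekci describes.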



Face Of The Day
A fan waiting for a screening of E.T.: The Extra-Terrestrial last night at Somerset House in London, England. Photo by Ben A. Pruchnie/Getty Images.



