Oxford University Press's Blog

September 28, 2012

The seven myths of mass murder

By J. Reid Meloy, Ph.D.



For the past 15 years my colleagues and I have conducted research on mass murder, the intentional killing of three or more individuals, excluding the perpetrator, during one event. Recent cases of mass murder have pointed to misconceptions about this rare and frightening act, and I would like to shed some light on what I consider the seven myths of mass murder.


Myth 1: They “snap.”


Immediately following a mass murder, there is a steady stream of newspaper headlines and what I call “entertainment profilers” on television who proclaim that the individual “snapped.” There is no psychological term called snapping, but many assume that mass murder is done impulsively, with great emotion, and without planning or preparation. Almost no mass murder fits this profile.


Studies consistently show that mass murderers research, plan, and prepare for their act of violence for days, weeks, and even months. The fantasy may have incubated in their minds for years, even though the time, place, and target had yet to be determined. The act usually occurs after a major loss in love or work, and this may “start the clock” on final, detailed preparation. I have forensically evaluated a number of mass murderers in prison or forensic hospitals, and with few exceptions, there was no evidence of a high state of emotional arousal when the killings occurred. Witnesses who have survived mass murders invariably describe the shooter as cool, calm, and deliberate — a lack of emotion that is a corollary of planned violence.


Myth 2: They can easily be divided into “psychopaths, psychotics, and depressives”


Dave Cullen, the journalist and author of Columbine, an excellent book on the 1999 high school mass murder in Colorado, has asserted this formulation. Unfortunately, his diagnostic classification of mass murderers is much too simplistic. Most are complex in their motivations and psychopathology. They often have both mental and personality disorders.


Mental disorders range from chronic psychotic disturbances, such as the paranoid schizophrenia diagnosed in the Jared Loughner case, to major depression, bipolar disorder, and others. This may sound like splitting hairs, but when it comes to risk mitigation, fully understanding the range and complexity of these individuals’ disturbances is critical.


Personality disorders also abound in this group and are often a mixture of antisocial, paranoid, narcissistic, and schizoid traits — or in layperson’s terms, someone who habitually engages in criminal behavior, is suspicious of others’ actions, is self-centered and grandiose with little empathy, and is chronically indifferent toward others and detached from his emotional life. What Cullen has done is a disservice to the millions of individuals who are clinically depressed or have a psychotic disorder and pose no more risk of violence to others than your neighbor. Loughner has given paranoid schizophrenia a bad name — many other factors contributed to his attempted assassination and mass murder.


Myth 3: Incidents of mass murder are increasing


When a mass murder occurs, it receives instant and pervasive news coverage. Unfortunately, we are prone to overestimate the frequency of an event by its prominence in our minds, and mass murder is no exception. This is a very rare phenomenon and is neither increasing nor decreasing in the US. Since 1976 there have been about 20 mass murders a year. The most violent year for mass murder was 2003, with 30 incidents and 135 victims. Virginia Tech, Fort Hood, Edmond (Oklahoma), and San Ysidro still resonate in the public consciousness, however, reminding us that these events do happen. A positive counterpoint is that rates of all violent crime have decreased significantly over this same period, from 48 victims per 1,000 persons in 1976 to 15 in 2010. The most lethal school mass murder in US history was in Bath, Michigan, in 1927, a bombing that resulted in 45 deaths, mostly of children in the second to sixth grades.


Myth 4: Banning assault weapons will lower the frequency of mass murder


The most popular weapon chosen by mass murderers in the US is a 9 mm pistol, often a Glock. Usually they bring two or three firearms to the scene, and assault weapons such as the AR-15 or AK-47 are generally not used. It should therefore come as no surprise that between 1994 and 2004, when the federal assault weapons ban was in effect, there was no decrease in the average number of mass murders per year in the US. However, guns do kill people. As a gun owner myself, and a believer in the Second Amendment, I find it appalling that virtually anyone in the US can purchase a firearm with little effort, money, or time. I believe that firearms ownership is a right that should carry requirements: demonstrable competency in the weapon’s use and mental stability.


Myth 5: Psychotic individuals cannot plan in a precise and methodical manner


The majority of adult mass murderers are psychotic, meaning they have broken with consensual reality and perceive the world in an idiosyncratic and often paranoid way. Yet they may research the internet for weapons, practice video games to sharpen their marksmanship, purchase weapons and ammunition, conduct surveillance of the target, and carry out their mass murder, all from within a delusion.


A delusion is a fixed and false belief and may provide a rock-solid motivation for mass murderers. Paradoxically, delusions may help them commit irrevocably to paths of homicidal destruction. Our research has also found that mass murderers who are psychotic have higher casualty rates than those who are not. Typically they select victims who are complete strangers, who in their minds make up a “pseudocommunity” of persecutors bent on their destruction.


Myth 6: It must be the drugs they are abusing


It is true that most quotidian violence involves drug use, particularly alcohol. In cases of spousal homicide, the victim, perpetrator, or both are often intoxicated at the time. In mass murder, however, drug use is minimal, whether alcohol or other illicit substances. We think this is because the mass murderer does not want drugs to cloud his consciousness at the time. They could interfere with his planning, preparation, and most importantly, his tactical goal, which is often to maximize his casualty rate. We even found two cases where the mass murderer utilized therapeutic amounts of sedating drugs to help him remain calm during the shooting.


Myth 7: Mass murder can be predicted and prevented


Unfortunately this will never happen given the simple fact that we cannot predict such an extremely rare event. If we attempt to do so, we will grossly over-predict its occurrence and perhaps infringe upon individual rights and freedoms. However, we can mitigate the risk of such events by paying attention to behaviors of concern. This stopped Richard Reid from bringing down an airplane over the Atlantic in December 2001, when a passenger noticed he was trying to light his sneaker with a match. It contributed to the prevention of another ideologically driven mass murder in Times Square on 1 May 2010 when two street vendors noticed a suspicious van parked on a busy corner and alerted the police; two days later Faisal Shahzad was arrested as he sat on a plane at Kennedy bound for Dubai. Such situational awareness is critical to interdict someone in the final stages of an attack.


But there is another warning behavior that is quite frequent: mass murderers will leak their intent to others — a phrase expressed to another, or posted on the internet, that raises concern. It may be overt: “I’m going to kill my supervisor and his cohorts tomorrow;” or it may be covert: “don’t come to work tomorrow, but watch the news.” The logical reaction should be to alert someone in a position of authority; however, most people don’t. It surfaces after the event, with the rationale, “I just didn’t think he was serious.” Trust your emotional reactions of anxiety, wariness, or fear, and let law enforcement investigate.


J. Reid Meloy, Ph.D. is Clinical Professor of Psychiatry at the University of California, San Diego, and President of Forensis, Inc., a nonprofit dedicated to forensic psychiatric and psychological research. He co-edited Stalking, Threatening, and Attacking Public Figures (OUP, 2008) with Lorraine Sheridan and Jens Hoffmann, and is currently co-editing another volume, the International Handbook of Threat Assessment, scheduled for publication in 2013. Learn about his latest news by following Forensis on Twitter at @ForensisInc.







Opposing narratives of success in politics

By Stephanie Li



While our presidential candidates are known far in advance of the Democratic and Republican National Conventions, party conventions remain intriguing spectacles for the kind of human detail they offer about the men who aspire to the Oval Office. Every four years pundits and political commentators observe that conventions have become increasingly scripted affairs that lack the spontaneity of times past, but party conventions serve to present individual narratives as much as specific policy positions. Viewers of the Republican National Convention (RNC) and Democratic National Convention (DNC) may not know what each party plans to do in Afghanistan or with our nation’s struggling public schools, but after three days of speeches, they know something about the biography of each candidate. The stories about Barack Obama and Mitt Romney told by the DNC and RNC are remarkable because they present such opposing versions of American identity.


President Bill Clinton opened his DNC speech with a quote from the Democratic Party chairman Bob Strauss, who “used to say that every politician wants you to believe he was born in a log cabin he built himself.” The narrative of self-reliant uplift has been a mainstay of the political memoir, especially those penned by presidents and presidential candidates. However, Clinton’s speech, hailed by many as the most effective of the entire campaign season, seeks to dismantle what Jill Lepore identifies as the predictable trajectory of presidential narratives. She writes that “even the best of them tell, with rare exception, the same Jacksonian story: scrappy maverick who splits rails and farms peanuts and shoots moose battles from the log cabin to the White House by dint of grit, smarts, stubbornness, and love of country.” By contrast, Clinton assured listeners that Democrats “believe that ‘we’re all in this together’ is a far better philosophy than ‘you’re on your own.’”


Clinton’s focus on working together to build a better country, one that strengthens the middle class and offers greater opportunity to the poor, resonates with the themes of Obama’s first book, Dreams from My Father: A Story of Race and Inheritance (1995). Our current President’s self-conscious coming-of-age story challenges the image of rugged individualism long associated with presidential biographies. Obama writes eloquently about how his mother, maternal grandparents, and paternal relatives shaped his life in fundamental ways. In her speech at the DNC, Michelle Obama described how she was first drawn to her husband because his family sacrificed for him in the same way that her own parents ensured that she received a quality education.


President Barack Obama and his daughters, Malia, left, and Sasha, watch on television as First Lady Michelle Obama begins her speech at the Democratic National Convention, in the Treaty Room of the White House, Tuesday night, Sept. 4, 2012. (Official White House Photo by Pete Souza)


By contrast, when discussing her husband’s rise to prominence at the RNC, Ann Romney stated, “I can tell you Mitt Romney was not handed success. He built it.” Mrs. Romney didn’t disregard the power of working together or helping others. In one of the most powerful lines of her speech, she said of her husband, “Mitt doesn’t like to talk about how he has helped others because he sees it as a privilege, not a political talking point.” What’s striking here is that even when emphasizing her husband’s support of and outreach to others, Mrs. Romney presents him as a kindly benefactor, not as someone who ever relied upon the help of others. The Romneys may indeed “help their neighbors, their churches and their communities,” but this is help they have never needed. While Michelle Obama spoke of her husband’s vulnerabilities and the people who made his success possible, Ann Romney presented a picture of her husband as a tireless patriot who will take the country to “a better place.”


Mrs. Romney’s speech echoed the theme of the RNC’s second day, “We built it,” a rejoinder to Obama’s widely circulated comments from a campaign event in Roanoke, Virginia. The President stated:


“If you were successful, somebody along the line gave you some help. There was a great teacher somewhere in your life. Somebody helped to create this unbelievable system that we have that allowed you to thrive. Somebody invested in roads and bridges. If you’ve got a business, you didn’t build that. Somebody else made that happen.”


Obama could have been talking about himself and his own rise to national and international prominence. In his staid speech at the DNC, he employed the second person pronoun as much as the first. Every victory he identified from the last four years was followed by a rousing, “you were the change… you did that… you made that possible… you did that.” For Obama, there is little separation between his success and the people who made his success possible. His life narrative embraces collectivity while Romney adheres to the tired conventions of rugged individualism. Increasingly the choice between the presidential candidates is one between productive cooperation and stark self-reliance.


Stephanie Li is associate professor of English at the University of Rochester. She is the author of “The Parallel Lives of Bill Clinton” in American Literary History and three books including Signifying Without Specifying: Racial Discourse in the Age of Obama (2011) and Something Akin to Freedom: The Choice of Bondage in Narratives by African American Women (2009).




Recent Americanist scholarship has generated some of the most forceful responses to questions about literary history and theory. Yet too many of the most provocative essays have been scattered among a wide variety of narrowly focused publications. Covering the study of US literature from its origins through the present, American Literary History provides a much-needed forum for the various, often competing voices of contemporary literary inquiry.







Permission-giving: from Cromwell to Kate Middleton


By Keith Grint



Some of my more radical academic colleagues remain inordinately sceptical of the role of individual leaders set against the tectonic plates of economic systems, social classes, genders, political alliances, and ethnic groups. To suggest that individual leaders might make a difference is to place an unwarranted responsibility upon mere actors when the real issue is ‘the system’ — whatever the system is. However, I want to suggest that we look again at permission-giving as just one aspect that encourages or discourages followers from specific acts. Let me turn to an apocryphal story from the Korean War to illustrate this. As in the Second World War, the American army (like the British army but unlike the German army) removed discretion at the lower echelons and deposited most knowledge and decision-making within the officer corps. As a consequence, when Allied troops landed in Normandy on D-Day, German soldiers targeted Allied officers as the most effective way to immobilize the invaders. In the Korean War little seemed to have changed: when American soldiers were captured, their North Korean captors frequently resorted to torturing American officers — since the ordinary soldiers seldom had significant information — and as a consequence American soldiers were required to strip off their emblems of rank if capture looked likely. This left the North Koreans with a problem: how to identify the officers amongst a group of captured US soldiers? The result, according to the apocryphal version of events, was that one enterprising North Korean interrogator demanded that all the American soldiers in his charge remove their trousers — at which point everyone looked at their officer for permission to strip.


This ‘permission-giving’ aspect of leadership is a critical, and critically under-rated, one. In his book Agents of Atrocity, Neil Mitchell argues that leaders make a crucial difference in the occurrence, or prevention, of human rights abuses through their permission-giving. Thus rather than assuming that context determines the actions of leaders, he suggests that leaders always retain a degree of choice in both their, and their followers’, actions. In effect, some leaders allow or even encourage their followers to engage in mass rape and murder after conquest while others actively prevent it. The most interesting case is Oliver Cromwell. In the English theatre of the Civil War, the pillaging of captured cities was common on both sides until the rise of the Parliamentary New Model Army under Cromwell. That army was specifically forbidden from engaging in the rape, murder, and looting that had hitherto been commonplace. In contrast, when Cromwell led the army into Ireland he did not actively prevent such brutality, and the consequences were the sacking and butchery of Drogheda and Wexford.


Note here that the critical point is that leaders and their followers are not driven into bestial behaviour as a response to a bestial situation; on the contrary, and in sharp contrast to contingency theory, what people do is a consequence of the choices they make, albeit constrained choices. This being the case, we might look afresh at whether we should focus on the provision of material sustenance in areas of conflict — water, food, money, jobs, security, and so on — or turn instead to the ideological aspects of life. We know historically that people in terrible material conditions don’t automatically revolt when food is short or jobs are scarce. They revolt when an alternative appears viable to a terrible present. We can see this captured in the history of slavery. Slavery was probably common to almost all forms of prior human society and is usually linked to terrible material circumstances, but revolts aren’t a permanent or ever-present feature of slave history.


When we apply this to other aspects of society it becomes clear just how important leadership is in its permission-giving — or withholding — capacity. Public assaults upon the Jews in Germany increased significantly once Hitler had publicly denounced Jews. At the same time, racist statements in public in the UK are now much rarer than they used to be because the leading role of the law and the political establishment has rendered such comments beyond the pale. But we do not need to wander into the political arena to notice the importance of permission-giving. According to the Guardian, the Queen’s adoption of the ‘stocky heel’ shoe is responsible for the surge in demand for it, and we only have to watch whatever Kate Middleton is wearing to see the same dress race out of the shops. Permission-giving even affects suicide patterns, with copy-cat suicides a common phenomenon.


Might it be, then, that a hugely important aspect of leadership is not so much what ‘the situation’ or ‘the system’ determines or facilitates but what individual leaders permit or prohibit through their active or passive leadership?


Keith Grint is Professor of Public Leadership at Warwick Business School. He is the author of Leadership: A Very Short Introduction (2010).


The Very Short Introductions (VSI) series combines a small format with authoritative analysis and big ideas for hundreds of topic areas. Written by our expert authors, these books can change the way you think about the things that interest you and are the perfect introduction to subjects you previously knew nothing about.







September 27, 2012

Just what is triple-negative breast cancer?

By Patricia Prijatel



The big news this week comes from the Cancer Genome Atlas program, which has announced a strong molecular connection between basal-like breast cancer tumors and ovarian cancer. The news stories I have read on the topic provide a great deal of hope for women with basal-like cancers. But the hope is, unfortunately, buried in a greater deal of confusion.


Here’s the hope: We’re getting closer and closer to understanding what makes breast cancer tick on a molecular level, and that means we could ultimately have treatments that target specific molecular anomalies, making treatment more effective, efficient, and possibly less toxic. Most important, it would be clearer to doctors which patients need chemo and which don’t, saving thousands of people from unnecessary and dangerous treatment — and making chemo more precise for those who do need it.


Among those who could benefit most from this research are the women and men with triple-negative breast cancer (TNBC), which is currently treated with chemotherapy drugs such as Cytoxan and the anthracycline Adriamycin — drugs that can cause long-term side effects including heart disease and an increased risk of leukemia. Moreover, metastatic TNBC — cancer that has spread beyond the breast to distant organs — can be resistant even to current forms of chemo.


The research on the genome project, published in the 23 September 2012 online edition of the journal Nature, ties basal-like breast cancers to ovarian cancers on a molecular level, suggesting that ultimately TNBC could be treated with the less-toxic chemotherapy used for ovarian cancer, including a mix of carboplatin (Paraplatin) or cisplatin with a taxane such as paclitaxel (Taxol) or docetaxel (Taxotere).


That’s the hope. Now the confusion. The biggest confusion revolves around our understanding of triple-negative breast cancer (TNBC).


Some news stories I have read equate TNBC with basal-like tumors, as though the two were synonymous, which is not accurate. There is a correlation between TNBC and basal-like cancers, but not all TNBC tumors are basal-like, and not all basal-like tumors are TNBC. In fact, some researchers break TNBC into three subtypes, including basal-like and non-basal-like.


The other, and bigger, problem comes in the terms journalists and researchers use for TNBC: “particularly deadly,” “especially aggressive,” and “lethal.”


I understand why these words are used: they make the research appear more significant. But these terms can frighten and depress those with TNBC and their families, and the research is significant enough to stand on its own without hyperbole.


Lost in the hyperbole is the fact that most women survive TNBC. The rates depend on too many factors to offer a generalization, but multiple studies have shown that 70 to 90% of patients with TNBC have no recurrence after five years. And rates of TNBC recurrence drop significantly after three years, so TNBC patients who have reached five years without recurrence often face better long-term odds than those with other forms of breast cancer.


It is a disease to take seriously, but who doesn’t take cancer seriously? Patients and survivors don’t need frightening words for effect. They are frightened enough.


Triple-negative breast cancer gets its name because tumors of this subtype lack receptors for estrogen, progesterone, and Her2/neu (human epidermal growth factor receptor 2). The significance of being negative for these receptors is that TNBC lacks a targeted therapy, such as tamoxifen, which blocks the effects of estrogen; Arimidex, which prevents the production of estrogen; or Herceptin, which treats Her2-positive tumors.


So, TNBC is a disease defined by what it lacks. But, thanks to research on the human genome, it may soon be defined by its specific molecular characteristics. And treating something you can define is a whole lot easier than treating what you can’t.


Patricia Prijatel is author of Surviving Triple-Negative Breast Cancer, published by Oxford University Press. She is the E.T. Meredith Distinguished Professor Emerita of Journalism at Drake University. She is doing a webcast with the Triple Negative Breast Cancer Foundation on 16 October 2012.







Bob Chilcott on choral workshops

Bob Chilcott talks to Oxford University Press about why he likes running choral workshops, the challenges that these workshop days present, and what he hopes singers take away.


[Embedded video: Bob Chilcott on choral workshops]


Bob Chilcott has been involved with choral music all his life, first as a Chorister and then a Choral Scholar at King’s College, Cambridge. Later, he sang and composed music for 12 years with the King’s Singers. His experiences with that group, his passionate commitment to young and amateur choirs, and his profound belief that music can unite people, have inspired him both to compose full-time and, through proactive workshopping, to promote choral music worldwide. Two of his works featured in this video are: “Jesus, Springing” and “A Little Jazz Mass.” You can follow him on Twitter at @BobChilcott.




Oxford Sheet Music is distributed in the USA by Peters Edition.





What is marriage?

By Matthew Grimley



As I write, a committee is meeting to decide which two names to submit to the British prime minister for the post of archbishop of Canterbury. Whoever gets the job, a major issue that he will have to deal with is that of gay marriage, which the British government has pledged to introduce, and which the Church of England, along with most other religious confessions in Britain, opposes. The current debate about gay marriage forces all religions, as well as the government and the general public, to re-examine both their views on homosexuality, and their definitions of exactly what marriage is.


This may be a good moment for a spot of historical perspective on how British Christians’ views on marriage and sexuality have changed over the past century. An update from the Oxford Dictionary of National Biography today focuses on the men and women who shaped the twentieth-century churches. Of the 60 featured biographies, five in particular help us chart these changes. These new biographies offer a useful corrective to two prevalent but false assumptions in the media. The first is that the position of the Christian churches on sexual ethics is automatically reactionary; the new Oxford DNB lives display a much more nuanced and conflicted picture than that. The second is that arguments between Christians about homosexuality are new. As one of the new biographies shows, modern debate in the Church of England about homosexuality can be dated back to 1952 — 60 years ago this year.


Arthur Herbert Gray


Two biographies by Alana Harris demonstrate the central role Christians played in re-conceiving marriage from the 1920s onwards. Arthur Herbert Gray (1868-1956) was a Presbyterian minister whose service as an army chaplain in the Great War led him to champion the need for sex education. In 1923 he wrote a best-seller entitled Men, Women and God: a Discussion of Sex Questions from the Christian Point of View, which called for “a fuller understanding of the problems of sex … for the enrichment of human life and the glory of God.” In 1938 Gray joined with a Methodist minister, David Mace (1907-1990), to form what became the Marriage Guidance Council (and is now Relate). As a minister in London’s Old Kent Road during the Depression, Mace had been struck that “nothing seemed to equal the power of a truly happy marriage to keep people going under great stress,” an insight that shaped his whole career. Appropriately, Mace did much of his work in partnership with his wife Vera Chapman (1902-2008) and, after emigrating to the USA, the couple became leading figures in the global marriage guidance movement.


While the marriage guidance movement took off after the Second World War, some Christians turned their attention to the issue of homosexuality. One such was Derrick Sherwin Bailey (1910-1984), an Anglican priest and friend of David Mace, who wrote an article for the journal Theology in 1952 calling for the decriminalization of homosexual acts between men. He received a huge postbag on this, and convened a panel of doctors, lawyers, and clergymen to produce a report called The Problem of Homosexuality (1954). Originally intended only for internal church circulation, the report received a much wider readership, even eliciting a fan-letter from the American sexologist Alfred Kinsey. It was a contributing factor to the government’s establishment of the Wolfenden Committee, and to John Wolfenden’s recommendations that homosexual acts between adult males be legalized. Sherwin Bailey was an unassuming railway enthusiast who had been largely forgotten by the time of his death, but by securing the support of senior Anglicans and other opinion-formers for homosexual law reform, he had played a key role in its eventual enactment in England and Wales in 1967.


The permissive reforms of the 1960s prompted two different sorts of grassroots backlash in the 1970s, both reflected in lives now published by the Oxford DNB. The first was the conservative moral backlash, represented by Raymond Johnston (1927-1985), director of the 1970s campaign group, the Nationwide Festival of Light. This organization, whose founders included Malcolm Muggeridge and Mary Whitehouse, campaigned against abortion, pornography, and public manifestations of homosexuality. As Andrew Atherstone’s biography of Johnston shows, the Festival of Light’s theology belonged to the conservative evangelical tradition, but its methods — lobbying MPs, using newspapers and TV, and borrowing the language and music of the 1970s pop festival movement — showed a very modern awareness of the mass media and imitated the counter-culture that it attacked.


The second 1970s backlash was a radical one that rejected the 1960s permissive consensus as too cautious, and instead asserted the rights of identity groups. This was seen in new activist movements like the Gay Liberation Front and the Women’s Liberation Movement, both of which picketed the Festival of Light’s first public meeting in September 1971. (Members of the Gay Liberation Front dressed as nuns gatecrashed the meeting, releasing white mice and stink-bombs in the auditorium.) This new radicalism confronted the Christian churches, too, with the foundation of the Gay Christian Movement in 1976. The first president of this movement, Peter Elers (1930-1986), vicar of Thaxted in Essex, is the subject of a new ODNB biography by Julian Litten. A married father of four, he caused controversy in 1976 by coming out as gay at a conference, and by allegedly blessing two lesbian partnerships in Thaxted Church. The following year he again declared his homosexuality on a BBC TV documentary. Thaxted was a parish celebrated for its radical Christian socialist tradition, but gay liberation was a cause too far for many parishioners, who campaigned to have Elers removed as their vicar. Peter Elers kept his job, but the episode led conservative evangelicals like Johnston to become increasingly insistent that the Anglican hierarchy distance itself from the demands of gay rights protesters. The battle between supporters and opponents of gay rights in the Church of England has been going on ever since.


Matthew Grimley is Fellow and Tutor in Modern History at Merton College, Oxford. He was associate editor of the Oxford DNB’s ‘20th Century Churches’ update, published on Thursday 27 September 2012. Highlights from the ODNB update are available, along with an introduction and a full list of the 124 new biographies now added to the dictionary.


The Oxford DNB online is freely available via public libraries across the UK. Libraries offer ‘remote access’ allowing members to log-on to the complete dictionary, for free, from home (or any other computer) twenty-four hours a day. In addition to 58,000 life stories, the ODNB offers a free, twice monthly biography podcast with over 130 life stories now available. You can also sign up for Life of the Day, a topical biography delivered to your inbox, or follow @ODNB on Twitter for people in the news.




Image of Arthur Herbert Gray, by Elliott & Fry, whole-plate glass negative. © National Portrait Gallery, London. Used with permission.





September 26, 2012

Monthly etymology gleanings, part 1, September

By Anatoly Liberman



First and foremost, many thanks to those who have sent questions and comments and corrected my mistakes. A good deal has been written about the nature of mistakes, and wise dicta along the errare humanum est lines have been formulated. Yes, to err is human, but it is the stupidity and “injustice” of some mistakes that are particularly vexing. Once something outrageously innocuous has crept into your text, it tends to stay there. God forbid, if you write marital instead of martial or pubic instead of public! You may reread your masterpiece five times before dispatching it (this is what I always do) and still not notice the horror.


For example, I meant to say that the fish name bass was for a long time pronounced as base (which is true) but wrote that it is pronounced so, though the bass happens to be the fish that we most often buy at a local store, and I know the word’s pronunciation as well as anyone around. Or, to be absolutely sure, I checked the spelling of teléfone in Spanish, and, with the dictionary open before me, wrote calmly that only Italian has adopted f in this word. In one of my books, I referred to Kipling’s Just So Stories and explained why the giraffe (!) has a hump. I have known those tales as long as I can remember. What is the blend of camel and giraffe? Carafe?


My work on an etymological dictionary began with the word heifer. If I were interested in having a gravestone commemorating me (fortunately, I am not), I would like to have an image of a frolicsome heifer on it. Now, in A Bibliography of English Etymology, a volume I have edited with utmost care, the definition of heifer is short and sweet: “pig.” How did it get there? One of my brightest assistants remarked darkly: “Sabotage.” In the years to come, who will believe that I could distinguish between a young cow and a piglet? My only comfort is that, according to a Russian saying, the sword does not cut off a repentant head, and, as can be seen, I am brimming over with repentance. More about cutting (s)words will be said below.


As could be expected, my recent post on spelling reform inspired many questions. This part of the “gleanings” will deal only with spelling.


Sword: why does it have w in the middle? Sword was once pronounced approximately as swoord, and w before oo tends to disappear, even though we have swoop and swoon, which did not merge with soup and soon. But as “recently” as the end of the eighteenth century, men of fashion pronounced woman as ooman. (I wonder whether they ’ooed them.) Another word of the same type, with w reflecting a pronunciation that has been dead for a long time, is two.


Four versus forty. This is a recurring question. I have heard and answered it nearly as often as the one about why flammable and inflammable are synonyms rather than antonyms (because in- can sometimes function as a reinforcing prefix; no one thinks that inflammation is the opposite of flame, so why bother about inflammable?). A serious explanation of the four/forty difference is rather technical, and I’ll give it here in a nutshell. English had two long o sounds (long o means the vowel of law, except in the dialect of those who pronounce law as la). In Middle English, the vowel o was long and closed in both four and forty. Yet in forty (a disyllabic word) it became short and then again long but open before r. The editors of the first volumes of the OED still made this distinction. The spelling of forty reflects the shortened open variant. The same holds for fortnight “two weeks,” that is, “fourteen nights.” Fourteen should have been spelled like forty but retained its ou under the influence of four.


Another perennial cause of wonderment is gh in cough, ought, tough, thorough, and through. Here again I would rather avoid details and confine myself to the essentials. Middle English had the sound we hear in Scots loch or in German ach. There was no letter for it, and scribes used the digraph gh, in which g referred to the place (back in the mouth) and h to the manner of articulation (a breathed sound, a fricative). That consonant dropped out of the language. It either moved all the way to the front of the mouth and became f (hence tough, cough, and enough) or became h and was shed (hence ought, thorough, and through). The spelling of all such words is conservative, and gh gives us grief, much to the joy of those who think that spelling should be a branch of an etymological museum.


The ei ~ ie problem. It is regrettable that we have retrieve, receive, believe, weird, and their likes. The order of the letters in such words is usually arbitrary. Even when it is not, we may be puzzled. For example, in sieve the vowel was at one time long, and e designated its length, but later that vowel underwent shortening. Today, sieve looks as meshy and as messy as seive might.


Circuit: Why ui? Because this is the way it is spelled in French! The same holds for conduit and pursuit. Other ui words are even worse. Recruit has no justification (compare German Rekrut and Dutch recruut).


Vacuum. This spelling needn’t bother us, for vacuum hardly rhymes with boom, loom, zoom; it is usually pronounced in three syllables. Even ui in vacuity looks fine for a change (the point of syllable division in it, as, for example, also in suicide, falls between u, pronounced long, and i).


Are there many words spelled alike but pronounced differently in British and American English? Yes, very many. One can pinpoint whole groups, as, for example, those ending in -ile (missile, imbecile, versatile: their last syllable sounds like isle in British and like ill in American English), and countless words like herb (homonymous with Herb in British but h-less in American), and schedule (with sh- in British). Often stresses differ (as in British weekend, stressed on the second syllable, versus American weekend, stressed on the first). British English is prone to the diphthongal pronunciation of i. Direction (die-rection) is not the only case. Yet privacy in British English has the same first syllable as in privilege. I have heard mythology pronounced as my-thology. Get a British pronouncing dictionary alongside its American counterpart and enjoy comparing them.


In my post, I inveighed against the use of the digraph ph (I wish it were digraf!) and wrote that phony was a perfect example of iconicity: the word means “sham” and its spelling is absurd. When in a talk show I mentioned weird as a similar case (a word that has a weird spelling and means “bizarre”), I was asked whether a term exists for such cases. No, I conjured up the entire situation myself. But for want of an established term, I suggest calling such phony and weird words graphically iconic. Incidentally, I thought of a single instance of ph being used ingeniously. Consider the word phishing: it suggests fishing for dupes but it has no reference to a fisherman’s (fisherperson’s or simply fisher’s, as in German?) occupation.


Are there languages in which spelling reforms are frequent? I am not aware of such languages. Spelling reform is a major social upheaval, and every attempt to institute it meets with fierce opposition. The most recent example is the reform in Germany, which was approved, stalled, and ultimately carried out. However, examples of several waves of spelling reform are known. This returns me to Masha Bell’s letter, posted as a comment to my blog. It is true that the abolition of ph would be a minor and insignificant step in the right direction, but my suggestion reflects what administrators would probably call my philosophy. If we ever succeed in changing the absurdities of English spelling, we will achieve our aim only by moving slowly. Those who studied Roman history have heard about the Second Punic War and about Fabius Maximus, the great cunctator (“delayer”). I believe that in our life revolutions should be resorted to once all the other means have been exhausted, and even then after a good deal of deliberation. So I will offer a maxim, inspired by Fabius Maximus. Don’t frighten the public with hav and giv (have, give). Nibble at it. Choose the woodpecker, not the eagle, as your symbol. Get rid of the nonsense that no one needs and almost no one will fight for (rubbish like ph), stop using redundant double letters, and go on step by step.


The major problem is known from the fable “Belling the Cat.” (The pun on Masha’s family name is fortuitous.) A company of mice did not know what to do with the cat, and one of them suggested that a bell should be put on the enemy. It would ring and warn the mice of the cat’s presence. The motion was carried unanimously and with great enthusiasm. But then an old mouse asked: “A good idea, but who will bell the cat?” The English-speaking world has no academy invested with the authority to reform spelling. Apparently, we would need an act of Parliament/Congress or some other legislative action. I no longer remember who was responsible for implementing Noah Webster’s suggestions in the United States (color, the suffix -ize, and so forth), but if the sky did not fall after honour and colour lost their French u, we will survive telefone and even til spelled like until. I wish some candidate for the presidency of the United States would announce that, if elected, he or she would fight for simplifying English spelling. I am sure this person would win the election in a landslide. In a happy and literate society, economy and foreign policy would take care of themselves as a matter of course.


Uncertain about what to do with the word fisherman, I decided to offer a picture of a fisherman and his wife. She was not a fisherwoman, but this detail can be disregarded.


Grimm, Jacob and Wilhelm. The Fairy Tales of the Brothers Grimm. Mrs. Edgar Lucas, translator. Arthur Rackham, illustrator. London: Constable & Company Ltd, 1909.


TO BE CONTINUED NEXT WEEK.


Anatoly Liberman is the author of Word Origins…And How We Know Them as well as An Analytic Dictionary of English Etymology: An Introduction. His column on word origins, The Oxford Etymologist, appears here, each Wednesday. Send your etymology question to him care of blog@oup.com; he’ll do his best to avoid responding with “origin unknown.”







West Side Story, 55 years later

By Meghann Wilhoite



Today marks the 55th anniversary of the Broadway premiere of Leonard Bernstein’s West Side Story.


A racially charged retelling of Romeo and Juliet, West Side Story is set in the “blighted” West Side of 1950s Manhattan, the potent themes of star-crossed love and gang rivalry successfully translated from 16th-century Italy to 20th-century New York by book-writer Arthur Laurents and lyricist Stephen Sondheim. The premiere was timely. One day before the curtain rose on West Side Story, America had witnessed a key event in its Civil Rights Movement with the forced integration of Central High School in Little Rock, Arkansas.


Five years later, Bernstein, then music director of the New York Philharmonic, moved the orchestra’s home from Carnegie Hall to nearby Lincoln Center, the construction of which had cleared away much of the neighborhood portrayed in the musical, ironically displacing Tony and Maria’s real-life counterparts as part of Robert Moses’ urban renewal program.


Carol Lawrence and Larry Kert running down the street in promotional photo for West Side Story. (1957) Source: NYPL Friedman-Abeles Collection.


Yes, in many ways West Side Story reflected the drama of its time, flying in the face of racial taboo by bringing together a Puerto Rican girl and a Polish American boy, while also bringing together ballet and popular dance, opera and musical theater, with the attendant implications regarding class and the perception of “high” vs. “low” art.


Of course, when I first saw the West Side Story movie as a child, none of this factored into my perception of the musical, at least not consciously. What jumped out at me was the beauty of the dancers’ movement and the infectious, sometimes tragically beautiful music.


Even in college, when I played piano and celesta in Bernstein’s suite of orchestral movements from the musical, called Symphonic Dances from West Side Story, all I could focus on was the music. I still get a thrill when I think about playing that piece — my heartbeat literally quickens a bit — with its catchy dance tunes and heart-wrenching arias (for an example of the latter, watch this, starting about 0:35).


[Embedded video: an excerpt from the Symphonic Dances from West Side Story]


Looking back, I see now that West Side’s socio-political message was probably a larger part of my appreciation of the work than I realized at the time. Raised in a blue-collar environment, I was mostly surrounded by the culture of rock music (with a little polka here and there), while my mother and grandmother instilled in me an appreciation of Mozart and Beethoven, which ultimately led to hours and hours on the piano bench. My young life revolved around a mix of “high” and “low” art, and West Side’s expert mixing of “popular” and “art” music probably appealed to me far more than I realized.


Bernstein, also from a blue-collar background, was perhaps the first composer to endeavor to truly marry the compositional rigor of art music with the vibrant spontaneity of popular music, and to see the result earn widespread and lasting influence. The 2009 Broadway revival of West Side Story sold over one million tickets during its two-year run. This clip from Bernstein’s 1973 Harvard lectures perfectly summarizes his views on the relationship of high and low art, with prophetic pronouncements regarding a “great new era of eclecticism.”


[Embedded video: Bernstein’s 1973 Harvard lectures]


Bernstein’s overarching use of words such as dignity, passion, and Earth (by which he means the harmonic series, and, ultimately, tonality) to describe the coming era of eclectic music may go against the grain of postmodern, relativistic thinking, but, as I witness more and more young composers attempting to fuse their formal education with their “popular” music backgrounds, I’m inclined to think Bernstein a true prophet.


Meghann Wilhoite is an Assistant Editor at Grove Music/Oxford Music Online, music blogger, and organist. Follow her on Twitter at @megwilhoite. Read her previous blog posts: “Saving Sibelius: Software in peril,” “The king of instruments: Scary or sleepy?” and “John Zorn at 59.”


Oxford Music Online is the gateway offering users the ability to access and cross-search multiple music reference resources in one location. With Grove Music Online as its cornerstone, Oxford Music Online also contains The Oxford Companion to Music, The Oxford Dictionary of Music, and The Encyclopedia of Popular Music.







Why should we care about what we call cancer care?

By Matti Aapro



Supportive care and palliative care: two terms that I often use when talking about cancer care. Without consulting the dictionary one might say that palliation means alleviation, or decrease, while supportive means sustaining. Apparently contradictory terms? Really? Come on, be creative and follow me. It is time for us to stop placing these concepts in opposition to each other.


Many have, for too long, tried to divide medical care into isolated compartments. The present politically correct world says that one should have a “holistic” and “multidisciplinary” approach. What is your opinion? Mine is that good medicine has always been about taking care of the patient, in all the areas where help is needed. In some fortunate countries there has been a multitude of possibilities for this, without exclusivity. But in these same countries some medical practitioners are happy to isolate themselves in their specialty, excellent in their micro-environment but forgetting that the patient is not simply a disease, but a human being.


So what is the difference between the two aspects of patient care I mention? Where in cancer care does “support” end and “palliation” begin?



This topic has been an area of debate for a long time, but one has to realize that there is a continuum in patient care and that it is unacceptable to stay in an ivory tower. Even if historical reasons have led to the development of separate specialist groups that have dedicated their expertise more towards issues frequent at the end of life (often called palliative care), or more towards treatment management and post-treatment issues (often called supportive care), this is not a reason to continue looking at each other with diffidence, like chiens de faïence, as we would say in French (like disdainful china dogs). According to the Multinational Association for Supportive Care in Cancer (MASCC), of which I had the honour of being president a few years ago, supportive care includes management of physical and psychological symptoms and side effects across the continuum of the cancer experience, from diagnosis through anticancer treatment to post-treatment care.


Annals of Oncology is the official journal of the European Society for Medical Oncology (ESMO), which has also recognized the importance of these approaches for a long time and encourages the development of integrated centers of excellence. So, if ESMO supports this concept, why is it not more widely embraced? What are we waiting for?


The word used to designate these activities is not trivial. Obviously, translation into various languages and cultures renders the discussion even more complex, but for several years there has been a negative perception of palliation. Many have felt that palliation is synonymous with patient abandonment by the “active” treatment team. Thus a simple name change from palliative care unit to supportive care unit has been reported to be associated with more inpatient referrals and earlier referrals in the outpatient setting.


There has been much recent interest in palliative care as a means to support patients and have an impact on survival, an interest sparked by the publication of a study evaluating patients with advanced non-small-cell lung cancer, which reported that early palliative care in these patients improves quality of life, mood, and survival despite less aggressive end-of-life care compared with standard oncology care alone. The American Society of Clinical Oncology (ASCO) panel’s expert consensus on this matter concluded not long ago that combined standard oncology care and palliative care should be considered early in the course of illness for any patient with metastatic cancer and/or high symptom burden.


In the end, but not just in the end, it is patient care that is important.


Dr. Matti Aapro is Dean of the Multidisciplinary Oncology Institute, Genolier, Switzerland. He serves as Executive Director of the International Society for Geriatric Oncology (SIOG). Dr. Aapro is Editor-in-Chief of Critical Reviews in Oncology/Hematology, Associate Editor of Annals of Oncology, and founding editor of the Journal of Geriatric Oncology. His recent editorial, “Supportive care and palliative care: a time for unity in diversity,” has been made freely available for a limited time by Annals of Oncology.


Annals of Oncology is devoted to the rapid publication of editorials, reviews, original articles, and letters related to oncology, particularly medical oncology. Annals of Oncology is the official journal of the European Society for Medical Oncology. Since 2008 the journal has also been affiliated with the Japanese Society of Medical Oncology.




Image credit: Doctor with senior female patient discussing treatment. Photo by monkeybusinessimages, iStockphoto.





September 25, 2012

Afghanistan 2013: The road narrows

By Andrew J. Polsky



Three recent developments in Afghanistan underscore the difficulty that will confront the next American president, whether he is Barack Obama or Mitt Romney. First, as Secretary of Defense Leon Panetta announced, the last of the 33,000 additional troops sent to Afghanistan by President Obama two years ago to quell the revived Taliban insurgency have now returned home. The total US force of 68,000 is about the same as in 2009, as is the 100,000 figure for NATO troops overall. Second, despite the obligatory claims that the surge accomplished its purpose of degrading the Taliban, insurgents remain capable of launching spectacular attacks nearly anywhere in Afghanistan. On 14 September 2012, fifteen militants penetrated Camp Bastion in Helmand Province, killed two US Marines, and inflicted more than $200 million in damage. Third, after repeated attacks by Afghanistan government soldiers and police on American and other NATO troops, commanders of the outside forces have suspended most joint operations. More than fifty NATO soldiers have died this year alone in so-called green-on-blue incidents.


Of the three, the last is the most troubling because it undercuts the rationale for a continuing large-scale external presence in support of the Kabul regime. The NATO program has envisioned a transition to complete Afghan responsibility for security operations in 2014. President Obama has promised that the American combat mission would end at that point, though he has offered an ongoing security partnership with Kabul. I have argued that his approach, a reflection of the limited willingness of the American people to continue supporting the war, made sense. But if US forces can no longer operate safely in tandem with Afghan security personnel, then it becomes unclear what useful purpose is served by an ongoing large-scale NATO commitment.


Add to this the continuing toll in civilian deaths resulting from NATO military operations, a running sore in Afghan public opinion. Thus far in 2012, some 200 civilians have been killed by foreign troops, about half the number at this point in 2011, a decline that reflects tighter rules of engagement. Nevertheless, the losses have inflamed hostility toward the outside forces. Lest he be accused of being a puppet of the West, Afghan President Hamid Karzai has condemned a recent bombing attack that reportedly killed eight women in Laghman province.


A US Marine with Alpha Company, 1st Battalion, 6th Marine Regiment, patrols in Helmand province, Afghanistan, 30 Nov. 2011. US Marine Corps photo by Cpl. James Clark/Released


Indeed, external military involvement may have reached the point of diminishing returns. Although President Karzai and other Afghan officials have tried to pass off the violent episodes between government soldiers and police and their NATO co-belligerents as the work of Taliban provocateurs, the evidence suggests that most of the attacks do not involve the Taliban. Instead they appear to stem from the vast cultural differences between the Afghan recruits and their Western “embedded team trainers” (ETTs). A dozen years into the war, moreover, the problem has been getting worse, not better. Given the trend, continuing the training mission seems most likely to exacerbate the problem. General Martin Dempsey, chairman of the US Joint Chiefs of Staff, has acknowledged that the insider attacks require a shift of some kind in the American approach.


But what options remain? No American leader of either party contemplates escalation. Although Mitt Romney has spoken of slowing the pace of withdrawal, no larger political purpose can be served by causing the deaths of a few more Taliban militants. The situation illustrates a broad pattern of wartime presidential leadership — the ongoing loss of agency that a commander in chief experiences as a conflict continues. At a certain point, a president ceases to exercise meaningful choices. All signs indicate that we have reached this point in Afghanistan.


Much as we might want the two candidates to address the future course of Afghanistan policy before Election Day, they are unlikely to do so. I expect the subject to be raised in the upcoming presidential debates, of course. But President Obama will do no more than defend his policy of gradual disengagement, while Mitt Romney may look to score some points by criticizing his rival for announcing a withdrawal timetable that let the Taliban wait out the American departure. As to what should come next, however, vague generalities will be the order of the day.


Come 20 January 2013, the question of what comes next in Afghanistan should no longer be evaded. If nothing else, it will be difficult to sidestep the ongoing costs of the war in a time of painful budget choices. The new president needs to make clear to the American people how a continuing large-scale US and NATO military presence in Afghanistan connects to a realistic and achievable political goal. Most of all, as commander in chief, he owes it to the troops scheduled for deployment in the next two years to explain why their sacrifices remain vital to American security and what mission he expects them to accomplish. If he can’t do that, then he should announce that they won’t be sent.


Andrew Polsky is Professor of Political Science at Hunter College and the CUNY Graduate Center. A former editor of the journal Polity, his most recent book is Elusive Victories: The American Presidency at War. Read Andrew Polsky’s previous blog posts.












