Oxford University Press's Blog, page 833
March 19, 2014
Gloomy terrors or the most intense pleasure?
In 1814, just two hundred years ago, the radical philosopher Jeremy Bentham (1748–1832) began to write on the subject of religion and sex, and thereby produced the first systematic defence of sexual liberty in the history of modern European thought. Bentham’s manuscripts have now been published for the first time in authoritative form. He pointed out that ‘regular’ sexual activity consisted in intercourse between one male and one female, within the confines of marriage, for the procreation of children. He identified the source of the view that only ‘regular’ or ‘natural’ sexual activity was morally acceptable in the Mosaic Law and in the teachings of the self-styled Apostle Paul. ‘Irregular’ sexual activity, on the other hand, had many variations: intercourse between one man and one woman, when neither of them were married, or when one of them was married, or when both of them were married, but not to each other; between two women; between two men; between one man and one woman but using parts of the body that did not lead to procreation; between a human being and an animal of another species; between a human being and an inanimate object; and between a living human and a dead one. In addition, there was the ‘solitary mode of sexual gratification’, and innumerable modes that involved more than two people. Bentham’s point was that, given that sexual gratification was for most people the most intense and the purest of all pleasures and that pleasure was a good thing (the only good thing in his view), and assuming that the activity was consensual, a massive amount of human happiness was being suppressed by preventing people, whether from the sanction of the law, religion, or public opinion, from engaging in such ‘irregular’ activities as suited their taste.
Bentham was writing at a time when homosexuals, those guilty of ‘the crime against nature’, were subject to the death penalty in England, and were in fact being executed at about the rate of two per year, and were vilified and ridiculed in the press and in literature. If an activity did not cause harm, Bentham had argued as early as the 1770s and 1780s, then it should not be subject to legal punishment, and had called for the decriminalization of homosexuality. By the mid-1810s he was prepared to link the problem not only with law, but with religion. The destruction of Sodom and Gomorrah was taken by ‘religionists’, as Bentham called religious believers, to prove that God had issued a universal condemnation of homosexuality. Bentham pointed out that what the Bible story condemned was gang rape. Paul’s injunctions against homosexuality were also taken to be authoritative by the Church. Bentham pointed out that not only did Jesus never condemn homosexuality, but that the Gospels presented evidence that Jesus engaged in sexual activity, and that he had his male lovers — the disciple whom he loved, traditionally said to be John, and the boy, probably a male prostitute, who remained with Jesus in the Garden of Gethsemane after all the disciples had fled (for a more detailed account see ‘Not Paul, but Jesus’).
Bentham was writing after Malthus had in 1798 put forward his argument that population growth would always tend to outstrip food supply, resulting in starvation and death until an equilibrium was restored, whereupon the process would recommence. Bentham had been convinced by Malthus, but Malthus’s solution to the problem, that people should abstain from sex, was not acceptable to him. He pointed out that one advantage of non-procreative sex was that it would not add to the increase of population. Bentham also took up the theme of infanticide. He had considerable sympathy for unmarried mothers who, because of social attitudes, were ostracized and had little choice but to become prostitutes, with the inevitable descent into drink, disease, and premature death. It would be far better, argued Bentham, to destroy the child, rather than the woman. Moreover, it was kinder to kill an infant at birth than allow it to live a life of pain and suffering.
Bentham looked to ancient Greece and Rome, where certain forms of homosexual activity were not only permitted but regarded as normal, as more appropriate models for sexual morality than that which existed in modern Christian Europe. Bentham attacked the notion, still propagated by religious apologists, that homosexuality was ‘unnatural’. All that ‘unnatural’ meant, argued Bentham, was ‘not common’. The fact that something was not common was not a ground for condemning it. Neither was the fact that something was not to your taste. It was a form of tyranny to say that, because you did not like to do a particular thing, you were going to punish another person for doing it. Because you thought something was ‘disgusting’ did not mean that everyone else thought it was disgusting. You might not want to have sex with a sow, but the father of her piglets thought differently.
These writings were, for Bentham, a critical part of a much broader attack on religion and the ‘gloomy terrors’ inspired by the religious mentality. By putting forward the case for sexual liberty, he was undermining religion in one of the areas where, in his view, it was most pernicious. Bentham did not dare publish this material. He believed that his reputation would have been ruined had he done so. He died in 1832. He would have been saddened that it still retains massive relevance in today’s world.
Philip Schofield is Professor of the History of Legal and Political Thought in the Faculty of Laws, University College London, Director of the Bentham Project, and General Editor of the new authoritative edition of The Collected Works of Jeremy Bentham. The latest volume in the edition, Of Sexual Irregularities, and other writings on Sexual Morality, was published on 30 January 2014. The research that led to the preparation of the volume was funded by the Leverhulme Trust. The Bentham Project is responsible for Transcribe Bentham, the prize-winning scholarly crowdsourcing initiative, where volunteers transcribe previously unread Bentham manuscripts.
Subscribe to the OUPblog via email or RSS.
Subscribe to only philosophy articles on the OUPblog via email or RSS.
Image credit: Jeremy Bentham, aged about 80. Frontispiece to Jeremy Bentham, Principles of Legislation, edited by John Neal, Boston: Wells and Lilly, 1830. Public domain
The post Gloomy terrors or the most intense pleasure? appeared first on OUPblog.

March 18, 2014
Kathleen J. Pottick on Superstorm Sandy and social work resources
In the aftermath of Superstorm Sandy, one group of dedicated social work scholars at Rutgers University explored options to offer funding and training programs to assist clients who were hit hard. One of their more recent initiatives provided subscriptions to the Encyclopedia of Social Work Online to seven agency directors who needed access to scholarly research to guide their work in the field. We spoke to Kathleen Pottick, professor in Rutgers University’s School of Social Work, who spearheaded this endeavor, to hear the story behind their work.
How did the Rutgers grant for this project come about?
While we were struggling — as a university, as a community, as a state — to recover from Superstorm Sandy in October 2012, our development director recognized an opportunity for funding through foundations supporting Sandy relief efforts in New Jersey. We knew that our students were in field agencies throughout the state serving those directly impacted by the storm, and we wanted to give those students the tools to help those surviving this natural disaster. Many of our survivors were newly needy, and our students had to be trained in dealing with this population.
We believed there might be private foundation backing to develop training programs because they would have an immediate, concrete effect on helping clients. After we received interest from a foundation, several key faculty and staff sat together to map the best strategy that would make the largest impact in our state. We presented our proposal within the week, and they gave us a budget to implement this disaster training and service work.
What was your initial aim?
Our initial aim was to respond quickly to this disaster to assist victims of the event, while, for the first time, integrating graduate social work students into disaster-related clinical and non-profit organization management agencies through field placements. We wanted students to be able to provide concrete services quickly, as well as directed mental health counseling.
How will this work support future goals at the School of Social Work?
Students participating in the program were called Disaster Fellows and given supplemental training on disaster response and disaster mental health counseling, in addition to their usual advanced social work training. They applied their training through supervised field placements, mostly in nontraditional, newly developed agency locations, and we now have a base of alumni who can take those skills to their professional agencies throughout the state, and beyond. There are opportunities to present our work at national and regional conferences to ensure the human experience is not overlooked, as environmentalists and climatologists discuss Sandy’s impact. The school will continue to enhance its reputation within the university as an integral collaborator on current issues, as well as a leader in innovative responses to disaster.
How did you and Cynthia Franklin, Editor of Encyclopedia of Social Work online, come together to start discussing your plans? Why did you select Encyclopedia of Social Work to assist the research of these individuals?
It was important that our agency partners continued to feel support from the School of Social Work as they continued to serve clients in their communities, especially for long-term effects of the Sandy disaster, after our original students were gone. We wanted those agencies to have the most current and comprehensive information at their fingertips. The Encyclopedia of Social Work was the first place we went because we knew that the continuously updated online format, with links to relevant articles and journals, would be most effective for these partner agencies. Our partners could easily search for specific, up-to-the-minute information without thumbing through pages of a static text.
When we told agency directors that they would have access to the Encyclopedia, they were extremely appreciative. The director of the Mental Health Association of New Jersey said, “The Encyclopedia will help us tremendously with our ongoing efforts to enhance the education and training of our recovery community as we continue to work with the survivors of Hurricane Sandy who require ongoing attention. We will also use this resource in Mental Health First-Aid and other disaster-related trainings that are conducted throughout New Jersey.”
Another community agency director stated, “This subscription will be one more useful tool in our ‘resource toolbox’ to have on hand when assisting individuals and families who have been exposed to a traumatic life event such as Hurricane Sandy and who need professional guidance.”
For readers not residing in this part of the country, can you elaborate on how and why Sandy was particularly catastrophic for New Jersey residents?
The superstorm caused unprecedented havoc across a broad swath of the Middle Atlantic region of the United States, and unlike normal storms, its hurricane-force winds did not dissipate quickly once the storm got over land, but extended approximately 100 miles inland. People who had never experienced hardship in their lives were drastically affected, and in New Jersey, many of the survivors of this storm lived in middle- and upper-class areas. A very large number of residents were seeking help from local agencies for the first time. They didn’t know what resources were available to them, nor did they understand the effects this disaster would have on them for months and years to come. In some areas of the state, entire communities were destroyed, and survivors from them had to relocate completely, leaving not just their physical residence, but their neighbors, stores, resources, and their general sense of community. The theme for this year’s National Social Work month, All People Matter, is quite relevant to us in New Jersey because we have stood together and have shown great resiliency as our communities rebuild.
In your opinion, what challenges are social workers facing in the current workforce? What about in education or research settings?
It is not specific challenges, but rather the breadth of them, that poses the major difficulty for the current social work workforce in serving not just traditionally vulnerable families and children, but newly needy classes of clients — be it in responding to increasingly common and devastating natural events, stagnant middle-class income, or the growing maze of governmental programs such as the Affordable Health Care Act. Interdisciplinary collaborations have become necessary so that social workers can function as parts of teams providing coordinated responses that require multiple interventions. Education is necessary to provide them with these skills, and research is necessary to understand the effectiveness of delivery mechanisms.
What are your goals for the coming year? Is this investment in ESW to social agencies just the beginning?
We hope to continue to educate our social work students in disaster work. Through our grant we developed training modules and coursework that can be replicated for interested students in the future. Our goal is to present our work and serve as a model for other workgroups when responding to disaster. Many organizations focus on the physical clean-up process, post-disaster – the debris, the remediation, the coastline. But we want to have social work serve as the helping profession that not only assists people coping with immediate tangible needs, but also that brings awareness to the mental and emotional issues that survivors face, and prepares them for potential future ones.
Any final thoughts?
Universities and non-profit organizations are strategically poised to reach out to philanthropic organizations for financial resources to invest in training a competent workforce for new special areas needing immediate attention. The investment in the Encyclopedia of Social Work for agencies working in the new areas is critical for long-term practice effects.
Kathleen J. Pottick (Ph.D., Michigan, 1982) is professor in Rutgers University’s School of Social Work and Core Senior Faculty at Rutgers’ Institute for Health, Health Care Policy and Aging Research. She has served in a variety of administrative roles in the School of Social Work, including Acting Dean (2011-2013) and Associate Dean for Faculty Development (2009-2011).
The Encyclopedia of Social Work Online is the first continuously updated online collaboration between the National Association of Social Workers (NASW Press) and Oxford University Press (OUP). Building off the classic reference work, a valuable tool for social workers for over 85 years, the online resource of the same name offers the reliability of print with the accessibility of a digital platform. Over 400 overview articles, on key topics ranging from international issues to ethical standards, offer students, scholars, and practitioners a trusted foundation for a lifetime of work and research, with new articles and revisions to existing articles added regularly.
Subscribe to the OUPblog via email or RSS.
Subscribe to only social work articles on the OUPblog via email or RSS.
Image credit: A street in Bay Head, NJ after Hurricane Sandy. © tzam via iStockphoto.
The post Kathleen J. Pottick on Superstorm Sandy and social work resources appeared first on OUPblog.

Thomas Jefferson’s Statute for Religious Freedom
Surprisingly, in a country that cares about its founding history, few Americans know of Thomas Jefferson’s Statute for Religious Freedom, a document that Harvard’s distinguished emeritus history professor Bernard Bailyn called “the most important document in American history, bar none.”
Yet that document is not found in most school standards, so it’s rarely taught. How come? Maybe because it is a Virginia document, passed by Virginia’s General Assembly. Or maybe because its ideas found their way directly into the Constitution’s First Amendment and that seems enough for most Americans.
Why do Bailyn and some others think it so important? Because it was radical, trailblazing, and uniquely American: no government before had ever taken the astonishing stand that it takes. In essence it says that religious belief is a personal thing, a matter of heart and soul, and that government has no right to meddle with beliefs, or tax citizens to support churches they may disavow. That wasn’t the way governments were expected to operate.
Imagine you live in 18th century Virginia: According to a law on the books you have to go to church every day, usually for both morning and evening prayer. If you don’t go you could be whipped, sentenced to work on an oceangoing galley, or worse. And you don’t have a choice of churches. In Virginia the established church is Anglican. That means you are assessed taxes to support that church whether you believe in it or not. You can’t be a member of the legislature unless you are a member of the Established Church. So the legislation of the House of Burgesses reflects the world of its Anglican delegates.
But the colonies are home to independent folk who braved a turbulent ocean to be able to think for themselves. Roger Williams, kicked out of Puritan Massachusetts, talks of “freedom of conscience” and lets anyone who wishes settle in the colony he founds on Rhode Island. The Long Island town of Flushing has a charter that promises settlers freedom of conscience, so leaders there petition New York Governor Peter Stuyvesant concerning the public torturing of Quaker preacher Robert Hodgson. In the Flushing Remonstrance they write, “We desire in this case not to judge lest we be judged.” In Virginia, Jemmy Madison stands outside the Orange County jail and listens as a Methodist preacher, behind bars, spreads the word of the Gospel. Why, Madison asks, are Methodists, Baptists, and Presbyterians being put in jail?
In 1777 Jefferson writes the eloquent Statute for Religious Freedom, introducing it into the legislature in 1779. This is a Revolutionary time and England’s church has to go. Will an established American church succeed it?
Patrick Henry, popular and powerful, introduces a bill that seems enlightened: it calls for public assessments to support all Christian worship. James Madison counters in his remarkable Memorial and Remonstrance Against Religious Assessments, arguing the case for full religious freedom. Most legislators are perplexed. Every nation has its state church. George Washington is on the fence. If citizens are not forced to go to church will they sink into immorality? What is the role of government?
John Locke has written a famous letter on religious tolerance, but Jefferson and Madison are clear: this statute is not about tolerance, it is about full acceptance — and jurisdiction. Jefferson’s point is that governments have no business telling people what they should believe. In his Notes on Virginia he says,
“The legitimate powers of government extend to such acts only as are injurious to others. But it does me no injury for my neighbor to say there are twenty gods or one god. It neither picks my pocket nor breaks my leg. […] To suffer the civil magistrate to intrude his powers into the field of opinion. . .is a dangerous fallacy, which at once destroys all religious liberty.”
Meanwhile Patrick Henry’s bill passes its first two readings. A third and it becomes law. Jefferson writes to Madison from Paris where he is ambassador, “What we have to do is devotedly pray for his death.” (He’s talking about Henry.) Madison takes a pragmatic path. He gets Patrick Henry kicked upstairs into the governor’s chair, where he has no vote. Then he reintroduces Jefferson’s bill for establishing religious freedom. It finally passes, on 16 January 1786.
It turns out that Virginians are no more, nor less, moral than they were when they were forced to go to church. George Washington becomes one of the biggest fans of the idea of religious freedom. In a famous letter to the Hebrew Congregation in Newport, Rhode Island, he says,
“The citizens of the United States of America have a right to applaud themselves for having given to mankind examples of an enlarged and liberal policy–a policy worthy of imitation. . .It is now no more that toleration is spoken of as if it were the indulgence of one class of people that another enjoyed the exercise of their inherent natural rights, for, happily, the Government of the United States, which gives to bigotry no sanction, to persecution no assistance, requires only that they who live under its protection should demean themselves as good citizens.”
Joy Hakim, a former teacher, editor, and writer, won the prestigious James Michener Prize for her series, A History of US, which has sold over 5 million copies nationwide. From Colonies to Country, one of the volumes in that series, includes the full text of Jefferson’s statute. Hakim is also the author of The Story of Science, published by Smithsonian Books. A graduate of Smith College and Goucher College, she has been an Associate Editor at Norfolk’s Virginian-Pilot, and was Assistant Editor at McGraw-Hill’s World News.
Subscribe to the OUPblog via email or RSS.
Subscribe to only American history articles on the OUPblog via email or RSS.
Image Credit: Official Presidential portrait of Thomas Jefferson. 1800. By the White House Historical Association. Public Domain via Wikimedia Commons.
The post Thomas Jefferson’s Statute for Religious Freedom appeared first on OUPblog.

Composer Hilary Tann in eight questions
We asked our composers a series of questions based around their musical likes and dislikes, influences, challenges, and various other things on the theme of music and their careers. Each month we will bring you answers from an OUP composer, giving you an insight into their music and personalities. Today, we share our interview with composer Hilary Tann.

Hilary Tann, photo credit: Lawrence White.
Praised for its lyricism and formal balance, Hilary Tann’s music is influenced by her love of Wales and a strong identification with the natural world. A deep interest in the traditional music of Japan has led to private study of the shakuhachi and guest visits to Japan, Korea, and China. Her compositions have been widely performed and recorded by ensembles such as the European Women’s Orchestra, Tenebrae, Lontano, Meininger Trio, Thai Philharmonic, Royal Liverpool Philharmonic, BBC National Orchestra of Wales, and KBS Philharmonic in Seoul, South Korea.
Which of your pieces are you most proud of and/or holds the most significance to you?
The large orchestral work, From Afar, comes to mind immediately. I especially enjoyed the Korean Broadcast Symphony performance because it captured the sense of the traditional music of Japan so well. From the chamber music repertory, Nothing Forgotten (piano trio) stands out as an Adirondack piece, and from the choral repertory, The Moor (SA) recalls Wales and my interest in sacred music.
Which composer were you most influenced by and which of their pieces has had the most impact on you?
In the early days I was influenced by the music of Roberto Gerhard, especially Libra, Hymnody, and the Concerto for Orchestra. In fact, I began postgraduate work with Jonathan Harvey in Southampton University, studying Gerhard’s oeuvre, and it was this work which initially took me to Princeton University in the United States.
Can you describe the first piece of music you ever wrote?
The Wye Valley for piano. When he interviewed me for BBC Wales, Ian Skidmore called this “the beginning of my tradition of being inspired by nature”. I responded that at age 6 I wasn’t thinking that I was beginning a tradition!
If you could have been present at the premiere of any one work (other than your own) which would it be?
Monteverdi’s Vespers of 1610 — all the pageantry, all those timbres, wonderful!
What might you have been if you weren’t a composer?
A geologist. I grew up when plate tectonics were coming into the public eye and, coming from Wales, rocks have always excited me. But actually, writing music has been at the forefront ever since I can remember.
What is your favourite piece of music in the OUP catalogue that isn’t yours?
Hard to say. . . lots . . . but I loved William Mathias’s Symphony No. 2 (Summer Music) when I first heard it, and John Buller’s Theatre of Memory bowled me over on first listening.
Is there an instrument you wish you had learnt to play and do you have a favourite work for that instrument?
Harp — not so much the Romantic harp, but the works of Turlough O’Carolan — Celtic harp. In fact I did take lessons some 20 years ago before writing From the Song of Amergin (fl, va, hp) and I really enjoyed getting to know the 43-string instrument. (My main instruments are piano and cello, but my hands are small for these, whereas I’m told I have “good” harp hands. Perhaps one day I can return to this haunting sound world.)
Is there a piece of music you wish you had written?
The Bach Cello Suites — especially the preludes and sarabandes. I’ve always loved the narrative solo line and enjoy writing pieces for solo instruments. In fact, I’ve just completed Seven Poems of Stillness for Guy Johnston (Gregynog Festival, June 2013).
Welsh-born composer Hilary Tann lives in the foothills of the Adirondack Mountains in upstate New York where she is the John Howard Payne Professor of Music at Union College, Schenectady. She holds degrees in composition from the University of Wales at Cardiff and from Princeton University. From 1982 to 1995 she held a number of Executive Committee positions with the International League of Women Composers. She was guest Composer-in-Residence at the 2011 Eastman School of Music Women in Music Festival and will be composer-in-residence at the 2013 Women Composers Festival of Hartford.
Subscribe to the OUPblog via email or RSS.
Subscribe to only music articles on the OUPblog via email or RSS.
The post Composer Hilary Tann in eight questions appeared first on OUPblog.

March 17, 2014
An Irish literature reading list from Oxford World’s Classics
By Kirsty Doole
With today being St Patrick’s Day, we’ve taken the opportunity to recommend a few classic works of Irish literature to dip into while you’re enjoying a pint (or two) of Guinness.
Finnegans Wake by James Joyce
Joyce is one of the most famous figures in Irish literature, and Finnegans Wake is infamous for being one of the most formidable books in existence. It plays fantastic games with language and reinvents the very idea of the novel in the process of telling the story of Humphrey Chimpden Earwicker and his wife Anna Livia, in whom the character of Ireland itself takes form. Around them and their dreams there swirls a vortex of world history, of ambition and failure, pride and shame, rivalry and conflict, gossip and mystery.
A Tale of a Tub and Other Works by Jonathan Swift
This was the first major work written by Jonathan Swift. The author explains in a preface that it is the practice of seamen when they meet a whale to throw out an empty tub to divert it from attacking the ship. Hence the title of the satire, which is intended to divert Hobbes’s Leviathan and the wits of the age from picking holes in the weak sides of religion and government. The author proceeds to tell the story of a father who leaves as a legacy to his three sons Peter, Martin, and Jack a coat apiece, with directions that on no account are the coats to be altered. Peter symbolizes the Roman Church, Martin (from Martin Luther) the Anglican, Jack (from John Calvin) the Dissenters. The sons gradually disobey the injunction. Finally Martin and Jack quarrel with the arrogant Peter, then with each other, and separate.
The Playboy of the Western World and Other Plays by J. M. Synge
In The Playboy of the Western World, the action takes place in a public house, when a stranger enters and is persuaded to tell his story. Impressed, the admiring audience thinks he must be very brave indeed to have killed his father, and in turn the young tramp blossoms into the daring rollicking hero they believe him to be. But then his father, with a bandaged head, turns up seeking his worthless son. Disillusioned and angry at the loss of their hero, the crowd turns on the stranger, who tries to prove that he is indeed capable of savage deeds, even attempting unsuccessfully to kill his father again. The play ends with father and son leaving together with the words “Shut yer yelling for if you’re after making a mighty man of me this day by the power of a lie, you’re setting me now to think if it’s a poor thing to be lonesome, it’s worse maybe to go mixing with the fools of earth.”
Dracula by Bram Stoker
One of the greatest horror stories ever written. This is the novel that introduced the character of Count Dracula to the world, spawning a whole host of vampire fictions in its wake. As well as being a pioneering text in horror fiction, it also has much to say about the nature of empire, with Dracula hell-bent on spreading his contagion into the very heart of the British empire. Fun fact: Bram Stoker’s wife, Florence Balcombe, had previously been courted by Oscar Wilde.
The Major Works by W. B. Yeats
W. B. Yeats was born in 1865 and died in 1939. His career crossed the 19th and 20th centuries, from the Romantic early poems of Crossways and the symbolist masterpiece The Wind Among the Reeds to his last poems. Myth and folk-tale influence all of his work, most notably in Cathleen ni Houlihan among others. The importance of the spirit world to his life and work is evident in his critical essays and occult writings, and he also wrote a whole host of political speeches, autobiographical writings, and letters.
The Wild Irish Girl by Sydney Owenson (Lady Morgan)
This is the story of Horatio, the son of an English lord, who is banished to his father’s Irish estate as punishment for gambling debts. There he adopts the persona of a knight errant and goes off in search of adventure. On the wild west coast of Connaught he finds remnants of a romantic Gaelic past: a dilapidated castle, a Catholic priest, a deposed king, and the king’s lovely and learned daughter, Glorvina. In the process he rediscovers a love for the life and culture of his country. Written after the Act of Union, The Wild Irish Girl (1806) is a passionately nationalistic work and an essential novel in the discourse of Irish nationalism. The novel was so controversial in Ireland that the author, Lady Morgan, was put under surveillance by Dublin Castle. There is a bust of Lady Morgan in the Victoria and Albert Museum in London, and the plaque mentions that she was “less than four feet tall.”
In a Glass Darkly by J. Sheridan Le Fanu
This dark collection of five stories was said by none other than Henry James to be “the ideal reading… for the hours after midnight”. Indeed, J. Sheridan Le Fanu himself had a reputation for being both reclusive and rarely seen in the daytime. His fascination with the occult led to his stories being truly spine-chilling, drawing on the Gothic tradition and elements of Irish folklore, as well as on the social and political anxieties of his Anglo-Irish contemporaries.
Kirsty Doole is Publicity Manager for Oxford World’s Classics.
For over 100 years Oxford World’s Classics has made available the broadest spectrum of literature from around the globe. Each affordable volume reflects Oxford’s commitment to scholarship, providing the most accurate text plus a wealth of other valuable features, including expert introductions by leading authorities, voluminous notes to clarify the text, up-to-date bibliographies for further study, and much more. You can follow Oxford World’s Classics on Twitter, Facebook, or here on the OUPblog. Subscribe to only Oxford World’s Classics articles on the OUPblog via email or RSS.
Subscribe to the OUPblog via email or RSS.
Subscribe to only literature articles on the OUPblog via email or RSS.
Image credit: James Joyce. By Djuna Barnes. Public domain via Wikimedia Commons.
The post An Irish literature reading list from Oxford World’s Classics appeared first on OUPblog.

Iraq, detainee abuse, and the danger of humanitarian double standards
Eleven years ago this month the US-led military coalition crossed the ‘line of departure’ from Kuwait into Iraq. The full-spectrum dominance of these forces produced a rapid victory over the Iraqi armed forces. Unfortunately, winning the peace turned out to be far more complex than winning the war (although for the Americans who bore the burden of securing that initial victory there was certainly nothing ‘easy’ about it). Not long after defeating organized enemy resistance, coalition forces began the long process of occupation, counter-insurgency, and return to full Iraqi sovereignty. Each phase of this overall effort seemed to produce never-ending operational and tactical challenges, all of which were mirrored by associated legal challenges.
The Iraq campaign was not, however, conducted in a strategic vacuum. Instead, it was part of a broader US effort to disrupt and disable al Qaeda, the transnational terrorist organization responsible for the devastating September 11th attacks. But while strategically linked to this broader effort, Iraq was – at least in its initial stage, prior to the rise of al Qaeda in Iraq – an operationally distinct undertaking. More significantly, unlike the so-called ‘war on terror’, Iraq was much more of a ‘conventional’ fight, generating legal issues that had long been contemplated and addressed in international humanitarian law. Coalition forces followed well-established rules related to the conduct of hostilities, belligerent occupation, and detainee capture, status, and treatment.
This last category of operational and tactical challenges – dealing with captives and detainees – unfortunately generated what might legitimately be characterized as the My Lai of the Iraq war: the detainee abuse at Abu Ghraib. This incident created a media firestorm and generated unquantifiable levels of criticism of US efforts. Much worse was the negative strategic impact: the Abu Ghraib abuse was perhaps the most significant strategic debacle of the war, and it provided a major stimulant to the then nascent Iraqi insurgency.
Why Abu Ghraib happened was and will continue to be debated for years to come. While the abuse of al Qaeda detainees captured and held outside Iraq was without question responsive to legal opinions and resulting policy decisions emanating from the highest levels of the US government, those policies never explicitly extended to Iraq. What does seem clear, however, is that the prohibitory effect of the law of war, and of the Geneva Conventions more specifically, had been diluted for the US soldiers entrusted with the responsibility to manage and control this detention facility. This dilution ultimately contributed to gross abuses of detainees within the control of the United States and at the complete mercy of their captors. Abuse of power over such individuals should, and must always, engender outrage and condemnation, not merely because of the blatant violation of fundamental humanitarian protections, but because such misconduct is a derogation of the most basic notions of soldier professionalism.
There are important lessons to learn from this incident. These range from the strategic debacles that often flow from violations of the law of armed conflict, to the true meaning of ‘responsible command’ – training, supervising, and correcting subordinates to ensure compliance with all commands, including respecting legal obligations. However, there is another lesson to be drawn from this unfortunate episode: the danger of dehumanization.

Secretary of Defense Donald H. Rumsfeld takes a tour of the Abu Ghraib Detention Center in Abu Ghraib, Iraq, on May 13, 2004. Rumsfeld and Chairman of the Joint Chiefs of Staff Gen. Richard B. Myers are in Iraq to visit the troops in Baghdad and Abu Ghraib. DoD photo by Tech. Sgt. Jerry Morrison Jr., U.S. Air Force. Public domain via defense.gov.
Every US soldier assigned to the Abu Ghraib prison, like every other US service-member who entered the Iraq theater of operations, was instructed to comply with the Geneva Conventions. It was part of their pre-deployment training; it was incorporated into Rules of Engagement cards; it was incorporated into command directives and orders. However, during this same time the United States was prosecuting another conflict against al Qaeda. Unlike the rules applicable to detainees at Abu Ghraib, who were subject to the protections of the Fourth Geneva Convention, the conflict against al Qaeda involved no analogous emphasis on Geneva compliance. Instead, leaders at the highest level of US civilian and military organizations repeatedly emphasized that this enemy was composed of ‘unlawful’ combatants — individuals who had no legitimate claim on the humanitarian protections of the laws and customs of war. Unlike a ‘legitimate’ enemy, this enemy could be subjected to detention and treatment conditions inconsistent with the most basic principle of humane treatment. In short, US forces were applying a genuine double standard: detainees — whether military or civilian — considered ‘legitimate’ received the benefit of the law; those considered ‘illegitimate’ did not.
Encouraging soldiers to view certain enemies as unworthy of the most basic principles of humanity is a recipe for disaster. War involves an inherent need to dehumanize your opponent, an unfortunate necessity to enable soldiers to engage in the even more unfortunate necessity of killing on demand. Most moral beings are naturally averse to killing, and when doing so is not triggered by the survival instinct in response to an imminent threat, that aversion must be overcome. Dehumanization of the enemy serves this purpose.
But these same warriors must be capable of flicking the proverbial humanity switch, restoring the enemy to a status of human being at the moment the enemy is subdued. This is an even more complex task. Asking a soldier to show human mercy to an enemy, who, only moments prior was just trying to kill him, or perhaps just killed his best friend, is an immense leadership challenge. That challenge is facilitated by bright-line rules of war, rules that aid the warrior in navigating this moral abyss.
Diluting the clarity of these bright-line rules is, therefore, terribly dangerous. These rules dictate to soldiers and their leaders that engaging in hostilities is, in the ultimate analysis, not ‘personal’, but instead an obligation imposed by the State or the non-state group. Thus, in a very real sense, the soldier is not acting in an individual capacity, but as the agent of the military organization ordering the soldier to participate in hostilities. In this capacity, the soldier is restrained from allowing the natural human instincts of vengeance and retribution to undermine the objectives of the organization writ large. The principle of humanity, when extended to captured opponents, implements this core tenet of organized hostilities; the struggle cannot be treated as personal.
There is a lesson here that transcends the reminder that detainee abuse produces profound negative strategic and tactical consequences. That lesson is that preserving and reinforcing the bright-line rules of humanity in warfare demands that distinctions between ‘categories’ of captured opponents never be intended or perceived as justification for treatment inconsistent with this core principle. When that occurs, the dilution may, and often will, very quickly infect the treatment of individuals granted a more protective status. This is precisely what happened when the United States authorized abusive treatment of unlawful combatants. Although none of the detainees in Iraq fell into that category, the broader message signaled by senior US (mainly civilian) leaders was clear: some captives are unworthy of the full protection of the law of armed conflict. Did this contribute to the inhumane treatment inflicted upon Iraqi detainees? It seems almost self-evident that the answer is yes. What is beyond any doubt is that it could not have helped reinforce commitment to the legal obligations that so clearly applied to these victims.
Telford Taylor wrote several decades ago that war does not provide a license to kill; it imposes a duty to kill. But that duty is imposed by the State, and it is subordination to the interests of the State that defines warrior professionalism and permeates the restrictions imposed on warriors by the law of armed conflict. These restrictions serve both military and humanitarian interests, by protecting individuals from gratuitous violence and by facilitating mission accomplishment through the mitigation of resentment and disdain among opponents and potentially hostile civilian populations. But it is easy to understand why these restrictions may frequently be perceived as counter-intuitive for individuals engaged in mortal combat who must, in order to overcome the human aversion to killing, dehumanize their opponents. The States and military leaders who demand this conduct from men and women must, therefore, be vigilant in reinforcing these bright lines and avoid the temptation to extend the dehumanization that is an unfortunate necessity of pre-submission encounters with the enemy to their post-submission treatment. If this is a lesson learned from the Abu Ghraib debacle, then some good will ultimately be derived from that sad incident.
Geoffrey S. Corn is Presidential Research Professor of Law, South Texas College of Law; Lieutenant Colonel (Retired), U.S. Army Judge Advocate General’s Corps. Prior to joining the faculty at South Texas, Professor Corn served in a variety of military assignments, including as the Army’s senior law of war advisor, supervisory defense counsel for the Western United States, Chief of International Law for U.S. Army Europe, and as a tactical intelligence officer in Panama. He is the co-author of The War on Terror and the Laws of War: A Military Perspective with Michael Lewis, Eric Jensen, Victor Hansen, Richard Jackson, and James Schoettler.
Oxford University Press is a leading publisher in international law, including the Max Planck Encyclopedia of Public International Law, latest titles from thought leaders in the field, and a wide range of law journals and online products. We publish original works across key areas of study, from humanitarian to international economic to environmental law, developing outstanding resources to support students, scholars, and practitioners worldwide. For the latest news, commentary, and insights follow the International Law team on Twitter @OUPIntLaw.
Subscribe to only law articles on the OUPblog via email or RSS.
The post Iraq, detainee abuse, and the danger of humanitarian double standards appeared first on OUPblog.

Britain, France, and their roads from empire
When the Second World War ended in 1945, Britain and France still controlled the world’s two largest colonial empires, despite the destruction the war had wrought. Their imperial territories extended over four continents. What’s more, both countries seemed absolutely determined to hold on to their empires; the roll-call of British and French politicians, soldiers, settlers, and writers who promised to defend their colonial possessions at all costs is a long one. Yet within just twenty years, both empires had vanished.
In the two videos below, Martin Thomas, author of Fight or Flight: Britain, France, and their Roads from Empire, discusses the disintegration of the British and French empires. He emphasizes the need to examine the process of decolonization from a global perspective, and discusses how decolonization dominated the twentieth century. He also compares and contrasts the cases of India and Vietnam as key territories of the British and French empires.
Click here to view the embedded video.
Click here to view the embedded video.
Martin Thomas is Professor of Imperial History in the Department of History at the University of Exeter, where he has taught since 2003. He founded the University’s Centre for the Study of War, State and Society, which supports research into the impact of armed conflict on societies and communities. He is a past winner of a Philip Leverhulme prize for outstanding research and a holder of a Leverhulme Trust Major Research Fellowship. He has published widely on twentieth century French and imperial history, and his new book is Fight or Flight: Britain, France, and their Roads from Empire.
Subscribe to only history articles on the OUPblog via email or RSS.
The post Britain, France, and their roads from empire appeared first on OUPblog.

March 16, 2014
Shirley Temple Black: not a personality to be bunked
How does one talk about a child star without lapsing into clichés? Shirley Temple was “the biggest little star,” the “kid who saved the studio,” and, as she was called in the 1930s, “the baby who conquered the world.” Temple, who died 10 February 2014 at the age of eighty-five, was not Hollywood’s first child star and she was by no means the last, but she was inarguably the most important and certainly the most iconic. Temple became a cultural phenomenon as well as a movie star. Reported to be the most frequently photographed person in the world in the 1930s, she eclipsed the success of all previous child actors. The number-one box-office draw in the United States for four years in the mid-1930s, she ranked within the top-ten list of box-office attractions for a record seven years in the Motion Picture Herald exhibitors’ poll. Temple was popular with audiences wherever Hollywood feature films were shown—with the notable exception of France, which never took to her. Her appeal was not just to children, and it was widely asserted that it was the adult public that made her a star.
Mrs. Gertrude Temple, a Santa Monica housewife, groomed her only daughter for stardom. She put three-year-old Shirley in a dance school famous as a conduit into the film industry and styled the child’s blonde hair into the famous 56 sausage curls. After a brief stint in Poverty Row shorts, Shirley Temple rose to stardom in 1934, at age six (her published age was five), after drawing attention for her role in a major feature film, Fox’s Stand Up and Cheer! (1934). In 1934, Hollywood needed a visible renewal of innocence on screen in the wake of threats of a nationwide boycott of the movies by the Catholic Legion of Decency, but the Fox Film Corporation was initially unsure what to do with her. The studio loaned her out to Paramount for what would be her first big hit, Little Miss Marker (1934), a Damon Runyon tale in which she plays an orphan reluctantly adopted by a misanthropic bookie (Adolphe Menjou).
Temple quickly went from featured player to “the name above the title” in musical comedies shaped for her. Temple’s profitability to the Fox Film Corporation, reorganized in 1935 as Twentieth Century-Fox, was tremendous. Her almost two-dozen star vehicles made for the studio usually cost less than $300,000 each to produce but were reputed to have grossed from $1 million to $1.5 million on first-run showings alone.
Her popularity may seem strange now, but watching child performers was nothing new; Temple was part of a tradition. Theatrical entertainments highlighting the spectacle of children were very popular in the late 19th century in the United States and Britain, and in the 1910s Hollywood was filled with child actors, mainly in supporting roles. Temple’s films at Fox have been dismissed as sentimental goo, but there was also something about her — an exciting quality shared with many other mega-stars over the years, an unsettling of boundaries rather than a mere confirming of comfortable truths, whether about gender, sexuality, race, class, or age. Other stars had it too: Greta Garbo, in her androgynous beauty; John Wayne, in his occasional display of almost feminine gentleness. Temple was a charismatic musical star, a beautiful little white girl who was an eager acolyte to the black tap dance artist Bill “Bojangles” Robinson, a fearless, daring tomboy, but also a cuddly daddy’s girl.
While audiences of the 1930s were fascinated with her energy and humor, 21st century cultural commentators, including many feminist film critics, find Temple’s films redolent of pedophilia, as did British novelist and film critic Graham Greene, who was sued by Fox in 1938 for his published comments on her (he lost). Adult desire and its imposition on children is real but so too is the complexity and range of meaning and pleasures located by audiences in the performances of movie stars, including Temple. While Greene thought Temple a fleshy coquette who aroused old men unaware of their own desires, theater critic Gilbert Seldes thought that Temple communicated a refusal to be fooled, to be “bunked,” sparking men’s admiration for and identification with her as a personality. Temple was a symbol of cheerful resilience and America’s most powerfully persuasive common values. In her films she energetically embodied the promise of a more perfect future.
But love cannot last forever, and little girls grow up. In the late 1930s, the luster of both Temple’s curls and her box-office power dimmed. Her contract with Fox was abrogated in 1940. She made one film, Kathleen (1941), at MGM, and a feature for B-picture producer Edward S. Small, Miss Annie Rooney (1942). Both were flops, and she “retired” to finish high school. David O. Selznick then offered her a lucrative contract based on his confidence in George Gallup’s Audience Research Institute: its polling suggested Temple was still beloved, possessing more drawing power than many of the top female attractions in the film industry. With Selznick, Temple had a brief resurgence, including two films in which she played America’s most famous fictional teenager, Corliss Archer. Temple longed to be in more films like Since You Went Away (1944), which marked her successful return to the screen, but her boss could not mount his personal productions quickly enough to satisfy the balance sheet on her salary, and loaning her out often made him a large profit.
By the late 1940s, a young mother entering her twenties, Temple found herself typecast in the role of the teenage bobbysoxer. In 1949, she divorced John Agar, whom she had wed as a seventeen-year-old. She then met Charles Black, an ex-Naval officer and scion of one of California’s most socially prominent families. As recounted in her best-selling 1988 autobiography, Child Star, she was determined not to be fooled again by a man’s good looks: she had Charles Black investigated by her friends in the FBI before she walked down the aisle with him in December of 1950. They would remain married until his death in 2005.
Shirley Temple reveled in her role of wife and mother (of three), but took a plunge into politics with an unsuccessful run for a congressional seat in 1967. She had a longtime interest in international affairs, first demonstrated when she asked Selznick to let her go to a world youth conference in the United Kingdom at the height of World War II (he refused). She was appointed in 1969 to the United Nations delegation by Richard Nixon and later served as ambassador to Ghana and then Czechoslovakia. In spite of skepticism, she succeeded in these and other important diplomatic assignments, winning praise from Henry Kissinger as “very intelligent, very tough-minded, very disciplined.” Here was a woman, like the child, who was not a personality to be “bunked.”
Gaylyn Studlar is David May Distinguished Professor of the Humanities at Washington University in St. Louis and author of Precocious Charms: Stars Performing Girlhood in Classical Hollywood Cinema (2013, University of California Press), which features a chapter on Temple, “Cosseting the Nation; or, How to Conquer Fear Itself with Shirley Temple.” She is a contributor to Oxford Bibliographies in Cinema and Media Studies.
Developed cooperatively with scholars worldwide, Oxford Bibliographies in Cinema and Media Studies offers exclusive, authoritative research guides. Combining the best features of an annotated bibliography and a high-level encyclopedia, this cutting-edge resource guides researchers to the best available scholarship across the field of Cinema and Media Studies.
Subscribe to only television and film articles on the OUPblog via email or RSS.
All images courtesy of Gaylyn Studlar.
The post Shirley Temple Black: not a personality to be bunked appeared first on OUPblog.

Harry Nilsson and the Monkees
Singer-songwriter Harry Nilsson worked in the computer department of a California bank throughout the early 1960s. For much of that time, he managed the night shift, clocking on in the early evening and finishing around 1 a.m. Then, instead of going to sleep, he wrote songs all night. Being a man of considerable energy, he spent the daytime hawking his songs around publishers.
There were a few successes. He made a small number of single records himself, and some of his songs were picked up by others, including Phil Spector, who recorded several with the Ronettes and the Modern Folk Quartet but failed to issue the discs at the time.
Then in March 1967, the Modern Folk Quartet’s bassist Chip Douglas began working as a producer for the Monkees. He invited Nilsson to come and demonstrate some of his songs, and the result was that the Monkees, with Davy Jones singing the lead vocal, recorded “Cuddly Toy”.
This became a hit, and Nilsson earned enough in royalties to be able to quit his job at the bank, and begin his own career as a singer. The song also marked the beginning of a long friendship with the Monkees, and particularly Micky Dolenz and Davy Jones, who would, many years later, star in the London stage production of Nilsson’s fantasy musical “The Point”.
In between, there was plenty more music. Nilsson penned “Daddy’s Song” for the Monkees movie Head, and he and Davy Jones appeared together singing in television commercials. There was horseplay in the Los Angeles studios of RCA, when the Monkees and Nilsson found themselves in adjacent booths, recording for the label. And there was hanging out together, as Nilsson would do at Dolenz’s Laurel Canyon home, thinking up ideas for songs, playing parlor games, and having the occasional drink or seven. Nilsson and Micky became such close buddies that when Nilsson went to Ireland to meet his prospective wife’s parents, Micky came along as well. Micky was also one of the many rock stars who borrowed Nilsson’s London apartment (which would later become infamous when both Mama Cass and Keith Moon died there).
Although he was perhaps more famous for his associations with John Lennon and Ringo Starr, Nilsson was equally involved with “America’s answer to the Beatles”.
Alyn Shipton is the award-winning author of many books on music including Nilsson: The Life of a Singer-Songwriter, A New History of Jazz, Groovin’ High: the Life of Dizzy Gillespie, and Hi-De-Ho: The Life of Cab Calloway. He is jazz critic for The Times in London and has presented jazz programs on BBC radio since 1989. He is also an accomplished double bassist and has played with many traditional and mainstream jazz bands.
Subscribe to only music articles on the OUPblog via email or RSS.
Image credit: Monkees disc cover via 45cat.
The post Harry Nilsson and the Monkees appeared first on OUPblog.

March 15, 2014
Leaning in
I am one of the last professional women I know to read Lean In by Facebook COO Sheryl Sandberg (Knopf, 2013). If you are also among the laggards, it is an inspiring call to women to lean into leadership. Too often, Sandberg shows through research and life story, women are not considered “leadership material,” and not just by men. We also send that message to ourselves, and attribute any success to external factors such as luck and the support of others. We just don’t think we have the right stuff to be leaders.
Too bad Sheryl Sandberg has not been to Germantown Avenue in Philadelphia. After studying the communities of faith along that one street—around 88 congregations, the number fluctuating year to year—I found one thing that stumped me. There are a whole lot more women in leadership in these houses of worship than in any national sample of clergy. The most generous research findings suggest that 10-20% of congregations in the United States today are headed by women. In my sample, 44% of communities of faith have female leadership. This phenomenon holds true across the religious spectrum. “Prestigious pulpits” in the historic Mainline Protestant churches are disproportionately occupied by women, but so are the pulpits in small independent African-American churches. Two of the three mega-churches have women as co-pastors. In the third, the associate pastor is a woman and is considered the heir-apparent for the senior position. Two of the three peace churches have women leaders. There are no longer Catholic churches on the Avenue (which would not have women priests in any case), and the two mosques I researched are led exclusively by men. But the small Black spiritualist Hurleyite congregation (Universal Hagar) has a woman as pastor.

Universal Hagar Church, a Hurleyite congregation, is located across the street from Fair Hill Burial Ground. Photo by Edd Conboy. Used with permission.
How can we account for this? It might have something to do with Philadelphia’s cultural history of inclusivity, providing a context in which women broke through the stained glass ceiling in the AMEZ and Episcopal traditions. Perhaps it is more closely related with the Great Migration North, in which women sought out church anchors in neighborhoods in which to settle. Frankly, I am hoping a researcher will figure this out…and bottle it!
More impressive to me than the numbers are the amazing women I interviewed. Women like Pastor Jackie Morrow, who started a church and a school in a row house and ministers to everyone in her corner of Northwest Philly, from the young men who play basketball in her parking lot to the mentally challenged woman who regularly stops by for prayer, food, and a hug. Or Rev. Melanie DeBouse, who pastors in the poorest neighborhood in the city and is teaching young children to “kiss your brain” and older men how to read. Or Rev. Cindy Jarvis, senior pastor at the Presbyterian Church of Chestnut Hill, where she oversees a budget of over a million dollars and has underwritten efforts to prevent gun violence, provide health care for the poor, and sustain a vibrant social and educational program for seniors. These women, and others on the Avenue, are leaning in to take leadership roles not in corporations but in the trenches of gnarly urban problems.
Make no mistake: I like Sandberg’s book. But the clergy women of Germantown Avenue are leaning into stronger headwinds with impressive competence and confidence. They inspired me more.
Katie Day is the Charles A. Schieren Professor of Church and Society at the Lutheran Theological Seminary at Philadelphia. She is the author of Faith on the Avenue: Religion on a City Street and three other books and numerous articles that look at how religion impacts a variety of social realities.
Subscribe to only religion articles on the OUPblog via email or RSS.
The post Leaning in appeared first on OUPblog.
