Oxford University Press's Blog
August 23, 2015
The value of knowledge
Traditionally, the story that opens chapter three of Genesis is called ‘The Fall’. In the Christian tradition, both the name and the interpretation of the story associated with it were made canonical by Saint Augustine in the first decades of the fifth century AD, about fourteen hundred years after Genesis was written down. The interpretation, which derives essentially from Paul’s letter to the Romans, is as follows.
Before they ate the knowledge-giving fruit, Adam and Eve were, we are told in the last verse of chapter two, “naked and not ashamed”. (According to Augustine, their nakedness was not shameful for the odd reason, which has no basis in the Bible, that the physical signs of sexual arousal were until then under their voluntary control.)
Satan, a fallen angel, envious of man’s innocent and non-fallen state, chose the serpent to “insinuate his persuasive guile into the mind of man” because “being slippery, and moving in tortuous windings, it was suitable for his purpose” (City of God, 14.11).
God had told Adam he would die if he ate the fruit, but Eve was persuaded by the serpent that the threat was empty, and that if she ate the fruit she would herself become like a god. Adam was not persuaded, but he yielded to Eve, “the husband to the wife, the one human being to the only other human being” (14.11).
Augustine acknowledges that it may not be immediately obvious that Adam and Eve committed an act of “great wickedness” (14.12). But he insists that we should not think that the sin was a small and light one, because it was committed about food. On the contrary, “obedience is the mother and guardian of all the virtues”, and preferring to fulfil one’s own will, instead of the Creator’s, “is destruction” (14.12).
This is how Augustine summarizes his interpretation of the story: Adam and Eve committed such a great sin “that by it human nature was altered for the worse, and was transmitted also to their posterity, liable to sin and subject to death” (14.1).
This is the orthodox interpretation of the story in the Christian tradition, and the canonical interpretation in the Roman Catholic church. But it cannot be right.
First, nakedness was not regarded as a sign of blissful innocence when the story was originally told and written down, but as primitive and animal-like. It is extraordinary that commentators continue to miss this point. For example, the Cambridge New Bible Commentary on Genesis glosses the last verse of Genesis 2: “Although they were ‘naked’ there was no shame in it.” But this is not what the verse says. It says, “And they were both naked, the man and his wife, and not ashamed”, which is of course quite different.

Second, Satan is not mentioned in the story. He appears in Jewish writings about four centuries after Genesis was written down; and there, in Job for example, he is clearly subordinate to God and unable to act without his permission. He only emerges as an independent personality, and as the personification of evil, in the first century AD, and the earliest statement in Jewish writings that he was responsible for the Fall is at the end of the first century.
Third, as for the serpent himself, there is no indication in the text that he is wicked. He is described as “arum”, which means crafty, shrewd, or cunning—like the Greek word polymetis, which Homer uses as Odysseus’s epithet. What is clear is that he knows that the humans will not die upon eating the forbidden fruit, but will become “like Gods, knowing good and evil” (Genesis, 3.5), as God himself acknowledges they have done: “Behold, the man is become as one of us, to know good and evil” (3.22).
Fourth, the orthodox interpretation of the story ignores God’s lie. God says to Adam: “Of the tree of the knowledge of good and evil, thou shalt not eat of it: for in the day that thou eatest thereof thou shalt surely die” (2.17). The serpent says: “Ye shall not surely die” (3.4), which turns out to be true. Ever since Paul, commentators have finessed this point by reading “die” as “become mortal” or “become susceptible to eternal death”. But “die” is not used to mean these things anywhere else in the Hebrew scriptures. And besides, the creation story implies that Adam and Eve were mortal before they ate the fruit, because God expels Adam from the Garden of Eden to ensure that he will not become immortal, by eating from the tree of life (3.22-3).
Fifth, it cannot have been wicked or sinful on the part of Adam and Eve to eat the fruit of the tree of knowledge, because when they ate it they did not yet know the difference between good and evil. Of course, they knew they were disobeying God. The story implies that this is something one can know without yet understanding evil, wickedness or sin. And no doubt this is correct. But disobedience in a state of moral innocence or ignorance, even deliberate disobedience—for example, by young children—is not evil, wicked or sinful, regardless of whom one disobeys.
Sixth, knowledge in general, and knowledge of good and evil in particular, are good for human beings. This has always been acknowledged as the greatest obstacle to regarding God’s commandment not to eat the fruit as just, as Milton explains in Book 9 of Paradise Lost, when the serpent advocates disobedience to Eve with consummate forensic skill.
For these reasons, it seems certain that the story was not originally meant to be about human sin and just punishment by a just God. Like the story of Prometheus, it is about a deceitful god who is jealous of human progress and visits the most terrible retribution on the man and woman who take the first perilous and defiant step towards civilized human life, and on the semi-divine character who helps them. In sum, it is the earliest affirmation in our culture of the value of knowledge for human beings, and its indispensable place in human life.
The post The value of knowledge appeared first on OUPblog.

August 22, 2015
The curious case of culprit
Amnesia, disguises, and mistaken identities? No, these are not the plot twists of a blockbuster thriller or bestselling page-turner. They are the story of the word culprit.
At first glance, the origin of culprit looks simple enough. Mea culpa, culpable, exculpate, and the more obscure inculpate: these words come from the Latin culpa, “fault” or “blame.” One would suspect that culprit is the same, yet we should never be so presumptuous when it comes to English etymology. Culprit is indeed connected to Latin’s culpa, but it just can’t quite keep its story straight.
A criminal history
The Norman Conquest in 1066 crowned the French language in England for centuries, forever changing the English tongue as a result, as is particularly evident in vocabulary. As it was the language of the nobility, French became the language of government and administration, including the law courts. Separated from the continent and interacting with English, a variety known as Anglo-Norman French developed, serving as the basis of Law French, the language used in the English courts. Even as the status of French gradually began to ebb in the 13th century, French continued to influence the English language, and Law French continued to mediate the proceedings at the bar until the end of the 17th century.
One such proceeding was the opening of a trial, which relied on a particular legal formula. When a defendant pleaded “not guilty” before the judge, the prosecutor replied, originally in Old French: Culpable: prest d’averrer nostre bille, or Guilty: ready to aver our indictment, as the Oxford English Dictionary (OED) provides. In other words, the prosecutor answered the defendant’s plea of “not guilty” with a charge of “guilty”–and was ready to prove it.
Much as today, the court recorded these proceedings and relied on abbreviations to do so. For this formula, culpable was abbreviated as cul. and prest as prit (or prist), an Anglo-Norman variant. Thus, court rolls read Cul. prit. Over time, familiarity with this French formula faded and the words ran together. The result, culprit, was confused for a way of addressing the defendant.
The OED first attests culprit in the 1678 trial of the Earl of Pembroke as recounted in Cobbett’s Complete Collection of State Trials (1810). According to these State Trials, the government indicted Philip, Earl of Pembroke and Montgomery for beating one Nathaniel Cony to death. A court officer, known as the Clerk of the Crown (Cl. of Cr.), brought the indictment of Philip, Earl of Pembroke (E. of Pemb.), as the State Trials document:
Clerk of the Crown: How say you, Philip earl of Pembroke and Montgomery, Are you guilty of this felony and murder whereof you stand indicted, or not guilty?
E. of Pemb. Not guilty.
Cl. of Cr. Culprit, how will you be tried?
E. of Pemb. By my peers.
Cl. of Cr. God send you a good deliverance.
The presumption of guilt
By the 1700s, culprit was merely naming the accused. Come the latter half of the century, culprit was naming the guilty, knowledge of its specific Law French origin already yielding to a popular assumption that it traced immediately back to Latin’s culpa. Of course, we’re far from misguided in assuming culprit is a direct derivative of culpa. The Anglo-Norman French word responsible for the abbreviated cul., culpable, is indeed from culpabilis, the Latin adjective formed from culpa. That prit re-supplies a p only reinforces the connection. But trials–and etymologies–can indeed be won and lost on technicalities.
On an etymological level, culprit is thus the stuff of a crime drama. The identity of a culprit gets mistaken, the presumption of innocence lost. The fusion of cul. and prit makes culprit a compound noun in disguise. Amnesia, if you will, set in as the original French was forgotten. Etymology seldom has direct evidence at its disposal, but culprit, for all its twists and turns, has a pretty compelling case.
A version of this post first appeared on the OxfordWords blog.
Image Credit: “bokeh link fence” by Will Montague. CC BY NC 2.0 via Flickr.
The post The curious case of culprit appeared first on OUPblog.

What is climate change law?
Some years ago Dave Markell and I noticed that commentary on climate change law was devoting a tremendous amount of attention to a small handful of judicial opinions as being representative of trends in climate change litigation, whereas inventories of climate change litigation, such as the Columbia Law School’s Sabin Center blog, included hundreds of active and resolved cases. We thought it might be useful to take an empirical look at all the cases to assess what the corpus of climate change litigation was about and its trajectory. This sounded a lot easier at the time than it turned out to be. Much time and effort was spent reading, analyzing, and coding cases over the next two years! But one of the first hurdles the article faced before we could get into the data morass was the seemingly simple task of defining what is and is not climate change litigation.
I had a similar experience when Michael Gerrard of Columbia Law School and Jody Freeman of Harvard Law School invited me to contribute a chapter on climate adaptation law to their wonderful volume, Global Climate Change and U.S. Law. I gladly accepted, but not long after I hit the send button I realized I’d have to define what is and is not climate change adaptation law.
Questions like these are not merely exercises in existentialism; they have practical implications. As a prior post on this blog convincingly argued, climate change is not only environmentally disruptive, it is also a legally disruptive force. In other words, there will be developments in law—claims won or lost in litigation, regulations adopted or repealed, institutions formed or changed—that would not have occurred but for the goals of mitigating and adapting to climate change. If we could scoop together all such events, that mass of legal content would be the corpus of climate change law; it is what one would study and evaluate to assess the legally disruptive effects of climate change.
It is as important for lawyers to study the effects of climate change on the legal system as it is for ecologists to study the effects of climate change on ecosystems. It is important to find where climate change is putting pressure on the legal system and where the cracks are forming. It is important to examine whether climate change law is filling those cracks or making them more fragile. It is important to study the effects of this new legal “species” on other parts of the legal system as well as other realms of social policy. It is important to consider whether the different parts of climate change law are fitting together or working at cross-purposes. In short, studying climate change law is important, so it is important to know what climate change law is.
But this raises two difficult questions. The first is obvious—how do we apply the “but for” test? It is difficult to draw a sharp line that works in all cases. For example, Dave Markell and I defined climate change litigation as any litigation in which the party filings or tribunal decisions directly and expressly raise an issue of fact or law regarding the substance or policy of climate change causes and impacts. After scouring the US litigation landscape for all active and resolved litigation through 2010, we found about 200 pieces of litigation that met this “direct and express” test. We acknowledged, however, that some litigation might be motivated purely by climate change mitigation or adaptation, and thus meet the “but for” test, yet never mention climate change in pleadings or decisions, and thus fail our “direct and express” test. For example, opposition to a coal-fired power plant permit might allege only technical procedural error as the legal basis for the litigation but be based entirely on concerns about climate change. Nevertheless, given that we could not read minds, we decided to use an objective filter based on the text of the relevant documents.
Similarly, my chapter on climate change adaptation law based its scope on the “but for” test by examining whether climate change adaptation is clearly mentioned in the legal text and motivated the legal event. As with our litigation project, however, one can envision an adaptation legal claim or regulatory measure motivated by climate change but not meeting this strict test. For example, a requirement that new buildings meet energy and water efficiency standards might be a response to climate change, but the measure itself might not mention climate change adaptation as its purpose.
My sense is that over time there will be more and more legal developments of this kind—brought about because of climate change but not expressly about climate change. This is because the vast majority of climate change litigation and regulation to date has been about mitigation policy, which is difficult to frame without directly and expressly engaging the context of climate change in the pleading, regulation, ruling, or other legal text. As the physical and social effects of climate change take greater hold, however, they will increasingly become part of the baseline legal fabric. They will lead to litigation and regulation, and change institutions. Such physical and social effects have consequences motivating legal responses across a broad realm of practical concerns, but it will not be necessary to identify climate change as their causal force or as their target. When defining what is and is not climate change law, therefore, it may become necessary to relax the “direct and express” standard and focus on the deeper and more subtle motivations and consequences to identify the “but for” body of law defining the corpus of climate change law. This will make identifying climate change law more difficult, but no less important.
The second question is whether, regardless of how we define climate change law, the mass of legal content that passes the test deserves to be known as “climate change law.” As Jim Salzman and I put it in an article exploring this issue, will climate change law develop into a field of law or a law of the horse? Again, this may seem pretty existential, but it too has a profoundly practical dimension. Why, for example, do we think of “tax law” as a distinct field but no one professes to practice “automobile law”? Legal scholars have argued that a field of law requires a commonality of problems and usefulness of joint treatment of those problems by the law. Tax law, for example, deals with taxation wherever it pops up, whereas automobiles intersect with the law in a wide variety of disconnected contexts. More importantly, forming a field of law can serve political ends by legitimating a social movement (think of environmental law), can enhance efficiency by providing a focal point for legal and other expertise, and can orient laws and policies in a coherent form. But one cannot simply declare that a field of law exists; it forms organically over time and has to make sense.
So, what of “climate change law”—is it more like tax law or automobile law? This may sound odd coming from someone who spends time thinking about what is and isn’t climate change law, but my sense is that the dynamics will send us more in the direction of automobile law. The point of climate change law is to change the way we do things, broadly and deeply. That effort will affect all aspects of life profoundly, and thus all fields of law will have to adjust and incorporate the new way of doing things internally. What that means for tax lawyers will be different from what it means for environmental lawyers, labor lawyers, land use lawyers, and every other kind of lawyer. In short, if changes to the tax laws are made in response to climate change and that causes new issues for a business, the business will look for a tax lawyer, not a climate change lawyer.
The irony may be, therefore, that lawyers focused on climate change succeed in building a large mass of “but for” climate change mitigation and adaptation law, but in doing so work themselves out of a job. I can live with that!
Headline image credit: Climate change. Photo by Eric Wüstenhagen. CC BY-SA 2.0 via Flickr.
The post What is climate change law? appeared first on OUPblog.

Moral responsibility and the ‘honor box’ system
If you’ve worked in an office, you’re probably familiar with “honor box” coffee service. Everyone helps themselves to stewed coffee, adds to the lounge’s growing filth, and deposits a nominal sum in the honor box, with the accumulated proceeds being used to replenish supplies. Notoriously, this system often devolves into a tragedy of the commons, where too many people drink without paying. Unless some philanthropic soul goes out of pocket to cover freeriders, the enterprise goes into the red, and everyone’s back to extortionate prices at the cafe.
Fortunately, the tragedy of the honor box may be readily ameliorated; if images of eyes are placed prominently near the coffee service, deposits increase. Or so Bateson and her colleagues (2006) found: the take in a Psychology Department’s honor box (computed by amount contributed per liter of milk consumed) was nearly three times as large when the posted payment instructions were augmented with an image of eyes as when they were augmented with an image of flowers.
On the standard interpretation, the eyes remind people that they may be seen – not so easy to stiff the honor box in front of a disapproving colleague – and pay a reputational cost for freeriding. Since human beings are social organisms sensitive to reputational considerations, they may thereby be moved to donate.
Participants in such studies are not typically debriefed, so we don’t know for sure what they were thinking. But the most likely reading is that people are, in a sense, not thinking much of anything. That is, the Watching Eyes Effect is supposed to involve unconscious, effortless processing, rather than conscious, concerted calculation; the eyes are hypothesized to influence behavior without those influenced being aware.
If people aren’t typically aware of the Watching Eyes Effect when they’re being affected by it, what might they think, if they found out afterwards? A cheapskate with a policy of freeriding might feel a little resentful; he’s been made to do something he doesn’t judge to be sensible. A more upright sort might think she’s done the right thing, but not for the right reasons; doing it because you’re watched is not the same thing as doing it because it’s decent, honest, or fair. Those who favor fair play only when it burnishes their reputation might also have qualms, since Watching Eyes may influence people in conditions conducive to anonymity (e.g., Haley and Fessler 2005: 250). In none of these cases does “I did it because of the eye spots,” sound like a compelling rationale.
The Watching Eyes Effect is part of a large family of studies identifying influences on behavior that are both unconscious and unexpected. In sum: you may not know what you’re doing, or why you’re doing it, and if you did know, you might not like it. Evidently, the subversive unconscious is everywhere at work (though these workings may be more absurd than Oedipal). Should you take this prospect seriously, you ought to begin to worry about who — or what — is running your show. You should begin to wonder about the extent to which you exert rational control over your behavior: maybe the “rational animal” isn’t so rational after all. This worry, if I’m right, is a worry about what philosophers call agency, the ability of a person to order her behavior such that the attribution of moral responsibility – and so, the assigning of credit and blame – is appropriate.
If the foregoing is right, we ought to experience skeptical anxiety about morally responsible agency. But we needn’t remain in a state of anxiety. When we put in the right sort of theoretical work, we can see that people do exert the sort of rational control over their lives fitting them for the honorific “agent,” the multitude of disturbing scientific findings notwithstanding. But that’s a story for another post.
This post originally appeared on the Washington University in St. Louis Centre for the Humanities website on April 13, 2015, and on the Philosophy of Brains blog on May 12, 2015. It has been slightly modified for the OUPblog.
Featured image credit: “7/365 – Blue eyes”, by Axel Naud. CC BY 2.0 via Flickr.
The post Moral responsibility and the ‘honor box’ system appeared first on OUPblog.

August 21, 2015
Off the beaten path: An insider’s guide to Tampa history for #OHA2015
There are less than two months left before we converge on Tampa for the Oral History Association’s annual meeting! This week, we asked Jessica Taylor of the University of Florida’s Samuel Proctor Oral History Program, who authored “We’re on Fire: Oral History and the Preservation, Commemoration, and Rebirth of Mississippi’s Civil Rights Sites” in the most recent Oral History Review, to give us the inside scoop on some local stories of interest to oral historians. Check it out below, and make sure to book your tickets while they’re still available.
“You, the favored inhabitants of the Sun God’s Golden Land—of the pearl-made beaches, conversing palms, persuasive breezes, of little giggling waves which, surprising you on the seashore seeking amusement, steal up and deposit gifts of tinted shells and pebbles at your feet, then dance away to gather more for you; you for whom the more than stately pleasure domes that far surpass the most extravagant dreams of Kubla Khan in Xanadu have been built—how aware are you of the subterranean force which flows beneath your fabulous mansions?”
—Zora Neale Hurston, “Florida’s Migrant Farm Labor”
Unless we all boarded a plane for Sandals™, it’d be hard to plan a conference in an environment more expertly engineered for relaxation than Florida in the winter. A hundred million tourists agree; they’d like to spend their vacation time and cash (around $82 billion in 2014) on the coast, in restaurants with marvelous food. They’d like to drive very, very slowly down shady streets framed by Victorian houses and live oaks and Spanish moss. They’d like to take a long series of pictures in a new and exciting place. They’d like to escape.
Florida historians, including the ones with recorders, insist time and again that escape isn’t possible. Florida is a political and social reflection of both the South and the nation as a whole, and the movement of people and changes in landscapes represent, as Steve Noll writes in Ditch of Dreams, “visions of progress, economic growth, and preservation.” Online news sources might highlight Florida’s bizarre or terrifying daily occurrences, but visitors continue to pour into the Sunshine State in record-breaking numbers even after news of Trayvon Martin’s death, Jeb Bush’s run, manatee insanity, and the struggle for migrant farm worker rights reverberated beyond.
Florida is still unique, though, for the same reason that the field of oral history is special. Florida’s beaches are still beautiful, and its people still diverse, because historians, environmentalists, and working people struggled over the past century to preserve their communities. That’s why you’ll like it so much. Here’s what to look forward to:

Military radiology and the Boer War
The centenary of the Great War has led to a renewed interest in military matters, and throughout history, war has often been the setting for medical innovation with major advances in the treatment of burns, trauma, and sepsis emanating from medical experience in the battlefield.
X-rays, discovered in 1895 by Roentgen, soon found a role in military conflict. The first use of X-rays in a military setting was during the Italo-Abyssinian war in 1896. The Italians lost the battle at Adoa and casualties were taken to the military hospital in Naples where X-rays were performed under the leadership of Colonel Alvaro.
In the United Kingdom, a Birmingham-based doctor was one of the earliest radiologists to gain experience in a military setting. John Hall-Edwards (1858-1926) started off as a general practitioner and following Roentgen’s discovery gave one of the earliest demonstrations of X-rays in the UK on 4 March 1896. He subsequently published an article in the Photographic Review journal of that year. Hall-Edwards, who was an accomplished photographer and an honorary fellow of the Royal Photographic Society, championed the new discovery of X-rays. In 1899 he became surgeon radiographer at the Birmingham General, Orthopedic, Children’s, and Eye hospitals, taking charge of the X-ray departments.
In 1900 Hall-Edwards went to South Africa to help during the Boer War and set about X-raying wounded soldiers. The field hospitals had no X-ray equipment; portable X-ray equipment was based in the general hospitals in Deelfontein. The portable equipment consisted of dynamos, coils, vacuum tubes, a bicycle frame, and a 12-month supply of developing chemicals and film. Over 200 patients underwent X-rays in a one-year period to identify fractures, shrapnel, and bullet wounds from the Mauser bullets being used in the conflict. Today some of the early radiology equipment used in the Boer War is on display in the Museum of the History of Science in Oxford. (Oxford undergraduate E.G. Spencer Churchill bought the Wimshurst machine, X-ray tube, and fluorescent screen in 1898. He also went to South Africa to assist The Royal Army Medical Corps in the Boer War.) The experience of military radiology led Hall-Edwards to produce his famous scientific paper ‘Bullets and their Billets’, published in the Archives of the Roentgen Ray, the leading British radiology journal. He also wrote about the Boer War in the prestigious medical journal The Lancet.
Hall-Edwards returned to Birmingham where he had a distinguished career, serving as editor of the Archives of the Roentgen Ray from 1904 to 1905, President of the Electrotherapeutic Society in 1906 (forerunner of the Radiology section of the Royal Society of Medicine), and Vice-President of the Roentgen Society in 1915. He even became a City of Birmingham Councillor in 1920 and played an important role in the civic affairs of the city, serving on the Public Health, Library, Museum, and Art Gallery committees. Unfortunately he paid the ultimate price for being a pioneer. He succumbed to the harmful effects of X-rays, developing X-ray dermatitis, and was forced to have his hands amputated. He wrote a paper about this condition, warning others about the harmful effects of X-rays. This, however, did not curtail his activities, and he took up painting to an exhibition standard. He died in 1926, one of the British radiologists whose names are inscribed in the Martyrs’ Memorial in Hamburg.
Featured image: Gandhi with the stretcher-bearers of the Indian Ambulance Corps during the Boer War, South-Africa. Public domain via Wikimedia Commons
The post Military radiology and the Boer War appeared first on OUPblog.

Learning from Chris Norton over three decades—Part III
This is the third of a three-post series on the pro-feminist and activist Chris Norton by Michael A. Messner.
Flash forward to 2010. I was now a tenured full professor. I was working with two young male Ph.D. students who in some ways reminded me of myself thirty years earlier—inspired by feminism, wanting to have an impact on the world. Both Tal Peretz and Max Greenberg had, as undergrads, gotten involved in campus-based violence prevention work with men. Unlike three decades ago, this work had become pretty much institutionalized; a guy like Tal or Max can now plug into a campus or community organization, be handed an anti-violence curriculum, and get to work with boys and men. I figured this was a great opportunity to do a study with these two guys, tapping into the roots of men’s work against gender-based violence in the ’70s and ’80s and contrasting it with the work being done today.
Of course, I thought of Chris Norton and Men Against Sexist Violence (MASV). I located Chris online. Ever generous, he agreed to be interviewed. In December of 2010 we went out to lunch and did what older guys do: caught up on each other’s lives, shared our hopes and fears and the challenges we’d faced with our kids, and commiserated about our ageing bodies. On this latter topic, Chris had more serious news to share than I did. He was facing, with strength and optimism, a liver transplant in the near future.
We returned to his home and settled in for the interview. We fell right into a nice conversation, and I used bits of the transcript of my 1980 interview with Chris to prod his memory and to probe ways in which he’d changed, or not, since then. Most interesting to me were his reflections on the work that MASV had done so many years ago. He joined the several other MASV men whom I would eventually also interview in saying that he was very proud of the work the group had done. But in retrospect, he said, he wondered how effective they’d been, and figured that if the group had it to do all over again, they might have done their work differently:
“I don’t think I would go at it at all the way we did then, ‘cause I think in some ways, … I think we were doing something to prove something to ourselves and other people of our age group, and I don’t think we were thinking about, like, what’s it like to be a teenage boy in high school, and what are these images going to do to you when they’re shown up on a screen, and is it going to have any of the effect that we’re hoping to have? And I think it would have been really good to kind of get guys to talk about, well, there’s issues of bullying, issues of, you know, being popular, not popular. I mean, it seems like there could be a lot of things that could have been much more valuable, ‘cause in some ways, I think we almost had this stick and we’re going to beat you over the head with this thing. And… perhaps if they felt like they were more understood, maybe they could be more understanding of women and, and where women are coming from. And I think that would be more the way I would go about it now.
[Back then], we were really just making it up, I mean, it was the seat of our pants… we felt like we should be doing something. We were feeling like we need to be also talking about the same things that the women were talking about—but we basically just took their analysis and presented it. You know, it didn’t—it didn’t feel like it was coming from our core, you know, from who we were, other than maybe from our guilt.”
Chris’s statement very neatly encapsulated about thirty years of change in the ways that men now approach violence prevention work with boys and men. It is important to chronicle the grassroots of this activism—set in place in the ’70s and ’80s by community groups like MASV—and its foundations of positive social change. It’s also important to honor this activism because today’s younger activists, however savvy and pragmatic they may be about the ways they approach boys and men with their message, may have lost something very important that earlier activists like Chris Norton had: a grounding in a larger view of social change that saw their efforts to stop sexist violence against women as intricately connected with efforts to humanize and bring justice to the world. For groups like MASV, feminist work with boys and men, Chris explained, was an integral part of a larger transformative movement:
“… an important way of sort of humanizing socialism, or getting rid of some of the hard-edged more Stalinistic tendencies that some socialist movements could have. And it also just made a lot of sense [for] those of us too, who also were rejecting militarism and the traditional terms of being masculine or man, and were looking for some kind of a new way—when I came to Berkeley initially I was involved in the anti-war movement, and lived in communal houses, and we’d gotten involved in the food conspiracy, and—it was all part of this whole, you know, sort of community, alternative society in a way that we kinda’ felt like we were creating it back then.”
In retrospect, like many radicals of his generation, Chris expressed frustration with the current prospects for transformative social change:
“I think I’ve retreated some degree from utopianism. But I do feel that it’s definitely possible to have a far more egalitarian society than we do. And I just, I feel like—and that’s part of my frustration too, is those of us who feel that way haven’t found ways to be very effective in putting forward that vision and, and making that vision something that’s attractive to people, and making people realize that what we’re living under is not actually that great for a lot of people, and it’s very difficult for a lot of people.”
Part of my goal is to encourage today’s anti-violence activists to re-connect to that larger vision. It is stories from this generation of activists like Chris Norton that help to keep alive this larger vision.
Featured image credit: Between the Door and the Street, Suzanne Lacy, Installation at the Brooklyn Museum by the Brooklyn Museum. CC0 public domain via Wikimedia Commons.
The post Learning from Chris Norton over three decades—Part III appeared first on OUPblog.

How much do you know about the American Revolution? [quiz]
This year marks the 240th anniversary of the beginning of the American Revolution. Between 1760 and 1800, the American people cast off British rule to create a new nation and a radically new form of government based on the idea that people have the right to govern themselves. Ultimately, America defeated Britain, creating an independent nation called the United States of America.
Do you know your George Washingtons from your Thomas Jeffersons? Do you know your British tyrants from your American Patriots?
Test your knowledge of the American Revolution with this quiz, based on Robert J. Allison’s The American Revolution: A Very Short Introduction.
Quiz and Featured Image Credit: “American Flag”, by DWilliams. Public domain via Pixabay.
The post How much do you know about the American Revolution? [quiz] appeared first on OUPblog.

August 20, 2015
Misty Copeland dances On the Town
Misty Copeland captured the world’s attention this summer when she became the first black female principal dancer at the American Ballet Theatre. In late August, Copeland will once again be in the headlines when she stars in Leonard Bernstein’s On the Town for a limited engagement at New York’s Lyric Theatre, where she will bring the show’s nearly year-long run to a close. There, she will be joining a different sort of racial history from that of classical ballet. When On the Town debuted towards the end of World War II, its original production featured a mixed-race cast and progressive interracial staging, which directly challenged real-world segregation.
Copeland’s successes are heartening, yet they are taking place some seventy years after the breakthroughs of On the Town. Here we are, decades later, still celebrating racial “firsts” in the realm of high-profile performance.
In On the Town, Misty Copeland will take the lead role of “Ivy Smith,” which in 1944 was danced by the Japanese-American ballerina Sono Osato. It was an audacious casting choice that confronted historic exclusions; very few Asian Americans had previously been cast in leading roles on Broadway (Anna May Wong was the main exception). Furthermore, the United States was at war with Japan, and public discourse routinely vilified the Japanese. At a personal level, the FBI had interned Osato’s father as an “enemy alien.” As with most Japanese-American detainees at the time, Shoji Osato was held with no evidence whatsoever of subversive activity.
The original On the Town also included six African Americans out of a total cast of sixty. This ratio was not remarkable in those days. Rather, it was the actions on stage that made the difference. Black and white performers mingled to represent a multiracial Navy, which countered the segregated military of World War II. White women danced with black men, which transgressed an absolute social taboo in an era when anti-miscegenation laws were still enforced in many states. In addition to this radical onstage integration, a black violinist named Everett Lee was appointed concertmaster of the otherwise all-white pit orchestra. Nine months into the run, Lee became Music Director, which marked a racial milestone. These racial details might seem fussy or even minor to us today, but they were largely unheard of during World War II, when segregation continued to define American life.
On the Town also marked the Broadway debut of the composer Leonard Bernstein, whose passionate advocacy for civil rights became a lifelong commitment. Bernstein consistently featured African-American performers over the course of his illustrious career, whether the black pianist André Watts in a Young People’s Concert with the New York Philharmonic in 1963 or the choreographer Alvin Ailey in Bernstein’s Mass a decade later.
Misty Copeland certainly dances in a different world from Sono Osato. But hailing her promotion with American Ballet Theatre as a “first” feels a bit like déjà vu. This time, let’s hope that the achievements of a stunning dancer of color will set a precedent, opening the door so that mixed-race performances are no longer an exception but instead a regular component of everyday casting in both Broadway musicals and classical ballet.
Featured image: The cast of On The Town. (c) On the Town via onthetownbroadway.com.
The post Misty Copeland dances On the Town appeared first on OUPblog.

