Oxford University Press's Blog
June 23, 2020
Black lives matter in prisons too
Recent events have spotlighted the pervasive and historic problem of racial disparities in criminal justice treatment in the United States. Videos of people seeking to use the police for racial control as well as videos of black people being killed by police have sparked outrage across the nation, and the world. Much of the attention, understandably, has been on police reform. However, if America’s knee, literally and figuratively, is on the neck of black people, its prisons and jails have played no small role in this situation. African Americans have much higher rates of incarceration than non-Latinx whites. The many collateral consequences of mass incarceration have thus fallen much more heavily on the necks of African Americans than on those of whites.
African Americans account for about 14% of the national population, but more than twice that share, 33%, of all state and federal prisoners. African Americans are nearly six times more likely than whites to be in prison at any one time. One in three black men born in 2001 is predicted to spend time in prison.
Why does this stark difference exist? Some propose that African Americans have higher incarceration rates mainly because they have higher rates of street crime (violent crime and property crime). Others say that African Americans have higher incarceration rates mainly because of racial bias, however unconscious, by police, prosecutors, and judges, and because the police target their behavior and neighborhoods.
An objective assessment of the research evidence yields a rather ambiguous conclusion: Both theses have support. To some extent, African Americans’ higher incarceration rates do reflect their higher crime rates, and to some extent they also reflect racial bias in the criminal justice system.
If black people do have higher street crime rates, it’s not because of any biological deficiencies, a racist explanation popular a century ago. Rather, it’s because they are so much more likely than whites to live in poverty and near poverty; to live in low-income urban neighborhoods whose social and physical features contribute to crime rates; and to be victims of racial discrimination and microaggressions. Criminologists have written that if white people somehow lived in these exact same circumstances, their crime rates would be as high as black people’s crime rates.
But criminal justice bias also puts many African Americans behind bars. Racial profiling does occur. African Americans are more likely than whites to be arrested for a given crime and furthermore, to be prosecuted on more serious charges and sentenced to incarceration after conviction. This is especially true regarding the legal war on drugs. Although illegal drug use by African Americans is no higher than that of whites, they are arrested and go to prison at far higher rates for such use. Even skin tone matters; African Americans with darker skin tones are more likely than those with lighter skin tones to be sentenced to prison, to be sentenced for longer terms, and to receive the death penalty in capital cases. Some researchers have sought to examine the sources of racial disparities in imprisonment rates, finding that from 20% to 56% of disparities in incarceration cannot be explained by disparities in arrest rates.
These two theses are not independent. Involvement in the criminal justice system increases the likelihood of future antisocial behavior. In addition, in what scholars have termed a process of “cumulative disadvantage,” racial and other disparities in the early stages of criminal justice can lead to further disparities in later stages and in the lives of people caught up in criminal justice.
Racial bias is not just an issue in terms of who goes behind bars. Inside prison walls, African Americans are more likely to be assaulted by staff, while whites are more likely to be assaulted by other inmates. In addition, some research has found that African Americans are more likely to be placed in solitary confinement, which is associated with poorer mental health outcomes.
Importantly, if African Americans are more likely to be sent to prison, there will be disparities in terms of who is returning to society after serving their sentences. We know that those leaving prison face a number of hurdles to reintegration, including difficulties getting certain jobs, voting, finding housing, and re-establishing familial relations. All of this leads to a greater likelihood of engaging in crime and encountering the criminal justice system again.
So what to do? Many observers have offered recommendations on how to reduce racial disparities in the criminal justice system, and in corrections specifically. Advocates involved with the Sentencing Project argue that strategies such as ending legacy policies from the war on drugs, removing mandatory minimums, and offering implicit bias training to criminal justice system actors can help. Since there is racial disparity in incarceration rates, any reduction in imprisonment overall will alleviate these disparities; recent data suggest that lowering incarceration rates has reduced racial disparities in just this way. We agree with these points. Increasing the focus on reintegration, particularly the obstacles facing people released from prison, will also help break the cycle. While some work has suggested that “ban the box” policies, which prohibit employers from asking about criminal history until late in the job application process, may backfire due to implicit bias, other strategies to ensure employment of recently released individuals are likely to positively affect lives.
Racial bias is implicated in every stage of the criminal justice system. We are heartened by the recent worldwide protests against police killings and other abuse of African Americans, and we fervently hope that these protests will end this abuse. But if America does not also address the bias that exists in other stages of criminal justice, its knee will continue to lie on the necks of African Americans.
Image courtesy of Pixabay
The post Black lives matter in prisons too appeared first on OUPblog.

June 22, 2020
How Buddhist monasteries were brought back from destruction
In Beijing in 1900, as the chaos of the Boxer Uprising raged on, a Buddhist monk arrived at Dafo Monastery to seek out master Datong with an offer. The visitor, abbot of Cihui Monastery, wished to hand his monastery over to Datong. Datong agreed, and he arrived at his new monastery to find it dilapidated and overgrown with weeds. Undeterred, Datong worked day and night to clean up the space, engaged in fundraising both in the city and back in his native region in the northeast of China, and within about seven years he had completed a full reconstruction of the monastery, with new buildings, religious images, and a small monastic community. For his efforts, Datong was recorded for posterity as a “restorer of Buddha’s light.”
More than simply a story of a renovation project, accounts of monastery reconstruction in Chinese Buddhist history are religiously charged narratives that celebrate the renewal of religious life and the restoration of a lost frame for religious activity. As historical phenomena, they are embedded within histories of material culture, social networks of patronage, conflicts with local society, and national narratives of salvation and renewal. Rebuilding a single monastery often brought together elements from across the religious, social, and political realms. It was a deeply symbolic act both within the world of professional Buddhists as well as in the wider realm of Chinese society.
Between the end of the catastrophically destructive Taiping War (1850-1864) and the eve of the Cultural Revolution (1966-1976), thousands of Buddhist sacred sites were destroyed, sometimes accidentally, other times intentionally, but many of these were later rebuilt from the ground up. As they were reconstructed, the site that emerged from the ashes of the old was always at least slightly different from what had been there before, and in many cases reconstruction was an opportunity to introduce change into what is otherwise a highly conservative institutional structure. New halls for new practices, new buildings for new endeavours, and new symbolic roles in the Chinese nation were some of the innovations added to the reconstructed sites.
Learning about the history of these reconstructions is thus a window into how Buddhism in China changed, both in response to pressures from without, but also thanks to innovation and dynamic leadership from within. The story of Buddhism in modern China cannot fully be understood without also understanding the story of how its centres of religious practice were resurrected from decay and destruction. The concerns and goals of monastery leaders in the 1890s, for example, were very different from those in the 1930s, and later those in the 1950s. Although they all operated within a religious framework that placed great value on rebuilding these sacred structures and renewing Buddhist culture, the changing nature of the Chinese nation-state meant that patronage and regulations changed rapidly, as even monasteries were expected to play a productive and supportive role in the new nation.
Buddhist monasteries and other religious sites in China today continue to face exigent threats to their continued survival and vitality, with many historic sites having been rebuilt as static museums without any dynamic religious life within. They are also under increasing pressure to visibly and enthusiastically ally themselves to the party-state, and to redefine themselves as symbols of an imagined national identity that minimizes or excludes recognition of foreign influences and inspirations. Examining the history of how Buddhist monasteries were brought back from destruction and how reconstruction leaders such as Datong navigated a rapidly changing modern world, helps us to better understand the historical place of sacred sites in China, and to imagine how they might continue to play a leading role in both their local society as well as the nation as a whole.
Featured image via Internet Archive.

Income inequality drives health disparities
Pretax incomes for the poorest 50% of Americans have stayed mostly unchanged for the past 40 years, widening income gaps in the country. We leave the question of why inequality matters for the economy to others. What is of concern to us is whether income inequality matters to our health, and, to the extent that it does, how the health profession should respond.
In 1992, Richard Wilkinson, then a professor at the University of Sussex, published a paper in The British Medical Journal called “Income distribution and life expectancy.” The paper concerned 12 European countries and concluded that “the relation between income distribution and life expectancy is sufficiently strong to produce significant associations.” The paper’s thesis launched two decades of intense scientific discussion about the influence of national income inequality on health (and death), including several systematic reviews and books. This work, which continues to the present day, shows that income inequality is a foundational driver of physical and mental health. By way of example, a 2018 systematic review considers the relationship between income inequality and depression, and it concludes that across studies there is “greater risk of depression in populations with higher income inequality relative to populations with lower inequality.”
Why might income inequality affect the health of the public?
Countries or regions where there are wide gaps in income tend to be characterized by weaker social ties and less investment in the social and physical resources that create health. Countries with more income inequality are less likely to have healthy air, water, and food, safe places to work and play, and affordable quality housing—all of which are necessary for health.
For these reasons, income inequality should be a core focus of public health. It is true that discussing subjects like taxation policies, the minimum wage, and universal income guarantees broadens the scope of what we are used to discussing in the context of health. However, broadening the conversation in this way has long been a first step toward creating a healthier world. After all, at one time, few imagined that cigarettes had anything to do with health, and even doctors promoted smoking. It took a widening of our collective imagination to see tackling smoking and the tobacco industry as central to promoting health. In the same way, by discussing income inequality, we start to see the truth of what the data are telling us: money—who has it and who does not—is at the heart of health in our society. Until we address this, we will continue to see health gaps between those at the top of the economic ladder and those at the bottom.
Featured image: Pixabay

June 21, 2020
Understanding quantum mechanics [quiz]
Mechanics is that part of physics concerned with stuff that moves, from cannonballs to tennis balls, cars, rockets, and planets. Quantum mechanics is that part of physics which describes the motions of objects at molecular, atomic, and sub-atomic levels, such as photons and electrons.
Although quantum mechanics is an extraordinarily successful scientific theory, on which much of today’s tech-obsessed lifestyles depend, it is also completely mad. The theory quite obviously works, but it appears to leave us chasing ghosts and phantoms, particles that are waves and waves that are particles, cats that are at once both alive and dead, lots of seemingly spooky goings-on, and a desperate desire to lie down quietly in a darkened room.
The madness is nothing new. Those who nursed quantum mechanics through its difficult birth and early childhood knew full well what they were getting themselves into, and endlessly debated its interpretation and meaning. Although the science has moved on, and we now know much more than we did a century ago, many of these debates remain unresolved. As the charismatic American physicist Richard Feynman once claimed, “I think I can safely say that nobody understands quantum mechanics.”
Some knowledge of its history helps to understand why Feynman felt justified in making this claim. So, how well do you know your quantum history? Take this quiz to find out.
Featured Image Credit: Shanadat Rahman via Unsplash

June 20, 2020
Why we should revive dead languages
Approximately 7,000 languages are currently spoken worldwide. The majority of these are spoken by small populations. Approximately 96% of the world’s population speaks around 4% of the world’s languages, leaving the vast majority of tongues vulnerable to extinction and disempowering their speakers. Linguistic diversity reflects many things beyond accidental historical splits. Languages are essential building blocks of community identity and authority.
With the globalization of dominant cultures, homogenization, and Coca-colonization, cultures at the periphery are becoming marginalized, and more and more groups all over the world are added to the forlorn club of the lost-heritage peoples. One of the most important symptoms of this cultural disaster is language loss. Should we reclaim these languages? Absolutely. Here are three of the reasons:
The first reason for language revival is ethical: It is right.
Indigenous and minority languages are worthy of reviving for historic social justice. They deserve to be reclaimed in order to right the wrong of the past. These languages were wiped out in a process of linguicide. I personally know dozens of Aboriginal people who were stolen from their mothers when they were kids. I believe in Native Tongue Title, an extension of Native Title (compensation for the loss of land, in Australia). Governments should grant financial compensation for the loss of languages – to cover efforts to resuscitate a tongue or empower an endangered one. Language is more important than land. Loss of language leads not only to loss of cultural autonomy, intellectual sovereignty, spirituality and heritage, but also to the loss of the soul, metaphorically speaking.
The second reason for language revival is aesthetic: It is beautiful.
Diversity is beautiful, aesthetically pleasing. Just as it is fun to embrace koalas or to photograph baby rhinos and elephants, so, too, it is fun to listen to a plethora of languages and to learn odd and unique words.
For example, I love the word mamihlapinatapai in the Yaghan language, spoken in Chile’s Tierra del Fuego archipelago. The word is precise and to the point in its meaning, and it cannot be translated in fewer words than the following: “a look shared by two people, each wishing that the other would offer something that they both desire but are unwilling to suggest or offer themselves.” Although any word in a language is translatable, there is a difference, at least aesthetically, between saying mamihlapinatapai and saying that long sentence in English.
An example of a concept that I had never imagined prior to learning Ancient Persian is nakhur, “a camel that will not give milk until her nostrils have been tickled.” As Nelson Mandela said, “when you speak a language, English, well many people understand you, including Afrikaners, but when you speak Afrikaans, you know you go straight to their hearts.”
The third benefit for language revival is utilitarian: It is viable and socially beneficial.
Language reclamation empowers people who have lost their sense of pride and at times even their reason to live. This well-being empowerment can save governments billions of dollars that would otherwise need to be invested in mental health and incarceration, not to mention the various cognitive and health benefits of bilingualism. For example, bilinguals are cleverer than they would be as monolinguals, and native and even non-native bilingualism delays dementia.
Given globalization and languages loss, language revival is becoming more and more relevant in 2020 as people seek to recover their cultural autonomy, empower their spiritual and intellectual sovereignty, and improve their well-being. Language revival has moral, aesthetic, psychological, cognitive, and economic benefits. It encompasses social justice, social harmony, diversity, employability, and mental health.
Featured Image Credit: sea water during sunset, by Anton Gorlin via Unsplash

Are militaries justified in existing?
Pacifism, in its most recognisable form, is an absolute, principled condemnation of war. Military abolitionism is the view that institutions devoted to war are not justified in existing. Most pacifists are also military abolitionists. This is unsurprising. After all, if you think that going to war is always wrong, then you’ll likely think that having armed forces at the ready does nothing but enable us—and perhaps even tempt us—to do things that we ought never to do. One can, however, be a military abolitionist without being a pacifist. There is no incoherence in conceding that it is sometimes morally justifiable to use military force, while at the same time opposing the creation and maintenance of establishments that exist for the purpose of doing just that.
Consider an analogy. Most of us would agree that shooting someone in self-defence can be morally permissible under certain circumstances. Imagine a home invader is culpably threatening your life, there is nowhere to hide, no time to call the police, and so on. But this does not logically commit us to accepting that the private ownership of firearms for self-defence is legitimate. Many of us are adamant that it is not, presumably for some combination of the following reasons.
First, having a gun in the home does not make an unequivocally positive contribution to the safety of the people in it. There is a trade-off involved. A gun kept for protection against murderous intruders can (and all too often will) be used by one family member against another, or for self-harm. Second, using a gun in self-defence risks stray bullets and harm to innocent bystanders, especially in built-up urban areas. Third, if one keeps a gun in the home, one may be prone to use it more often than is justified, and not only in those rare cases where it is a necessary and proportional means of defending oneself and one’s family. Give a boy a hammer, and he will find that everything he encounters needs pounding.
If these considerations are enough to make private gun-ownership morally suspect, then by parity of reasoning the same goes for states and their war-machines.
First, militaries do not make an unequivocally positive contribution to the security of their parent societies; there is, again, a trade-off involved. It is not uncommon for armed forces to turn against the states that they are supposed to protect. Since 1950 there have been 232 military coups in ninety-four countries, and this is only counting the successful ones where an incumbent government was unseated. Moreover, if our adversaries happen to be paranoid types, protective of what they have rather than greedy for more, our military might inadvertently provoke what it is meant to deter. An insecure foreign regime might attack us not despite our armed forces, but because of the threat they pose. Militaries discourage opportunistic aggression, yes, but they simultaneously invite fear-based defensive aggression.
Second, the use of military force for national defence (or any other reason) almost invariably produces collateral damage. Particularly if we are talking about urban wars where soldiers and civilians are intermingled in the theatre of combat, harm to innocent bystanders is to be expected. Neta Crawford, a director of the Costs of War Project at Brown University, tells us that “the US military increasingly emphasized civilian protection during the wars in Afghanistan and Iraq at rhetorical, doctrinal, and operational levels. Minimizing collateral damage went from being one concern among several, to an imperative that was institutionalised to a degree that it had never been before.” And yet, civilians continue to be killed by the United States and its allies. As things stand, the prospect of war without collateral damage is wishful thinking.
Third, militaries—like guns—are liable to be overused, not just by warmongering regimes, but also by otherwise decent, democratic governments that are sincerely committed to never waging unjust wars. It is the law of the instrument writ large: insofar as we invest considerable resources into our military establishments, we are bound to seek out opportunities to use them, and some of these uses will turn out to be misuses, or abuses.
The upshot is this: the reasons we usually give for why people ought not to carry firearms—despite the fact that they would occasionally be justified in using them if they had them—are equally reasons why states should not keep militaries, despite some wars being just. There are, of course, important differences between these two contexts, which may or may not undercut the analogy. But the point is just that we have here an argument against the existence of military establishments that does not presuppose or rely on any kind of absolute, principled opposition to war. And it is not the only such argument available. One can be an abolitionist, without being a pacifist. One can be anti-military, without being anti-war. It is a mistake to suppose that these two positions stand or fall together philosophically.
Featured image from cover of Ethics, Security, and The War-Machine by Ned Dobos.

June 19, 2020
Why talk about bad actors versus good people misses the problem of systemic racism
In an eerie echo of the 2016 presidential campaign, President Trump has denied that the brutal murder of George Floyd by police officer Derek Chauvin reveals systemic racism and implicit bias in the U.S., instead describing it as a horrible act by a “bad apple.” Tweeting about law and order and vowing that the police will not be defunded, President Trump asserted that 99% of the police are “great, great people.” In 2016, while on the campaign trail, vice presidential candidate Mike Pence remarked, after a fatal shooting by police of Keith Lamont Scott, a 43-year-old African American man: “Donald Trump and I know and believe that the men and women of law enforcement…they’re the best of us and we ought to set aside this talk…about institutional racism and institutional bias.”
In the first presidential debate between Democratic nominee Hillary Rodham Clinton and Republican nominee Trump, moderator Lester Holt asked Clinton, “Do you believe that police are implicitly biased against black people?” She answered: “Lester, I think implicit bias is a problem for everyone, not just police. I think, unfortunately, too many of us in our great country jump to conclusions about each other… I think we need all of us to be asking hard questions about, you know, ‘Why am I feeling this way?'” Clinton suggested de-biasing tasks that police departments could undertake. In the vice presidential debate, Pence charged Clinton with using “a broad brush to accuse law enforcement of implicit bias or institutional racism” and stated: “Enough of this seeking every opportunity to demean law enforcement broadly by making the accusation of implicit bias every time tragedy occurs.” Conservative critics charged that Clinton had suggested “everyone is bigoted” and called “the entire nation racist.”
Across the United States, record numbers of people have been protesting to express outrage over the murder of George Floyd, express solidarity with Black Lives Matter, and demand systemic change, including the end of police violence against black and brown people. Numerous statements by public figures, educational and religious institutions, and corporations express similar sentiments, urging that a tribute to Floyd should be addressing systemic racism. The Trump administration and some prominent political allies, however, resist diagnoses of institutional racism and calls for fundamental reform. Showing a shocking lack of empathy and a hollow understanding of civil rights, President Trump – when citing the latest employment numbers as evidence of economic recovery – suggested that Floyd was “hopefully looking down right now [from heaven] and saying that this is a great thing that is happening for our country” (the numbers actually showed a continuing gap between white and black unemployment rates). This jarring remark ignores the racial injustice triggering the protests. Indeed, Trump’s remark proposes that positive economic indicators should comfort a deceased person whose last words were “I can’t breathe” because of a police officer’s knee on his neck.
At a recent roundtable in Dallas on race and policing, Trump again resisted any diagnosis of systemic racism in the United States, attributing police brutality to “bad apples” and predicting that addressing racism “will go quickly and…go very easily.” He acknowledged the need to “confront bigotry and prejudice wherever they appear,” but stated: “we’ll make no progress and heal no wounds by falsely labelling tens of millions of decent Americans as racists or bigots.”
Why do calls to address systemic racism get reduced so readily to charges of blaming “good” or “decent” people for the conduct of a few bad ones? One reason is the premise that racism includes only overt bigotry. If people associate racial discrimination only with the brazen racial bigotry of historical figures like Sheriff Bull Connor and other white segregationists, they may assume that because that era of overt public racism is over, racism itself is a thing of the past. These “racial ghosts” (as anthropologist John Jackson Jr calls them) may give a false sense of progress and distract from present-day problems like institutional racism and implicit bias.
In her book White Fragility, anti-racist educator Robin DiAngelo explains that white people have difficulty perceiving the forms that present-day racism takes because they believe that the racist acts intentionally and is “ignorant, bigoted, prejudiced, mean-spirited, old, and Southern.” To be a racist is to fall on the “bad” side of the good/bad binary. As Pence’s response to candidate Clinton’s remarks suggests, people bristle at any charge of bias, whether explicit or implicit, because it seems to suggest that they are bad people.
The persistence of overt racism and extremist hate that spills over into lethal violence should not be denied. It is impossible to watch the video of Chauvin’s murderous cruelty toward Floyd – even as medical personnel and bystanders pleaded for Floyd’s life – without believing he is a very bad person. But the facts that he remained on the Minneapolis police force despite seventeen prior complaints of police misconduct, and that the Minneapolis police union criticized the firing of Chauvin and his fellow officers, suggest broader structural problems about race and policing at the heart of the current protests. Moreover, some commentators argue that Trump’s “constant bigotry, his dismantling of police reforms, his encouragement of police aggression and his violent speech” have contributed to these problems.
A focus on a few people with bad motives, evocative of famous bigots of the past, distracts from attention to the meaning and effect of present-day social practices that embody institutional racism and implicit bias. Protestors, civil society groups, and politicians seek to avoid that distraction and to focus on such social practices. Of course, Chauvin and his fellow officers should be prosecuted to the full extent of the law for their crimes, but addressing too-long ignored problems of systemic racism is also imperative. In 2016, when Pence urged that talk about institutional racism and bias be set aside, he suggested that “the faith community…has before and can again play an enormously important role in healing the divide in our country.” In 2020, even as the Trump administration deflects calls for structural transformation with narratives of bad apples versus good people, many religious leaders call for such transformation and argue that national healing must begin by reckoning with institutional racism. Such calls tap into a deep strand in American history: racial injustice represents a failure to live up to democratic and constitutional ideals and we should work to realize those ideals. And, as the multi-generational and multi-racial protests continue, some political and civic leaders are heeding that call with the important step of legislative reform of policing.
Featured image credit: apple-rotten-frozen-shrunk-haegen by ManfredRichter. Free to use via Pixabay.

How an unlikely pair became legendary molecular biologists
In 1962 the Nobel Prize in chemistry was awarded jointly to John Kendrew (1917-1997) and Max Perutz (1914-2002). They were the first scientists to accurately describe the three-dimensional structure of proteins. Enzymes, hormones, and antibodies are only a few examples of the many kinds of proteins present in all living organisms and knowledge of their structure is essential for progress in curing human diseases. Consequently, Kendrew and Perutz have become legendary scientists whose research is celebrated internationally.
Kendrew and Perutz were only three years apart in age. They were close colleagues for nearly 30 years, first at the Cavendish Laboratory and then at the Laboratory of Molecular Biology in Cambridge, and both were fellows of Peterhouse. Kendrew and Perutz founded the Medical Research Council-supported Molecular Biology Research Unit at the Cavendish with Perutz as director and Kendrew as deputy director. It became a magnet for outstanding staff, such as Fred Sanger and Sydney Brenner; research students, such as Francis Crick and Hugh Huxley; and postdoctoral fellows, such as James Watson and Michael Rossmann, all deeply interested in what came to be known as molecular biology. What began as a tiny research unit grew enormously over seven decades and produced more than a dozen Nobel laureates and 50 fellows of the Royal Society.
Although Kendrew and Perutz worked closely together for many years, they were an unlikely pair. They had very different backgrounds, lifestyles, and work styles. Perutz emigrated from Austria to England when he was 22 to be a research student with John Desmond Bernal at the Cavendish, whereas Kendrew was born in Oxford into a professional middle-class family. During World War II Perutz was classified an enemy alien by the British and held in an internment camp in Canada, whereas Kendrew left the military as an honorary wing commander after six years of distinguished service. For Perutz, his immediate family and hands-on bench research were central to his life, whereas Kendrew was a bachelor with wide-ranging interests outside of science. Kendrew confessed that he never felt himself to be “a fanatic for laboratory bench research.”

Kendrew was, however, an extraordinary organizer and manager of research, and wise enough to recognize the usefulness of early computers in Cambridge for rapidly handling huge sets of crystallographic data. To a large extent these qualities enabled him to be the first to determine the three-dimensional structure of a protein, myoglobin. Jeannine Alton, who catalogued Kendrew’s archives for the Bodleian Library, felt that his organizational and managerial skills were fostered by participation in operational research during World War II and led him to “a kind of bureaucratic apotheosis in the sustained effort of accuracy required for the long haul to the final successful three-dimensional picture.” It would take an additional five years for researchers to determine the three-dimensional structure of another protein, lysozyme. Today several thousand protein structures are solved each year.
By the early 1960s, after less than 15 years in research, Kendrew had made up his mind to significantly reduce his research commitments. He said that he was bored with research and thought that “future protein structures were not going to be so interesting.” Kendrew insisted that his departure from academic research was not influenced by the Nobel Prize. Perhaps his responsibilities during World War II, and shortly thereafter his desire to join the scientific civil service, caused him to question the importance of further academic research for himself.
Kendrew continued as a very active fellow of Peterhouse and editor-in-chief of the Journal of Molecular Biology, but took on new administrative duties such as scientific advisor to the Ministry of Defence and chairman of the Defence Scientific Advisory Council. He became a key figure in debates on molecular biology in the United Kingdom and in efforts to establish an international European Laboratory of Molecular Biology. He was appointed director-general of the new European Molecular Biology Laboratory in 1975, a position he held until 1982, when he was appointed president of St. John’s College, Oxford. In 1987 Kendrew retired but continued to travel extensively, attend concerts, and participate in college life at Oxford and Cambridge until his death in August 1997 at the age of 80.
As for Perutz, he continued to carry out research on protein structure for more than six decades, right up to his death in February 2002 at the age of 87. It took him 30 years to determine the three-dimensional structure of hemoglobin, prompting him to say in a self-deprecating manner that he could have done it in half the time if he had been a bit brighter. Today Perutz is an iconic figure, revered by many in and out of science because of his unwavering dedication to hands-on bench research and his grand achievements in research, leadership, writing, and humanitarian causes. Although an unlikely pair, Kendrew and Perutz will be remembered as two of the most gifted, accomplished, and influential pioneers among twentieth-century molecular biologists.
Featured Image Credit: by PublicDomainPictures via Pixabay

June 18, 2020
Growing up in the shadow of Sri Lanka’s civil war
Today’s Sri Lankan young adults grew up during the 26-year civil war fought between the Sri Lankan government and an insurgent group, the Tamil Tigers, from 1983 to 2009. People living in the Sinhala-majority south were far from battlefields in the north and east of the island, but Tamil minorities everywhere lived under ethnic tension and discrimination. From 2007 to 2008 I conducted anthropological research on language and education in Kandy, a large city in the central highlands. When I first met Anitha, a Tamil Hindu, she was a 14-year-old student at a leading multilingual government school who dreamed of becoming a doctor. I was impressed by her poise and confidence. She got high grades in all her subjects and was also well liked and respected by her classmates. Now Anitha is a 27-year-old information technology professional living in Colombo. She spoke to me about how the war impacted her childhood and shaped the adult she has become.
Anitha was quick to tell me that she never experienced any direct violence. The only major incident in Kandy was the 1998 bombing of the Temple of the Tooth, she said, so it was one of the safest cities at the time. But she, her mother, and her younger sister constantly worried about her father, a high-ranking officer in the Sri Lankan army. War coverage on the news would throw them into a panic, and she had recurring nightmares of her father returning home in a coffin. Afraid to be at home alone, they were startled by even the smallest sounds in the night. But she didn’t show her stress to others; she became a “very social and funny person to hide my inner wounds.” It was only once her father had retired from the army that she was able to feel happiness.
Anti-Tamil discrimination was a constant when she was a child. She noted that even though Tamils like herself from the hill-country region were not involved in the war, majority Sinhalas considered all Tamil people to be Tigers or “terrorist blood.” Sinhala teachers at her school discriminated against Tamil students, discouraging them from joining the sports teams and rarely selecting them for coveted prizes in English, drama, and choir. Most of her Sinhala classmates were nice, but there were narrow-minded girls who would whisper about the Tamil students when they walked by.
After finishing grade 13, Anitha made several attempts to get into one of Sri Lanka’s competitive public universities, but narrowly missed the required exam scores. She opted to study information technology at a private university in Colombo and has now been working in the city for a few years. Last year she married a young Tamil Hindu man who also attended a top Kandy school. Though the war is over, she said the discrimination she faced as a child is still present. Her old classmates work in offices now, but their mentality is the same. She noted that anti-Tamil sentiment “will take some generations to fade away.” On the street in Colombo, it is still best “not to show you are Tamil.” Tamil women have stopped wearing bindis on their foreheads, and people are careful about speaking Tamil. She explained that Sinhalas have even told her to stop speaking Tamil because they could not understand what was being said. Friends and coworkers have referenced the army’s defeat of the Tamil Tigers in her presence. Once, when she mistakenly used the Sinhala word for tiger instead of leopard, a Sinhala woman retorted that there are no longer any Tigers in Sri Lanka, as they have been driven out of the motherland.
As a child Anitha was ambitious and optimistic. She would talk to me in Tamil and English about her plan to study medicine in Sri Lanka or abroad before eventually returning to Kandy. When I mentioned her past optimism, she said, without skipping a beat, “Still I am optimistic about my future.” She told me with regret that she had given up on becoming a doctor. I said that I had the same ambition when I was a teenager, and she replied, “Perhaps it is the dream of the age group . . .” She told me that the best chance she and her husband have for a comfortable life is to go abroad, possibly to Canada or Australia. They are not just seeking job opportunities but also respite from discrimination. “If you achieve something in Sri Lanka,” she pointed out, “they won’t say a Sri Lankan did it. They will say a Tamil did it.” When I asked if she could be happy if she stayed in Sri Lanka, she said she had a gut feeling she would be happy there as well. In fact, she still fantasized about returning to Kandy, which she described as an ethnically mixed city where people have strong family values.
Anitha said being a minority in Sri Lanka is a challenge. But when people tell her she can’t complete a task, she simply works harder. She shows them she isn’t weak and believes “nobody can stop me.” Back in 2008, I wondered if her optimism wasn’t a product of her youth. But she clearly has the same positive outlook today. And, like the schoolgirl who used to read Harry Potter books with gusto, she is still trying to figure out what kind of future will be possible for her.
Featured Image Credit: View point of Kandy via Wikimedia Commons

Remembering Anna Arnold Hedgeman
As we reflect on the murder of George Floyd in Minneapolis, and on the continuation of white supremacy’s enactment through police violence, we might also reflect on the region’s histories of integration and segregation, community building and racism, which in the Twin Cities as elsewhere have long gone hand in hand. Take, for example, the case of civil rights activist Anna Arnold Hedgeman, a descendant of the black southern migrants who first claimed Twin Cities citizenship during the Civil War. These pioneers, who called themselves pilgrims, founded Pilgrim Baptist Church in St. Paul and continued their community building by founding a black newspaper, the Western Appeal, securing employment in the fire, police, and postal departments, and forming enough of a presence to draw Frederick Douglass and Booker T. Washington to town for lectures. The black elite settled on “Oatmeal Hill,” a bluff overlooking downtown St. Paul, where they were part of a patchwork of African American communities comprising a rich array of people, including barbers, lawyers, clerks, railway porters, domestic workers, carpenters, tailors, and government workers.
Growing up about twenty miles outside of Minneapolis, in the otherwise almost exclusively white town of Anoka, Anna Arnold was secure in her sense of self and family. When she claimed one of her many “firsts” or “onlys” as the first black graduate of Hamline University in St. Paul, in 1922, Arnold was not, as far as she and her family were concerned, doing anything they hadn’t expected of her. So secure was her sense of belonging, and so subtle were the workings of midwestern racism, that many of the racial injustices she experienced as a child and then as a college student took some time to register. She knew that a neighbor child’s use of a racial slur was wrong, but its blatancy made it seem an aberration. Other slights, such as the ways in which the community celebrated its multiethnic histories and traditions, were more pedestrian in their violence. “We were Norwegian, Swedish, Irish, and German,” Hedgeman noted years later, while Africa “was a jungle to which we sent our missionary money by way of tiny missionary boxes.”
During her four years of college, Anna Arnold resided with family friends in St. Paul, understanding it then as a family choice but later recognizing that she had, most likely, been unwelcome in campus housing. When she was a college senior, Arnold’s assignment to a teacher training position in the university rather than out in the community seemed an apt recognition of her academic excellence. It was some time before she realized that she secured the assignment only because St. Paul would not permit her to teach in its public schools. Similarly, Arnold’s mentor’s encouragement to take a teaching position in the South following graduation resulted not from passion but from resignation, as his efforts on her behalf in Minnesota were not taking root. “I’ve since put together all sorts of things that happened to me during that time,” Anna Arnold Hedgeman would later recall with a mixture of anger and sadness, remembering that the white people she had grown up among, regardless of the love she had felt for and from many of them, ultimately had a limited capacity to honor her life and her blackness.
Anna Arnold took a teaching position at Rust College in Holly Springs, Mississippi, a state that was home to some of the nation’s most egregious manifestations of Jim Crow. Her residence in Holly Springs, marked by a series of indignities and a growing understanding of the institutional nature of racism, also served to postpone her recognition of the realities up North. After two years, Arnold determined to head north again, to enlist people, black and white, to save the South from itself. Her realization of racism’s reach grew, however, when she lived in Minnesota, Ohio, New Jersey, New York, and Washington, D.C., and came to understand the geographic boundlessness of racism. “I did not know yet,” she later recalled, “that the basic difference between the North and the South is the difference between an ax and a stiletto.”
Anna Arnold Hedgeman would go on in her long life to achieve many “firsts” and “onlys,” as New York City’s first consultant on racial issues, the first black American and first woman to serve in a mayoral cabinet in New York City, the first black woman to run for Congress in the Bronx, the only woman to serve on the organizing committee for the 1963 March on Washington, and the keynote speaker at the first joint conference of African and African American women, in Ghana. She devoted her life to promoting justice, which she viewed as a collaborative effort among people of all backgrounds, but she demanded of white people a basic recognition we still have to make: that the eradication of racism has a primary prerequisite, our owning up.
More than half a century ago, in words that resonate as clearly now as they did then, Hedgeman described the chaos that an unrestrained racism had wreaked on the nation, north and south: “There is chaos in the land; utter confusion and fear,” she wrote. “Where people thought there had been order there was no order; only day by day reports of our world falling apart.” No justice, no peace.
“I have decided that I want to live to be 120,” Anna Arnold Hedgeman once declared, “to see whether we can in this country produce some people who have sense enough to know that they have the world in their hands.” Were she alive today, Hedgeman would, in fact, be 120 years old. Let’s honor her in the only way that makes sense: by being those people of sense and sensibility, bringing forth justice and peace.
Featured Image Credit: “March on Washington Aug 28 1963” by Unknown. Public Domain via Wikimedia Commons.
